Apr 17 09:08:35.064210 ip-10-0-128-212 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 09:08:35.064222 ip-10-0-128-212 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 09:08:35.064247 ip-10-0-128-212 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 09:08:35.064461 ip-10-0-128-212 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 09:08:45.193960 ip-10-0-128-212 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 09:08:45.193979 ip-10-0-128-212 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 8179f3c6648849b08d93032efe4916b5 --
Apr 17 09:11:06.569416 ip-10-0-128-212 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 09:11:06.961049 ip-10-0-128-212 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:11:06.961049 ip-10-0-128-212 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 09:11:06.961049 ip-10-0-128-212 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:11:06.961049 ip-10-0-128-212 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 09:11:06.961049 ip-10-0-128-212 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:11:06.961986 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.961901 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 09:11:06.964097 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964081 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:11:06.964097 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964097 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964101 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964104 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964107 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964113 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964117 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964120 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964123 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964126 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964128 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964147 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964151 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964155 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964169 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964173 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964175 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964178 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964181 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964184 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964186 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:11:06.964185 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964189 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964192 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964197 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964200 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964204 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964207 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964210 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964213 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964216 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964218 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964221 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964224 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964226 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964229 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964231 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964234 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964237 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964242 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964245 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:11:06.964894 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964248 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964250 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964253 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964256 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964258 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964261 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964263 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964266 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964268 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964271 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964273 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964276 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964278 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964281 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964283 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964287 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964290 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964293 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964296 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964298 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:11:06.965579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964301 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964303 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964306 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964309 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964312 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964315 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964317 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964322 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964324 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964327 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964330 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964333 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964336 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964338 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964341 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964344 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964347 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964349 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964352 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964354 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:11:06.966302 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964364 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964366 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964369 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964372 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964375 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964377 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964739 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964744 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964747 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964750 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964753 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964756 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964759 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964761 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964764 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964767 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964769 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964772 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964774 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964777 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:11:06.966980 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964780 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964782 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964785 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964788 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964791 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964794 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964796 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964799 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964801 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964804 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964807 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964810 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964815 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964818 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964821 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964824 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964827 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964830 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964833 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:11:06.967687 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964837 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964839 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964842 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964845 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964849 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964851 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964854 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964857 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964859 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964862 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964866 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964869 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964872 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964874 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964876 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964879 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964882 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964885 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964888 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964890 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:11:06.968376 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964893 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964896 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964898 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964901 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964904 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964907 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964909 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964912 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964915 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964917 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964920 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964922 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964925 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964928 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964930 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964933 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964937 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964939 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964941 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964944 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:11:06.969073 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964946 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964949 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964952 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964954 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964957 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964959 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964962 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964964 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964967 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964970 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964973 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964976 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.964979 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966419 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966429 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966436 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966441 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966445 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966449 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966454 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966458 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966461 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 09:11:06.969792 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966465 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966468 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966471 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966475 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966478 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966481 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966484 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966487 2567 flags.go:64] FLAG: --cloud-config=""
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966490 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966493 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966496 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966499 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966502 2567 flags.go:64] FLAG: --config-dir=""
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966505 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966508 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966512 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966516 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966519 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966522 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966525 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966528 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966532 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966535 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966538 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966543 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 09:11:06.970585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966546 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966549 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966552 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966555 2567 flags.go:64] FLAG: --enable-server="true"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966558 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966562 2567 flags.go:64] FLAG: --event-burst="100"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966566 2567 flags.go:64] FLAG: --event-qps="50"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966569 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966572 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966575 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966579 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966582 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966586 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966589 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966592 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966595 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966597 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 09:11:06.971467 ip-10-0-128-212
kubenswrapper[2567]: I0417 09:11:06.966600 2567 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966603 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966606 2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966609 2567 flags.go:64] FLAG: --feature-gates="" Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966613 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966616 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966619 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966622 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966626 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 17 09:11:06.971467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966629 2567 flags.go:64] FLAG: --help="false" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966632 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-128-212.ec2.internal" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966655 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966663 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966666 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 
09:11:06.966670 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966674 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966677 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966680 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966683 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966686 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966689 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966692 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966695 2567 flags.go:64] FLAG: --kube-reserved="" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966698 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966701 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966705 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966708 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966711 2567 flags.go:64] FLAG: --lock-file="" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966713 
2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966716 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966719 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966725 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 09:11:06.972422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966728 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966731 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966734 2567 flags.go:64] FLAG: --logging-format="text" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966736 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966740 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966743 2567 flags.go:64] FLAG: --manifest-url="" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966746 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966750 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966754 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966757 2567 flags.go:64] FLAG: --max-pods="110" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966760 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 09:11:06.973224 
ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966764 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966767 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966770 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966774 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966776 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966779 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966787 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966790 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966793 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966796 2567 flags.go:64] FLAG: --pod-cidr="" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966799 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966805 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966807 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 09:11:06.973224 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966811 2567 flags.go:64] 
FLAG: --pods-per-core="0" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966814 2567 flags.go:64] FLAG: --port="10250" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966817 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966820 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00eba85f519904ff0" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966823 2567 flags.go:64] FLAG: --qos-reserved="" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966827 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966830 2567 flags.go:64] FLAG: --register-node="true" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966833 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966836 2567 flags.go:64] FLAG: --register-with-taints="" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966839 2567 flags.go:64] FLAG: --registry-burst="10" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966842 2567 flags.go:64] FLAG: --registry-qps="5" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966845 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966848 2567 flags.go:64] FLAG: --reserved-memory="" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966854 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966857 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966861 2567 flags.go:64] FLAG: 
--rotate-certificates="false" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966864 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966867 2567 flags.go:64] FLAG: --runonce="false" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966869 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966873 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966876 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966880 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966883 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966886 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966889 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966892 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 09:11:06.974063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966896 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966898 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966901 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966904 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 
09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966907 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966910 2567 flags.go:64] FLAG: --system-cgroups="" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966913 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966919 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966922 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966924 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966928 2567 flags.go:64] FLAG: --tls-min-version="" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966931 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966933 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966936 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966939 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966942 2567 flags.go:64] FLAG: --v="2" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966946 2567 flags.go:64] FLAG: --version="false" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966951 2567 flags.go:64] FLAG: --vmodule="" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966955 2567 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.966959 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967050 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967057 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967061 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967064 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 09:11:06.975032 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967067 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967071 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967074 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967077 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967080 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967083 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967086 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 09:11:06.975997 ip-10-0-128-212 
kubenswrapper[2567]: W0417 09:11:06.967089 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967091 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967094 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967098 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967102 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967105 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967107 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967110 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967113 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967116 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967118 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967121 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967123 2567 feature_gate.go:328] 
unrecognized feature gate: DNSNameResolver Apr 17 09:11:06.975997 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967126 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967129 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967145 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967148 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967151 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967154 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967157 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967159 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967162 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967165 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967168 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967171 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 09:11:06.976780 ip-10-0-128-212 
kubenswrapper[2567]: W0417 09:11:06.967173 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967176 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967179 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967181 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967184 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967187 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967190 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967193 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 09:11:06.976780 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967196 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967199 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967201 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967204 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967206 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 09:11:06.977532 
ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967209 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967213 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967216 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967219 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967222 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967224 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967227 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967230 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967233 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967235 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967238 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967240 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967243 2567 
feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967245 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967248 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 09:11:06.977532 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967250 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967255 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967258 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967260 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967263 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967266 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967268 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967271 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967273 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967276 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 09:11:06.978253 
ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967280 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967284 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967287 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967290 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967293 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967295 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967298 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967301 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967305 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:11:06.978253 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967308 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:11:06.978951 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967311 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:11:06.978951 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.967313 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:11:06.978951 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.968046 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 09:11:06.984608 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.984581 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 09:11:06.984608 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.984607 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984658 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984664 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984669 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984674 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984680 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984683 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984687 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984690 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984693 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984696 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984699 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984702 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984705 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984707 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984710 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984713 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984716 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984718 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984721 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:11:06.984742 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984724 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984726 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984729 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984732 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984734 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984737 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984739 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984743 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984748 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984752 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984757 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984760 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984762 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984765 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984768 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984770 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984773 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984775 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984778 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984781 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:11:06.985251 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984784 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984786 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984788 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984791 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984794 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984797 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984800 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984803 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984807 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984810 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984814 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984819 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984823 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984829 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984834 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984837 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984840 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984844 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984846 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:11:06.985752 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984850 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984853 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984856 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984858 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984861 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984863 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984866 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984869 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984871 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984874 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984876 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984879 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984881 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984884 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984886 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984889 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984891 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984894 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984898 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984902 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:11:06.986232 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984906 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984910 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984913 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984916 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984919 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984922 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984924 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.984927 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.984933 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985035 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985041 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985044 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985047 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985050 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985054 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985059 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:11:06.986744 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985064 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985068 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985071 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985075 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985077 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985080 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985083 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985086 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985088 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985091 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985094 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985096 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985099 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985101 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985104 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985107 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985110 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985113 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985115 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985118 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:11:06.987150 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985121 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985123 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985126 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985129 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985147 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985151 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985154 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985157 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985159 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985162 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985165 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985168 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985170 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985173 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985177 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985180 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985183 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985186 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985188 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:11:06.987640 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985191 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985193 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985196 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985199 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985201 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985204 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985206 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985209 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985211 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985216 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985220 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985224 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985228 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985231 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985234 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985236 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985239 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985242 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985244 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985247 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:11:06.988099 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985249 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985252 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985254 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985259 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985262 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985265 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985269 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985271 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985274 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985277 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985279 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985282 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985284 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985287 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985290 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985293 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985298 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985302 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985306 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:11:06.989129 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:06.985309 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:11:06.989622 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.985314 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 09:11:06.989622 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.985434 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 09:11:06.989936 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.989922 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 09:11:06.990795 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.990784 2567 server.go:1019] "Starting client certificate rotation"
Apr 17 09:11:06.990893 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.990877 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 09:11:06.990926 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:06.990917 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 09:11:07.011174 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.011116 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 09:11:07.013117 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.013098 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 09:11:07.025167 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.025145 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 17 09:11:07.036403 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.036381 2567 log.go:25] "Validated CRI v1 image API"
Apr 17 09:11:07.038304 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.038284 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 09:11:07.039104 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.039089 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 09:11:07.041047 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.041025 2567 fs.go:135] Filesystem UUIDs: map[5bdb2a8b-47ee-4c87-ae1d-e0f6ae5a47cf:/dev/nvme0n1p3 70c5e648-d8a4-40b5-899b-c713e3073e27:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 17 09:11:07.041109 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.041043 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 09:11:07.046564 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.046464 2567 manager.go:217] Machine: {Timestamp:2026-04-17 09:11:07.044580222 +0000 UTC m=+0.369398210 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099793 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec251d56abfd873cfeb82cc025ecc166 SystemUUID:ec251d56-abfd-873c-feb8-2cc025ecc166 BootID:8179f3c6-6488-49b0-8d93-032efe4916b5 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:15:f4:90:eb:c1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:15:f4:90:eb:c1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:42:73:4e:a6:7f:b2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 09:11:07.046564 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.046559 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 09:11:07.046672 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.046636 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 09:11:07.048145 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.048117 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 09:11:07.048276 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.048149 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-212.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 09:11:07.048321 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.048286 2567 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 09:11:07.048321 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.048294 2567 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 09:11:07.048321 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.048307 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 09:11:07.048952 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.048943 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 09:11:07.050344 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.050333 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 17 09:11:07.050600 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.050590 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 09:11:07.052839 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.052828 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 17 09:11:07.052871 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.052850 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 09:11:07.052871 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.052862 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 09:11:07.052871 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.052871 2567 kubelet.go:397] "Adding apiserver pod source" Apr 17 09:11:07.052943 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.052879 2567 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 09:11:07.053812 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.053799 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 09:11:07.053854 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.053819 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 09:11:07.056232 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.056214 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 09:11:07.057687 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.057670 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 09:11:07.058387 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.058372 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sk7nn" Apr 17 09:11:07.059367 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.059356 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 09:11:07.059409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.059373 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 09:11:07.059409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.059380 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 09:11:07.059409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.059386 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 09:11:07.059409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.059391 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 09:11:07.059409 ip-10-0-128-212 kubenswrapper[2567]: I0417 
09:11:07.059397 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 09:11:07.059409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.059404 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 09:11:07.059409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.059409 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 09:11:07.059580 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.059416 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 09:11:07.059580 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.059423 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 09:11:07.059580 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.059431 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 09:11:07.059580 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.059440 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 09:11:07.060176 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.060166 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 09:11:07.060176 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.060176 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 09:11:07.063676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.063663 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 09:11:07.063813 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.063800 2567 server.go:1295] "Started kubelet" Apr 17 09:11:07.063930 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.063897 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 09:11:07.064103 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.063957 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 
09:11:07.064189 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.064175 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 09:11:07.064697 ip-10-0-128-212 systemd[1]: Started Kubernetes Kubelet. Apr 17 09:11:07.065805 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.065295 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 09:11:07.065805 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.065446 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-212.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 09:11:07.065984 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.065869 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-212.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 09:11:07.065984 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.065869 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 09:11:07.065984 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.065921 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 17 09:11:07.067382 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.067357 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sk7nn" Apr 17 09:11:07.070216 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.067887 2567 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-212.ec2.internal.18a719e91f4587b5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-212.ec2.internal,UID:ip-10-0-128-212.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-212.ec2.internal,},FirstTimestamp:2026-04-17 09:11:07.063674805 +0000 UTC m=+0.388492775,LastTimestamp:2026-04-17 09:11:07.063674805 +0000 UTC m=+0.388492775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-212.ec2.internal,}" Apr 17 09:11:07.073686 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.073667 2567 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 09:11:07.074330 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.074314 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 09:11:07.074807 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.074790 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 09:11:07.075707 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.075690 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found" Apr 17 09:11:07.076086 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.076069 2567 factory.go:55] Registering systemd factory Apr 17 09:11:07.076189 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.076089 2567 factory.go:223] Registration of the systemd container factory successfully Apr 17 09:11:07.076988 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.076966 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 09:11:07.077117 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.077098 2567 factory.go:153] Registering CRI-O factory Apr 17 09:11:07.077238 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.077120 2567 factory.go:223] Registration of the crio container factory successfully Apr 17 09:11:07.077238 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.077118 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 09:11:07.077238 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.077173 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 
09:11:07.077238 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.077184 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 09:11:07.077238 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.077199 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 09:11:07.077238 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.077229 2567 factory.go:103] Registering Raw factory Apr 17 09:11:07.077531 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.077248 2567 manager.go:1196] Started watching for new ooms in manager Apr 17 09:11:07.077531 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.077316 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 17 09:11:07.077531 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.077325 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 17 09:11:07.077710 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.077693 2567 manager.go:319] Starting recovery of all containers Apr 17 09:11:07.087023 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.086919 2567 manager.go:324] Recovery completed Apr 17 09:11:07.090952 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.090940 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:11:07.091969 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.091951 2567 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-212.ec2.internal\" not found" node="ip-10-0-128-212.ec2.internal" Apr 17 09:11:07.093178 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.093165 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:11:07.093255 ip-10-0-128-212 kubenswrapper[2567]: 
I0417 09:11:07.093193 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:11:07.093255 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.093208 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:11:07.093690 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.093673 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 09:11:07.093690 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.093689 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 09:11:07.093820 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.093705 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 17 09:11:07.096852 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.096841 2567 policy_none.go:49] "None policy: Start" Apr 17 09:11:07.096900 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.096856 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 09:11:07.096900 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.096866 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 17 09:11:07.142801 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.142789 2567 manager.go:341] "Starting Device Plugin manager" Apr 17 09:11:07.170930 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.142827 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 09:11:07.170930 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.142837 2567 server.go:85] "Starting device plugin registration server" Apr 17 09:11:07.170930 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.143068 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 09:11:07.170930 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.143079 2567 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 09:11:07.170930 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.143212 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 09:11:07.170930 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.143275 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 09:11:07.170930 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.143284 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 09:11:07.170930 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.143787 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 09:11:07.170930 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.143822 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-212.ec2.internal\" not found" Apr 17 09:11:07.243363 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.243337 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:11:07.244354 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.244338 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:11:07.244442 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.244369 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:11:07.244442 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.244384 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:11:07.244520 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.244450 2567 kubelet_node_status.go:78] 
"Attempting to register node" node="ip-10-0-128-212.ec2.internal" Apr 17 09:11:07.244950 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.244932 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 09:11:07.247379 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.247356 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 09:11:07.247487 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.247391 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 09:11:07.247487 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.247415 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 09:11:07.247487 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.247425 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 09:11:07.247638 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.247496 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 09:11:07.252217 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.252201 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 09:11:07.252493 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.252478 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-212.ec2.internal" Apr 17 09:11:07.252536 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.252500 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-212.ec2.internal\": node \"ip-10-0-128-212.ec2.internal\" not found" Apr 17 09:11:07.301321 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.301269 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found" Apr 17 09:11:07.348366 ip-10-0-128-212 
kubenswrapper[2567]: I0417 09:11:07.348345 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-212.ec2.internal"] Apr 17 09:11:07.348424 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.348407 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:11:07.349171 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.349157 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:11:07.349240 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.349179 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:11:07.349240 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.349189 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:11:07.351584 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.351572 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:11:07.351710 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.351697 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal" Apr 17 09:11:07.351749 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.351728 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:11:07.352227 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.352208 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:11:07.352298 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.352241 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:11:07.352298 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.352252 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:11:07.352298 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.352212 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:11:07.352411 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.352314 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:11:07.352411 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.352329 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:11:07.354480 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.354466 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-212.ec2.internal" Apr 17 09:11:07.354554 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.354489 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:11:07.355110 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.355095 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:11:07.355190 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.355121 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:11:07.355190 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.355131 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:11:07.378489 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.378473 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/649f656ea529e6ecb6cb9249a18750c7-config\") pod \"kube-apiserver-proxy-ip-10-0-128-212.ec2.internal\" (UID: \"649f656ea529e6ecb6cb9249a18750c7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-212.ec2.internal" Apr 17 09:11:07.378575 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.378496 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/303a407608e25060cece5ea7945f6cb8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal\" (UID: \"303a407608e25060cece5ea7945f6cb8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal" Apr 17 09:11:07.378575 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.378514 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/303a407608e25060cece5ea7945f6cb8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal\" (UID: \"303a407608e25060cece5ea7945f6cb8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal" Apr 17 09:11:07.379786 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.379772 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-212.ec2.internal\" not found" node="ip-10-0-128-212.ec2.internal" Apr 17 09:11:07.384086 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.384070 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-212.ec2.internal\" not found" node="ip-10-0-128-212.ec2.internal" Apr 17 09:11:07.401562 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.401545 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found" Apr 17 09:11:07.478744 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.478717 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/303a407608e25060cece5ea7945f6cb8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal\" (UID: \"303a407608e25060cece5ea7945f6cb8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal" Apr 17 09:11:07.478826 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.478747 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/303a407608e25060cece5ea7945f6cb8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal\" (UID: \"303a407608e25060cece5ea7945f6cb8\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal"
Apr 17 09:11:07.478826 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.478772 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/649f656ea529e6ecb6cb9249a18750c7-config\") pod \"kube-apiserver-proxy-ip-10-0-128-212.ec2.internal\" (UID: \"649f656ea529e6ecb6cb9249a18750c7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-212.ec2.internal"
Apr 17 09:11:07.478826 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.478804 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/303a407608e25060cece5ea7945f6cb8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal\" (UID: \"303a407608e25060cece5ea7945f6cb8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal"
Apr 17 09:11:07.478826 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.478816 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/303a407608e25060cece5ea7945f6cb8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal\" (UID: \"303a407608e25060cece5ea7945f6cb8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal"
Apr 17 09:11:07.478976 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.478804 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/649f656ea529e6ecb6cb9249a18750c7-config\") pod \"kube-apiserver-proxy-ip-10-0-128-212.ec2.internal\" (UID: \"649f656ea529e6ecb6cb9249a18750c7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-212.ec2.internal"
Apr 17 09:11:07.501809 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.501785 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found"
Apr 17 09:11:07.602700 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.602641 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found"
Apr 17 09:11:07.682277 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.682257 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal"
Apr 17 09:11:07.686841 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.686823 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-212.ec2.internal"
Apr 17 09:11:07.703469 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.703441 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found"
Apr 17 09:11:07.804059 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.804038 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found"
Apr 17 09:11:07.904695 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:07.904630 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found"
Apr 17 09:11:07.926572 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.926550 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:07.991207 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.991183 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 09:11:07.991761 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.991298 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 09:11:07.991761 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:07.991341 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 09:11:08.005328 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:08.005300 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found"
Apr 17 09:11:08.070523 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.070496 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 09:06:07 +0000 UTC" deadline="2027-10-01 11:19:08.674084613 +0000 UTC"
Apr 17 09:11:08.070523 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.070519 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12770h8m0.603568812s"
Apr 17 09:11:08.074962 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.074941 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 09:11:08.088560 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.088531 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 09:11:08.105741 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:08.105720 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found"
Apr 17 09:11:08.105823 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.105745 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6b42x"
Apr 17 09:11:08.113386 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.113370 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6b42x"
Apr 17 09:11:08.182183 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.182085 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:08.206787 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:08.206766 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found"
Apr 17 09:11:08.231001 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:08.230973 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod649f656ea529e6ecb6cb9249a18750c7.slice/crio-8955278c552a08ca4bfc808bf6344707b97742cfc5c8806ede9b7c9b62b24fc8 WatchSource:0}: Error finding container 8955278c552a08ca4bfc808bf6344707b97742cfc5c8806ede9b7c9b62b24fc8: Status 404 returned error can't find the container with id 8955278c552a08ca4bfc808bf6344707b97742cfc5c8806ede9b7c9b62b24fc8
Apr 17 09:11:08.235735 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.235723 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 09:11:08.252174 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.252116 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-212.ec2.internal" event={"ID":"649f656ea529e6ecb6cb9249a18750c7","Type":"ContainerStarted","Data":"8955278c552a08ca4bfc808bf6344707b97742cfc5c8806ede9b7c9b62b24fc8"}
Apr 17 09:11:08.307747 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:08.307722 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-212.ec2.internal\" not found"
Apr 17 09:11:08.335976 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:08.335946 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod303a407608e25060cece5ea7945f6cb8.slice/crio-55c279bf0a05fa6be7d4819f298dbc1dbbff3d8b788eb7b474f66f230a1094ce WatchSource:0}: Error finding container 55c279bf0a05fa6be7d4819f298dbc1dbbff3d8b788eb7b474f66f230a1094ce: Status 404 returned error can't find the container with id 55c279bf0a05fa6be7d4819f298dbc1dbbff3d8b788eb7b474f66f230a1094ce
Apr 17 09:11:08.360800 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.360780 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:08.375974 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.375957 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal"
Apr 17 09:11:08.387686 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.387666 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 09:11:08.388394 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.388382 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-212.ec2.internal"
Apr 17 09:11:08.396785 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.396771 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 09:11:08.963335 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:08.963257 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:09.054216 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.054188 2567 apiserver.go:52] "Watching apiserver"
Apr 17 09:11:09.061487 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.061462 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 09:11:09.061903 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.061876 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal","openshift-multus/multus-6nv4m","openshift-multus/network-metrics-daemon-qpnfd","openshift-dns/node-resolver-8xtb4","openshift-multus/multus-additional-cni-plugins-xswsv","openshift-network-diagnostics/network-check-target-qqb8r","openshift-network-operator/iptables-alerter-2jtdp","openshift-ovn-kubernetes/ovnkube-node-bqmpm","kube-system/konnectivity-agent-gcm2d","kube-system/kube-apiserver-proxy-ip-10-0-128-212.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg","openshift-cluster-node-tuning-operator/tuned-x8vtz","openshift-image-registry/node-ca-xx56q"]
Apr 17 09:11:09.064637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.064615 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2jtdp"
Apr 17 09:11:09.067290 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.067269 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nkzcx\""
Apr 17 09:11:09.068334 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.067971 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 09:11:09.068426 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.068339 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 09:11:09.068426 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.068354 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 09:11:09.069909 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.069292 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.072128 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.072108 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 09:11:09.072304 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.072285 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8xtb4"
Apr 17 09:11:09.072415 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.072402 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:09.072513 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.072489 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:09.073070 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.073043 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jh7bg\""
Apr 17 09:11:09.073181 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.073090 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 09:11:09.073851 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.073702 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 09:11:09.073851 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.073769 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 09:11:09.075483 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.075466 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 09:11:09.075720 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.075700 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wftd8\""
Apr 17 09:11:09.075895 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.075878 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 09:11:09.077616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.077223 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xswsv"
Apr 17 09:11:09.077616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.077228 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:09.077616 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.077541 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206"
Apr 17 09:11:09.079639 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.079619 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-lhfqr\""
Apr 17 09:11:09.079737 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.079654 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 09:11:09.079737 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.079655 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 09:11:09.081261 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.081241 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm"
Apr 17 09:11:09.084343 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.084320 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 09:11:09.084808 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.084599 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 09:11:09.084808 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.084617 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 09:11:09.084808 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.084678 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 09:11:09.084808 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.084684 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 09:11:09.084808 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.084696 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 09:11:09.085090 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.084861 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vr8x8\""
Apr 17 09:11:09.086154 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.085985 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gcm2d"
Apr 17 09:11:09.086154 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.086076 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg"
Apr 17 09:11:09.088756 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088518 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x8vtz"
Apr 17 09:11:09.088756 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088517 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-run-k8s-cni-cncf-io\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.088897 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088771 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-var-lib-cni-multus\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.088897 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088799 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a-host-slash\") pod \"iptables-alerter-2jtdp\" (UID: \"39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a\") " pod="openshift-network-operator/iptables-alerter-2jtdp"
Apr 17 09:11:09.088897 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088823 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 09:11:09.088897 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088826 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xftj5\" (UniqueName: \"kubernetes.io/projected/39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a-kube-api-access-xftj5\") pod \"iptables-alerter-2jtdp\" (UID: \"39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a\") " pod="openshift-network-operator/iptables-alerter-2jtdp"
Apr 17 09:11:09.088897 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088842 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 09:11:09.088897 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088852 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-cnibin\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.088897 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088885 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj7bs\" (UniqueName: \"kubernetes.io/projected/4f93c6af-eabd-453e-9966-3199a8d4a534-kube-api-access-mj7bs\") pod \"node-resolver-8xtb4\" (UID: \"4f93c6af-eabd-453e-9966-3199a8d4a534\") " pod="openshift-dns/node-resolver-8xtb4"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088916 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-cnibin\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088941 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-var-lib-cni-bin\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088964 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-var-lib-kubelet\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088969 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.088987 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-conf-dir\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089013 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-hostroot\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089026 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089035 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-daemon-config\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089060 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a-iptables-alerter-script\") pod \"iptables-alerter-2jtdp\" (UID: \"39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a\") " pod="openshift-network-operator/iptables-alerter-2jtdp"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089102 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-system-cni-dir\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089178 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-etc-kubernetes\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089210 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4lsh\" (UniqueName: \"kubernetes.io/projected/5029b845-d556-4306-b1bb-4c6373b7e4be-kube-api-access-x4lsh\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089239 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-socket-dir-parent\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089247 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089246 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4298b\""
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089262 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-system-cni-dir\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089287 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-os-release\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089309 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a0d17817-9cb4-4adc-9cb1-ace0055c7639-cni-binary-copy\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089331 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtzqk\" (UniqueName: \"kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk\") pod \"network-check-target-qqb8r\" (UID: \"0b55fe37-b595-4cdf-a226-39f50d91d206\") " pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089360 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-run-multus-certs\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089384 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv2tv\" (UniqueName: \"kubernetes.io/projected/528b8329-d2cd-4d99-8a3e-62f7af17b361-kube-api-access-zv2tv\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089415 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089447 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a0d17817-9cb4-4adc-9cb1-ace0055c7639-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089482 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-run-netns\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089513 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a0d17817-9cb4-4adc-9cb1-ace0055c7639-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089539 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z6sn\" (UniqueName: \"kubernetes.io/projected/a0d17817-9cb4-4adc-9cb1-ace0055c7639-kube-api-access-6z6sn\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089586 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f93c6af-eabd-453e-9966-3199a8d4a534-hosts-file\") pod \"node-resolver-8xtb4\" (UID: \"4f93c6af-eabd-453e-9966-3199a8d4a534\") " pod="openshift-dns/node-resolver-8xtb4"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089587 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089630 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4f93c6af-eabd-453e-9966-3199a8d4a534-tmp-dir\") pod \"node-resolver-8xtb4\" (UID: \"4f93c6af-eabd-453e-9966-3199a8d4a534\") " pod="openshift-dns/node-resolver-8xtb4"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089658 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-cni-dir\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089666 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jksqb\""
Apr 17 09:11:09.089890 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089691 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-os-release\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.090775 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089720 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/528b8329-d2cd-4d99-8a3e-62f7af17b361-cni-binary-copy\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.090775 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.089758 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:09.090949 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.090934 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xx56q"
Apr 17 09:11:09.091568 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.091495 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 09:11:09.091666 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.091594 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 09:11:09.091818 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.091802 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-p9r2n\""
Apr 17 09:11:09.093381 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.093327 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 09:11:09.093481 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.093434 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 09:11:09.093630 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.093569 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 09:11:09.093630 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.093580 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wnmh2\""
Apr 17 09:11:09.114617 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.114580 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 09:06:08 +0000 UTC" deadline="2027-09-23 06:33:35.804351159 +0000 UTC"
Apr 17 09:11:09.114742 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.114715 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12573h22m26.689643324s"
Apr 17 09:11:09.178545 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.178518 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 09:11:09.190754 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.190728 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg92d\" (UniqueName: \"kubernetes.io/projected/5f467c32-67b8-4e0a-b835-8b0933d2cc02-kube-api-access-dg92d\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm"
Apr 17 09:11:09.190881 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.190766 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ff136071-cfdc-4409-a0d9-ed959a609894-etc-tuned\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz"
Apr 17 09:11:09.190881 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.190798 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f93c6af-eabd-453e-9966-3199a8d4a534-hosts-file\") pod \"node-resolver-8xtb4\" (UID: \"4f93c6af-eabd-453e-9966-3199a8d4a534\") " pod="openshift-dns/node-resolver-8xtb4"
Apr 17 09:11:09.190881 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.190822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4f93c6af-eabd-453e-9966-3199a8d4a534-tmp-dir\") pod \"node-resolver-8xtb4\" (UID: \"4f93c6af-eabd-453e-9966-3199a8d4a534\") " pod="openshift-dns/node-resolver-8xtb4"
Apr 17 09:11:09.190881 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.190845 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-cni-dir\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.190881 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.190867 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/528b8329-d2cd-4d99-8a3e-62f7af17b361-cni-binary-copy\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m"
Apr 17 09:11:09.191089 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.190889 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:09.191089 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.190912 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-run-k8s-cni-cncf-io\") pod \"multus-6nv4m\" (UID:
\"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.191089 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.190938 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4f93c6af-eabd-453e-9966-3199a8d4a534-hosts-file\") pod \"node-resolver-8xtb4\" (UID: \"4f93c6af-eabd-453e-9966-3199a8d4a534\") " pod="openshift-dns/node-resolver-8xtb4" Apr 17 09:11:09.191089 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.190956 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-cni-dir\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.191089 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.190964 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-run-k8s-cni-cncf-io\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.191089 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.191043 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:09.191089 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191050 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-var-lib-cni-multus\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191097 2567 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-xftj5\" (UniqueName: \"kubernetes.io/projected/39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a-kube-api-access-xftj5\") pod \"iptables-alerter-2jtdp\" (UID: \"39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a\") " pod="openshift-network-operator/iptables-alerter-2jtdp" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191150 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-var-lib-cni-multus\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.191163 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs podName:5029b845-d556-4306-b1bb-4c6373b7e4be nodeName:}" failed. No retries permitted until 2026-04-17 09:11:09.69110728 +0000 UTC m=+3.015925238 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs") pod "network-metrics-daemon-qpnfd" (UID: "5029b845-d556-4306-b1bb-4c6373b7e4be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191187 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-cnibin\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191195 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4f93c6af-eabd-453e-9966-3199a8d4a534-tmp-dir\") pod \"node-resolver-8xtb4\" (UID: \"4f93c6af-eabd-453e-9966-3199a8d4a534\") " pod="openshift-dns/node-resolver-8xtb4" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191220 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-kubelet\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191247 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-run-ovn\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191265 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-cnibin\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191282 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-var-lib-kubelet\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191309 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtsd2\" (UniqueName: \"kubernetes.io/projected/ff136071-cfdc-4409-a0d9-ed959a609894-kube-api-access-dtsd2\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-var-lib-cni-bin\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191380 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-var-lib-kubelet\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.191422 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191422 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-var-lib-kubelet\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191459 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-conf-dir\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191470 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-var-lib-cni-bin\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191499 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-systemd-units\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191516 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/528b8329-d2cd-4d99-8a3e-62f7af17b361-cni-binary-copy\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191537 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-conf-dir\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191525 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-run-systemd\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191573 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-hostroot\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191597 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-sysconfig\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191600 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-hostroot\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191635 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-sysctl-conf\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191660 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-host\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191687 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a-iptables-alerter-script\") pod \"iptables-alerter-2jtdp\" (UID: \"39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a\") " pod="openshift-network-operator/iptables-alerter-2jtdp" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191711 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-system-cni-dir\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191729 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-etc-openvswitch\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191749 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5f467c32-67b8-4e0a-b835-8b0933d2cc02-ovnkube-config\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191799 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ff0b766-7823-4f42-b84c-f3c1ac91941c-agent-certs\") pod \"konnectivity-agent-gcm2d\" (UID: \"3ff0b766-7823-4f42-b84c-f3c1ac91941c\") " pod="kube-system/konnectivity-agent-gcm2d" Apr 17 09:11:09.192022 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191808 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-system-cni-dir\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191827 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-etc-selinux\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191844 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-run\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: 
I0417 09:11:09.191871 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-os-release\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191894 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zv2tv\" (UniqueName: \"kubernetes.io/projected/528b8329-d2cd-4d99-8a3e-62f7af17b361-kube-api-access-zv2tv\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191920 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-registration-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191946 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-sys\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191967 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a0d17817-9cb4-4adc-9cb1-ace0055c7639-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " 
pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191967 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-os-release\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.191994 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-device-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192019 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192044 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192070 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-lib-modules\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192109 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-os-release\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192156 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a-iptables-alerter-script\") pod \"iptables-alerter-2jtdp\" (UID: \"39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a\") " pod="openshift-network-operator/iptables-alerter-2jtdp" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192168 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.192817 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192197 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-cni-bin\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192203 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-os-release\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192224 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a-host-slash\") pod \"iptables-alerter-2jtdp\" (UID: \"39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a\") " pod="openshift-network-operator/iptables-alerter-2jtdp" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192253 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-log-socket\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192277 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5f467c32-67b8-4e0a-b835-8b0933d2cc02-ovn-node-metrics-cert\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192274 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a-host-slash\") pod \"iptables-alerter-2jtdp\" (UID: \"39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a\") " pod="openshift-network-operator/iptables-alerter-2jtdp" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 
09:11:09.192305 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj7bs\" (UniqueName: \"kubernetes.io/projected/4f93c6af-eabd-453e-9966-3199a8d4a534-kube-api-access-mj7bs\") pod \"node-resolver-8xtb4\" (UID: \"4f93c6af-eabd-453e-9966-3199a8d4a534\") " pod="openshift-dns/node-resolver-8xtb4" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-cnibin\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192388 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2qg5\" (UniqueName: \"kubernetes.io/projected/8fda692b-3c65-4165-b88c-ab992a58a369-kube-api-access-p2qg5\") pod \"node-ca-xx56q\" (UID: \"8fda692b-3c65-4165-b88c-ab992a58a369\") " pod="openshift-image-registry/node-ca-xx56q" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192404 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-run-netns\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192419 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-socket-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192428 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-cnibin\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192437 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj8nb\" (UniqueName: \"kubernetes.io/projected/52e471a9-821a-4d18-b636-4a8e2b41a8bc-kube-api-access-bj8nb\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192452 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-modprobe-d\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192470 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a0d17817-9cb4-4adc-9cb1-ace0055c7639-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192470 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-daemon-config\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192519 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-node-log\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.193625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192590 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5f467c32-67b8-4e0a-b835-8b0933d2cc02-env-overrides\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192623 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-sysctl-d\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-etc-kubernetes\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192711 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x4lsh\" (UniqueName: \"kubernetes.io/projected/5029b845-d556-4306-b1bb-4c6373b7e4be-kube-api-access-x4lsh\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192714 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-etc-kubernetes\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192737 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-var-lib-openvswitch\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192783 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-sys-fs\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192811 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-kubernetes\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 
09:11:09.192836 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-socket-dir-parent\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-system-cni-dir\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192906 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a0d17817-9cb4-4adc-9cb1-ace0055c7639-cni-binary-copy\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192925 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzqk\" (UniqueName: \"kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk\") pod \"network-check-target-qqb8r\" (UID: \"0b55fe37-b595-4cdf-a226-39f50d91d206\") " pod="openshift-network-diagnostics/network-check-target-qqb8r" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192925 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-socket-dir-parent\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " 
pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192925 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/528b8329-d2cd-4d99-8a3e-62f7af17b361-multus-daemon-config\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192964 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-system-cni-dir\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.192995 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-run-multus-certs\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193023 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8fda692b-3c65-4165-b88c-ab992a58a369-serviceca\") pod \"node-ca-xx56q\" (UID: \"8fda692b-3c65-4165-b88c-ab992a58a369\") " pod="openshift-image-registry/node-ca-xx56q" Apr 17 09:11:09.194409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193045 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-run-multus-certs\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " 
pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193063 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-slash\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193099 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-cni-netd\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193150 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193198 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-run-netns\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193239 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-run-openvswitch\") pod 
\"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193264 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ff0b766-7823-4f42-b84c-f3c1ac91941c-konnectivity-ca\") pod \"konnectivity-agent-gcm2d\" (UID: \"3ff0b766-7823-4f42-b84c-f3c1ac91941c\") " pod="kube-system/konnectivity-agent-gcm2d" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193274 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0d17817-9cb4-4adc-9cb1-ace0055c7639-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193265 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/528b8329-d2cd-4d99-8a3e-62f7af17b361-host-run-netns\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193290 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-systemd\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193337 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/ff136071-cfdc-4409-a0d9-ed959a609894-tmp\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193378 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a0d17817-9cb4-4adc-9cb1-ace0055c7639-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193391 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a0d17817-9cb4-4adc-9cb1-ace0055c7639-cni-binary-copy\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193404 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z6sn\" (UniqueName: \"kubernetes.io/projected/a0d17817-9cb4-4adc-9cb1-ace0055c7639-kube-api-access-6z6sn\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193430 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8fda692b-3c65-4165-b88c-ab992a58a369-host\") pod \"node-ca-xx56q\" (UID: \"8fda692b-3c65-4165-b88c-ab992a58a369\") " pod="openshift-image-registry/node-ca-xx56q" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193447 
2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5f467c32-67b8-4e0a-b835-8b0933d2cc02-ovnkube-script-lib\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.195169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.193801 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a0d17817-9cb4-4adc-9cb1-ace0055c7639-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.198670 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.198507 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:09.198670 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.198532 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:09.198670 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.198546 2567 projected.go:194] Error preparing data for projected volume kube-api-access-qtzqk for pod openshift-network-diagnostics/network-check-target-qqb8r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:09.198670 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.198574 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 09:11:09.198859 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.198706 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk podName:0b55fe37-b595-4cdf-a226-39f50d91d206 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:09.69869031 +0000 UTC m=+3.023508284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qtzqk" (UniqueName: "kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk") pod "network-check-target-qqb8r" (UID: "0b55fe37-b595-4cdf-a226-39f50d91d206") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:09.202228 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.202208 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z6sn\" (UniqueName: \"kubernetes.io/projected/a0d17817-9cb4-4adc-9cb1-ace0055c7639-kube-api-access-6z6sn\") pod \"multus-additional-cni-plugins-xswsv\" (UID: \"a0d17817-9cb4-4adc-9cb1-ace0055c7639\") " pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.202335 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.202222 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv2tv\" (UniqueName: \"kubernetes.io/projected/528b8329-d2cd-4d99-8a3e-62f7af17b361-kube-api-access-zv2tv\") pod \"multus-6nv4m\" (UID: \"528b8329-d2cd-4d99-8a3e-62f7af17b361\") " pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.202335 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.202248 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xftj5\" (UniqueName: 
\"kubernetes.io/projected/39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a-kube-api-access-xftj5\") pod \"iptables-alerter-2jtdp\" (UID: \"39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a\") " pod="openshift-network-operator/iptables-alerter-2jtdp" Apr 17 09:11:09.202335 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.202279 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4lsh\" (UniqueName: \"kubernetes.io/projected/5029b845-d556-4306-b1bb-4c6373b7e4be-kube-api-access-x4lsh\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd" Apr 17 09:11:09.202625 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.202609 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj7bs\" (UniqueName: \"kubernetes.io/projected/4f93c6af-eabd-453e-9966-3199a8d4a534-kube-api-access-mj7bs\") pod \"node-resolver-8xtb4\" (UID: \"4f93c6af-eabd-453e-9966-3199a8d4a534\") " pod="openshift-dns/node-resolver-8xtb4" Apr 17 09:11:09.254773 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.254698 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal" event={"ID":"303a407608e25060cece5ea7945f6cb8","Type":"ContainerStarted","Data":"55c279bf0a05fa6be7d4819f298dbc1dbbff3d8b788eb7b474f66f230a1094ce"} Apr 17 09:11:09.294254 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294216 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-var-lib-openvswitch\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.294254 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294255 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"sys-fs\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-sys-fs\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.294410 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294316 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-sys-fs\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.294410 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294319 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-var-lib-openvswitch\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.294410 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-kubernetes\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.294410 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294386 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8fda692b-3c65-4165-b88c-ab992a58a369-serviceca\") pod \"node-ca-xx56q\" (UID: \"8fda692b-3c65-4165-b88c-ab992a58a369\") " pod="openshift-image-registry/node-ca-xx56q" Apr 17 09:11:09.294410 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294402 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-kubernetes\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.294637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294409 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-slash\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.294637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294460 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-cni-netd\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.294637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294482 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-run-openvswitch\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.294637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294516 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-run-openvswitch\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.294637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294515 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ff0b766-7823-4f42-b84c-f3c1ac91941c-konnectivity-ca\") pod \"konnectivity-agent-gcm2d\" (UID: \"3ff0b766-7823-4f42-b84c-f3c1ac91941c\") " pod="kube-system/konnectivity-agent-gcm2d" Apr 17 09:11:09.294637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294542 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-cni-netd\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.294637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294526 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-slash\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.294637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294560 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-systemd\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.294637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294589 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff136071-cfdc-4409-a0d9-ed959a609894-tmp\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.294637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294615 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8fda692b-3c65-4165-b88c-ab992a58a369-host\") pod \"node-ca-xx56q\" (UID: \"8fda692b-3c65-4165-b88c-ab992a58a369\") " pod="openshift-image-registry/node-ca-xx56q" Apr 17 09:11:09.294637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294639 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5f467c32-67b8-4e0a-b835-8b0933d2cc02-ovnkube-script-lib\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294664 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg92d\" (UniqueName: \"kubernetes.io/projected/5f467c32-67b8-4e0a-b835-8b0933d2cc02-kube-api-access-dg92d\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294683 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-systemd\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294698 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ff136071-cfdc-4409-a0d9-ed959a609894-etc-tuned\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294734 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8fda692b-3c65-4165-b88c-ab992a58a369-host\") pod \"node-ca-xx56q\" (UID: \"8fda692b-3c65-4165-b88c-ab992a58a369\") " pod="openshift-image-registry/node-ca-xx56q" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294757 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-kubelet\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294779 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-run-ovn\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294803 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-var-lib-kubelet\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294827 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtsd2\" (UniqueName: \"kubernetes.io/projected/ff136071-cfdc-4409-a0d9-ed959a609894-kube-api-access-dtsd2\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294833 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/8fda692b-3c65-4165-b88c-ab992a58a369-serviceca\") pod \"node-ca-xx56q\" (UID: \"8fda692b-3c65-4165-b88c-ab992a58a369\") " pod="openshift-image-registry/node-ca-xx56q" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294853 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-systemd-units\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294877 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-run-systemd\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-var-lib-kubelet\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294976 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-run-systemd\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.294992 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-run-ovn\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295003 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-sysconfig\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295044 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-sysctl-conf\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295048 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ff0b766-7823-4f42-b84c-f3c1ac91941c-konnectivity-ca\") pod \"konnectivity-agent-gcm2d\" (UID: \"3ff0b766-7823-4f42-b84c-f3c1ac91941c\") " pod="kube-system/konnectivity-agent-gcm2d" Apr 17 09:11:09.295163 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295070 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-host\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295075 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-kubelet\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295045 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-sysconfig\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295097 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-etc-openvswitch\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295121 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-systemd-units\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295127 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-host\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/5f467c32-67b8-4e0a-b835-8b0933d2cc02-ovnkube-config\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295186 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-etc-openvswitch\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295193 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ff0b766-7823-4f42-b84c-f3c1ac91941c-agent-certs\") pod \"konnectivity-agent-gcm2d\" (UID: \"3ff0b766-7823-4f42-b84c-f3c1ac91941c\") " pod="kube-system/konnectivity-agent-gcm2d" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295235 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-etc-selinux\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-sysctl-conf\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295260 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-run\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295295 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-run\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295299 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-registration-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295322 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-sys\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-device-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295365 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/5f467c32-67b8-4e0a-b835-8b0933d2cc02-ovnkube-script-lib\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.295998 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295393 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-etc-selinux\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295391 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295437 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-sys\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-device-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295443 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295480 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-lib-modules\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295482 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295487 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-registration-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295506 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295525 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-cni-bin\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295561 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-log-socket\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295570 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295585 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5f467c32-67b8-4e0a-b835-8b0933d2cc02-ovn-node-metrics-cert\") pod \"ovnkube-node-bqmpm\" (UID: 
\"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295594 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5f467c32-67b8-4e0a-b835-8b0933d2cc02-ovnkube-config\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295598 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-lib-modules\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295613 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2qg5\" (UniqueName: \"kubernetes.io/projected/8fda692b-3c65-4165-b88c-ab992a58a369-kube-api-access-p2qg5\") pod \"node-ca-xx56q\" (UID: \"8fda692b-3c65-4165-b88c-ab992a58a369\") " pod="openshift-image-registry/node-ca-xx56q" Apr 17 09:11:09.296880 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295616 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-cni-bin\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295650 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-log-socket\") pod \"ovnkube-node-bqmpm\" (UID: 
\"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295676 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-run-netns\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295703 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-socket-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295738 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bj8nb\" (UniqueName: \"kubernetes.io/projected/52e471a9-821a-4d18-b636-4a8e2b41a8bc-kube-api-access-bj8nb\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295768 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-modprobe-d\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295793 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-node-log\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295801 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/52e471a9-821a-4d18-b636-4a8e2b41a8bc-socket-dir\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5f467c32-67b8-4e0a-b835-8b0933d2cc02-env-overrides\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295838 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-host-run-netns\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295847 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-sysctl-d\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295908 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-modprobe-d\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295970 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ff136071-cfdc-4409-a0d9-ed959a609894-etc-sysctl-d\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.295991 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5f467c32-67b8-4e0a-b835-8b0933d2cc02-node-log\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.296288 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5f467c32-67b8-4e0a-b835-8b0933d2cc02-env-overrides\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.297294 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff136071-cfdc-4409-a0d9-ed959a609894-tmp\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.297676 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.297534 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/ff136071-cfdc-4409-a0d9-ed959a609894-etc-tuned\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.298369 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.297916 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ff0b766-7823-4f42-b84c-f3c1ac91941c-agent-certs\") pod \"konnectivity-agent-gcm2d\" (UID: \"3ff0b766-7823-4f42-b84c-f3c1ac91941c\") " pod="kube-system/konnectivity-agent-gcm2d" Apr 17 09:11:09.298369 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.298044 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5f467c32-67b8-4e0a-b835-8b0933d2cc02-ovn-node-metrics-cert\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.303336 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.303309 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg92d\" (UniqueName: \"kubernetes.io/projected/5f467c32-67b8-4e0a-b835-8b0933d2cc02-kube-api-access-dg92d\") pod \"ovnkube-node-bqmpm\" (UID: \"5f467c32-67b8-4e0a-b835-8b0933d2cc02\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.303438 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.303334 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtsd2\" (UniqueName: \"kubernetes.io/projected/ff136071-cfdc-4409-a0d9-ed959a609894-kube-api-access-dtsd2\") pod \"tuned-x8vtz\" (UID: \"ff136071-cfdc-4409-a0d9-ed959a609894\") " pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.303652 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.303632 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2qg5\" 
(UniqueName: \"kubernetes.io/projected/8fda692b-3c65-4165-b88c-ab992a58a369-kube-api-access-p2qg5\") pod \"node-ca-xx56q\" (UID: \"8fda692b-3c65-4165-b88c-ab992a58a369\") " pod="openshift-image-registry/node-ca-xx56q" Apr 17 09:11:09.303953 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.303933 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj8nb\" (UniqueName: \"kubernetes.io/projected/52e471a9-821a-4d18-b636-4a8e2b41a8bc-kube-api-access-bj8nb\") pod \"aws-ebs-csi-driver-node-jfrbg\" (UID: \"52e471a9-821a-4d18-b636-4a8e2b41a8bc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.380898 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.380863 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2jtdp" Apr 17 09:11:09.388713 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.388689 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6nv4m" Apr 17 09:11:09.405447 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.405422 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8xtb4" Apr 17 09:11:09.410958 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.410940 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xswsv" Apr 17 09:11:09.418569 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.418548 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:11:09.428010 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.427989 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" Apr 17 09:11:09.435622 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.435598 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gcm2d" Apr 17 09:11:09.443273 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.443256 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" Apr 17 09:11:09.449762 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.449742 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xx56q" Apr 17 09:11:09.698461 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.698378 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd" Apr 17 09:11:09.698610 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.698494 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:09.698610 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.698553 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs podName:5029b845-d556-4306-b1bb-4c6373b7e4be nodeName:}" failed. No retries permitted until 2026-04-17 09:11:10.698533829 +0000 UTC m=+4.023351785 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs") pod "network-metrics-daemon-qpnfd" (UID: "5029b845-d556-4306-b1bb-4c6373b7e4be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:09.798919 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:09.798885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzqk\" (UniqueName: \"kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk\") pod \"network-check-target-qqb8r\" (UID: \"0b55fe37-b595-4cdf-a226-39f50d91d206\") " pod="openshift-network-diagnostics/network-check-target-qqb8r" Apr 17 09:11:09.799072 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.799044 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:09.799072 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.799067 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:09.799209 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.799077 2567 projected.go:194] Error preparing data for projected volume kube-api-access-qtzqk for pod openshift-network-diagnostics/network-check-target-qqb8r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:09.799209 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:09.799145 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk podName:0b55fe37-b595-4cdf-a226-39f50d91d206 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:11:10.799117581 +0000 UTC m=+4.123935535 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qtzqk" (UniqueName: "kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk") pod "network-check-target-qqb8r" (UID: "0b55fe37-b595-4cdf-a226-39f50d91d206") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:09.923784 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:09.923761 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52e471a9_821a_4d18_b636_4a8e2b41a8bc.slice/crio-d6154ebe16a1490f35c106fae3b73abd219665d53e670e92ab824611acbb5838 WatchSource:0}: Error finding container d6154ebe16a1490f35c106fae3b73abd219665d53e670e92ab824611acbb5838: Status 404 returned error can't find the container with id d6154ebe16a1490f35c106fae3b73abd219665d53e670e92ab824611acbb5838 Apr 17 09:11:09.925037 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:09.925011 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f93c6af_eabd_453e_9966_3199a8d4a534.slice/crio-ab884173ce7c598744102f31e9d2e8e015db6edd3f17aadee93a5a92c661a901 WatchSource:0}: Error finding container ab884173ce7c598744102f31e9d2e8e015db6edd3f17aadee93a5a92c661a901: Status 404 returned error can't find the container with id ab884173ce7c598744102f31e9d2e8e015db6edd3f17aadee93a5a92c661a901 Apr 17 09:11:09.926127 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:09.926100 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39de49a9_8aec_4d6b_a2e3_71a3ddfa6b0a.slice/crio-8f065bb2c89e070e20f928c093f3c2fbab8dfdeac7b665864d8eac5c886fbf15 WatchSource:0}: Error finding container 
8f065bb2c89e070e20f928c093f3c2fbab8dfdeac7b665864d8eac5c886fbf15: Status 404 returned error can't find the container with id 8f065bb2c89e070e20f928c093f3c2fbab8dfdeac7b665864d8eac5c886fbf15 Apr 17 09:11:09.927619 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:09.927523 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fda692b_3c65_4165_b88c_ab992a58a369.slice/crio-b2cf90defafa405d0f542f2168a8a4b0919c0f638a0a31c70ab323e1c264cd78 WatchSource:0}: Error finding container b2cf90defafa405d0f542f2168a8a4b0919c0f638a0a31c70ab323e1c264cd78: Status 404 returned error can't find the container with id b2cf90defafa405d0f542f2168a8a4b0919c0f638a0a31c70ab323e1c264cd78 Apr 17 09:11:09.929334 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:09.929311 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff136071_cfdc_4409_a0d9_ed959a609894.slice/crio-3c727b548593cb18e05442f310fcf21ef7e388c26fc316c92a4fffe829392acd WatchSource:0}: Error finding container 3c727b548593cb18e05442f310fcf21ef7e388c26fc316c92a4fffe829392acd: Status 404 returned error can't find the container with id 3c727b548593cb18e05442f310fcf21ef7e388c26fc316c92a4fffe829392acd Apr 17 09:11:09.930468 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:09.930447 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0d17817_9cb4_4adc_9cb1_ace0055c7639.slice/crio-85902fe5f942644ecd5118087fbc966f760d62a78a84a69b899791792aa028c9 WatchSource:0}: Error finding container 85902fe5f942644ecd5118087fbc966f760d62a78a84a69b899791792aa028c9: Status 404 returned error can't find the container with id 85902fe5f942644ecd5118087fbc966f760d62a78a84a69b899791792aa028c9 Apr 17 09:11:09.932541 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:09.932487 2567 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod528b8329_d2cd_4d99_8a3e_62f7af17b361.slice/crio-e266338dc017b2d191f1fb08e12287d85a88c8dfed3ca9dc2a45b02a9d4a7407 WatchSource:0}: Error finding container e266338dc017b2d191f1fb08e12287d85a88c8dfed3ca9dc2a45b02a9d4a7407: Status 404 returned error can't find the container with id e266338dc017b2d191f1fb08e12287d85a88c8dfed3ca9dc2a45b02a9d4a7407
Apr 17 09:11:09.932776 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:09.932753    2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ff0b766_7823_4f42_b84c_f3c1ac91941c.slice/crio-5d524de6b5887b2c3981194b2839fb787355b0f4b89e74a41a6569599d7ee54c WatchSource:0}: Error finding container 5d524de6b5887b2c3981194b2839fb787355b0f4b89e74a41a6569599d7ee54c: Status 404 returned error can't find the container with id 5d524de6b5887b2c3981194b2839fb787355b0f4b89e74a41a6569599d7ee54c
Apr 17 09:11:09.933704 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:09.933672    2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f467c32_67b8_4e0a_b835_8b0933d2cc02.slice/crio-50a871b314615992e290172b4cc5f857f11be0aa428165b71c42043e065bc6a2 WatchSource:0}: Error finding container 50a871b314615992e290172b4cc5f857f11be0aa428165b71c42043e065bc6a2: Status 404 returned error can't find the container with id 50a871b314615992e290172b4cc5f857f11be0aa428165b71c42043e065bc6a2
Apr 17 09:11:10.115875 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.115845    2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 09:06:08 +0000 UTC" deadline="2027-10-24 12:43:22.20052265 +0000 UTC"
Apr 17 09:11:10.115875 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.115872    2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13323h32m12.084653291s"
Apr 17 09:11:10.257895 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.257821    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-212.ec2.internal" event={"ID":"649f656ea529e6ecb6cb9249a18750c7","Type":"ContainerStarted","Data":"2e9e090b2698e9fecc2086d95bfe7e918fb74372f3e4cfca9e6366402194fc4c"}
Apr 17 09:11:10.258844 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.258819    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" event={"ID":"5f467c32-67b8-4e0a-b835-8b0933d2cc02","Type":"ContainerStarted","Data":"50a871b314615992e290172b4cc5f857f11be0aa428165b71c42043e065bc6a2"}
Apr 17 09:11:10.259840 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.259815    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gcm2d" event={"ID":"3ff0b766-7823-4f42-b84c-f3c1ac91941c","Type":"ContainerStarted","Data":"5d524de6b5887b2c3981194b2839fb787355b0f4b89e74a41a6569599d7ee54c"}
Apr 17 09:11:10.260841 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.260818    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xswsv" event={"ID":"a0d17817-9cb4-4adc-9cb1-ace0055c7639","Type":"ContainerStarted","Data":"85902fe5f942644ecd5118087fbc966f760d62a78a84a69b899791792aa028c9"}
Apr 17 09:11:10.261813 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.261781    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xx56q" event={"ID":"8fda692b-3c65-4165-b88c-ab992a58a369","Type":"ContainerStarted","Data":"b2cf90defafa405d0f542f2168a8a4b0919c0f638a0a31c70ab323e1c264cd78"}
Apr 17 09:11:10.262758 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.262736    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6nv4m" event={"ID":"528b8329-d2cd-4d99-8a3e-62f7af17b361","Type":"ContainerStarted","Data":"e266338dc017b2d191f1fb08e12287d85a88c8dfed3ca9dc2a45b02a9d4a7407"}
Apr 17 09:11:10.263654 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.263635    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" event={"ID":"ff136071-cfdc-4409-a0d9-ed959a609894","Type":"ContainerStarted","Data":"3c727b548593cb18e05442f310fcf21ef7e388c26fc316c92a4fffe829392acd"}
Apr 17 09:11:10.264547 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.264521    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8xtb4" event={"ID":"4f93c6af-eabd-453e-9966-3199a8d4a534","Type":"ContainerStarted","Data":"ab884173ce7c598744102f31e9d2e8e015db6edd3f17aadee93a5a92c661a901"}
Apr 17 09:11:10.267732 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.267705    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2jtdp" event={"ID":"39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a","Type":"ContainerStarted","Data":"8f065bb2c89e070e20f928c093f3c2fbab8dfdeac7b665864d8eac5c886fbf15"}
Apr 17 09:11:10.268815 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.268798    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" event={"ID":"52e471a9-821a-4d18-b636-4a8e2b41a8bc","Type":"ContainerStarted","Data":"d6154ebe16a1490f35c106fae3b73abd219665d53e670e92ab824611acbb5838"}
Apr 17 09:11:10.271185 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.271126    2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-212.ec2.internal" podStartSLOduration=2.271115499 podStartE2EDuration="2.271115499s" podCreationTimestamp="2026-04-17 09:11:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:11:10.270841562 +0000 UTC m=+3.595659550" watchObservedRunningTime="2026-04-17 09:11:10.271115499 +0000 UTC m=+3.595933475"
Apr 17 09:11:10.706246 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.705694    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:10.706246 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:10.705814    2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:10.706246 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:10.705870    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs podName:5029b845-d556-4306-b1bb-4c6373b7e4be nodeName:}" failed. No retries permitted until 2026-04-17 09:11:12.70585385 +0000 UTC m=+6.030671809 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs") pod "network-metrics-daemon-qpnfd" (UID: "5029b845-d556-4306-b1bb-4c6373b7e4be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:10.796506 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.796479    2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:11:10.806153 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:10.806113    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzqk\" (UniqueName: \"kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk\") pod \"network-check-target-qqb8r\" (UID: \"0b55fe37-b595-4cdf-a226-39f50d91d206\") " pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:10.806311 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:10.806295    2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 09:11:10.806377 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:10.806318    2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 09:11:10.806377 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:10.806330    2567 projected.go:194] Error preparing data for projected volume kube-api-access-qtzqk for pod openshift-network-diagnostics/network-check-target-qqb8r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:10.806470 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:10.806384    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk podName:0b55fe37-b595-4cdf-a226-39f50d91d206 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:12.806365997 +0000 UTC m=+6.131183963 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qtzqk" (UniqueName: "kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk") pod "network-check-target-qqb8r" (UID: "0b55fe37-b595-4cdf-a226-39f50d91d206") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:11.248940 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:11.248573    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:11.248940 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:11.248667    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206"
Apr 17 09:11:11.248940 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:11.248584    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:11.248940 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:11.248801    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:11.287834 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:11.287089    2567 generic.go:358] "Generic (PLEG): container finished" podID="303a407608e25060cece5ea7945f6cb8" containerID="6e503fe73d31d262bff353b566ccce315ec1a4065f760369361dce25c8b8d6a9" exitCode=0
Apr 17 09:11:11.287834 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:11.287617    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal" event={"ID":"303a407608e25060cece5ea7945f6cb8","Type":"ContainerDied","Data":"6e503fe73d31d262bff353b566ccce315ec1a4065f760369361dce25c8b8d6a9"}
Apr 17 09:11:12.297462 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:12.297417    2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal" event={"ID":"303a407608e25060cece5ea7945f6cb8","Type":"ContainerStarted","Data":"7978f8457023e821842a488031b6d951036cf1767e09e0803b3c80573620c9e1"}
Apr 17 09:11:12.312486 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:12.311893    2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-212.ec2.internal" podStartSLOduration=4.311874703 podStartE2EDuration="4.311874703s" podCreationTimestamp="2026-04-17 09:11:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:11:12.311536356 +0000 UTC m=+5.636354357" watchObservedRunningTime="2026-04-17 09:11:12.311874703 +0000 UTC m=+5.636692681"
Apr 17 09:11:12.722621 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:12.722517    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:12.722778 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:12.722701    2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:12.722778 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:12.722759    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs podName:5029b845-d556-4306-b1bb-4c6373b7e4be nodeName:}" failed. No retries permitted until 2026-04-17 09:11:16.722740018 +0000 UTC m=+10.047557978 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs") pod "network-metrics-daemon-qpnfd" (UID: "5029b845-d556-4306-b1bb-4c6373b7e4be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:12.823593 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:12.823554    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzqk\" (UniqueName: \"kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk\") pod \"network-check-target-qqb8r\" (UID: \"0b55fe37-b595-4cdf-a226-39f50d91d206\") " pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:12.823764 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:12.823751    2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 09:11:12.823844 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:12.823769    2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 09:11:12.823844 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:12.823783    2567 projected.go:194] Error preparing data for projected volume kube-api-access-qtzqk for pod openshift-network-diagnostics/network-check-target-qqb8r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:12.823844 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:12.823837    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk podName:0b55fe37-b595-4cdf-a226-39f50d91d206 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:16.823819672 +0000 UTC m=+10.148637626 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qtzqk" (UniqueName: "kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk") pod "network-check-target-qqb8r" (UID: "0b55fe37-b595-4cdf-a226-39f50d91d206") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:13.248280 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:13.248246    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:13.248448 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:13.248255    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:13.248448 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:13.248406    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:13.248448 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:13.248423    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206"
Apr 17 09:11:15.248622 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:15.248590    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:15.249056 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:15.248721    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:15.249056 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:15.248759    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:15.249056 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:15.248828    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206"
Apr 17 09:11:15.844752 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:15.844039    2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vqkm2"]
Apr 17 09:11:15.850329 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:15.849956    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:15.850329 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:15.850031    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a"
Apr 17 09:11:15.951250 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:15.951059    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cad943ff-ebd7-4dca-aac3-600408e2153a-dbus\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:15.951250 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:15.951109    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:15.951250 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:15.951216    2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cad943ff-ebd7-4dca-aac3-600408e2153a-kubelet-config\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:16.052845 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:16.052567    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cad943ff-ebd7-4dca-aac3-600408e2153a-dbus\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:16.052845 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:16.052619    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:16.052845 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:16.052710    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cad943ff-ebd7-4dca-aac3-600408e2153a-kubelet-config\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:16.052845 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:16.052794    2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cad943ff-ebd7-4dca-aac3-600408e2153a-dbus\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:16.052845 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:16.052813    2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cad943ff-ebd7-4dca-aac3-600408e2153a-kubelet-config\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:16.052845 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:16.052818    2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:16.053279 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:16.052886    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret podName:cad943ff-ebd7-4dca-aac3-600408e2153a nodeName:}" failed. No retries permitted until 2026-04-17 09:11:16.552867888 +0000 UTC m=+9.877685846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret") pod "global-pull-secret-syncer-vqkm2" (UID: "cad943ff-ebd7-4dca-aac3-600408e2153a") : object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:16.558294 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:16.557829    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:16.558294 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:16.558036    2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:16.558294 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:16.558098    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret podName:cad943ff-ebd7-4dca-aac3-600408e2153a nodeName:}" failed. No retries permitted until 2026-04-17 09:11:17.558079633 +0000 UTC m=+10.882897587 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret") pod "global-pull-secret-syncer-vqkm2" (UID: "cad943ff-ebd7-4dca-aac3-600408e2153a") : object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:16.759356 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:16.759314    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:16.759536 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:16.759476    2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:16.759595 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:16.759557    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs podName:5029b845-d556-4306-b1bb-4c6373b7e4be nodeName:}" failed. No retries permitted until 2026-04-17 09:11:24.759535203 +0000 UTC m=+18.084353164 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs") pod "network-metrics-daemon-qpnfd" (UID: "5029b845-d556-4306-b1bb-4c6373b7e4be") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:11:16.859985 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:16.859900    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzqk\" (UniqueName: \"kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk\") pod \"network-check-target-qqb8r\" (UID: \"0b55fe37-b595-4cdf-a226-39f50d91d206\") " pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:16.860159 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:16.860094    2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 09:11:16.860159 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:16.860112    2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 09:11:16.860159 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:16.860126    2567 projected.go:194] Error preparing data for projected volume kube-api-access-qtzqk for pod openshift-network-diagnostics/network-check-target-qqb8r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:16.860328 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:16.860199    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk podName:0b55fe37-b595-4cdf-a226-39f50d91d206 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:24.86018241 +0000 UTC m=+18.185000370 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qtzqk" (UniqueName: "kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk") pod "network-check-target-qqb8r" (UID: "0b55fe37-b595-4cdf-a226-39f50d91d206") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:11:17.249683 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:17.248877    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:17.249683 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:17.248993    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206"
Apr 17 09:11:17.249683 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:17.249049    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:17.249683 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:17.249127    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:17.249683 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:17.249504    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:17.249683 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:17.249598    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a"
Apr 17 09:11:17.565556 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:17.565415    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:17.566025 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:17.565609    2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:17.566025 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:17.565701    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret podName:cad943ff-ebd7-4dca-aac3-600408e2153a nodeName:}" failed. No retries permitted until 2026-04-17 09:11:19.565667496 +0000 UTC m=+12.890485452 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret") pod "global-pull-secret-syncer-vqkm2" (UID: "cad943ff-ebd7-4dca-aac3-600408e2153a") : object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:19.247724 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:19.247635    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:19.247724 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:19.247670    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:19.248228 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:19.247648    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:19.248228 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:19.247761    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206"
Apr 17 09:11:19.248228 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:19.247840    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a"
Apr 17 09:11:19.248228 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:19.247904    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:19.582088 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:19.582015    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:19.582254 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:19.582176    2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:19.582315 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:19.582259    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret podName:cad943ff-ebd7-4dca-aac3-600408e2153a nodeName:}" failed. No retries permitted until 2026-04-17 09:11:23.582238009 +0000 UTC m=+16.907055973 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret") pod "global-pull-secret-syncer-vqkm2" (UID: "cad943ff-ebd7-4dca-aac3-600408e2153a") : object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:21.247853 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:21.247822    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:21.248302 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:21.247822    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:21.248302 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:21.247958    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:21.248302 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:21.247830    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:21.248302 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:21.248046    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a"
Apr 17 09:11:21.248302 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:21.248235    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206"
Apr 17 09:11:23.247865 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:23.247824    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:23.248316 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:23.247824    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:23.248316 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:23.247968    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:23.248316 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:23.248011    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206"
Apr 17 09:11:23.248316 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:23.247824    2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:23.248316 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:23.248093    2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a"
Apr 17 09:11:23.612977 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:23.612878    2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:23.613124 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:23.613051    2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 09:11:23.613124 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:23.613116    2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret podName:cad943ff-ebd7-4dca-aac3-600408e2153a nodeName:}" failed. No retries permitted until 2026-04-17 09:11:31.613097338 +0000 UTC m=+24.937915299 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret") pod "global-pull-secret-syncer-vqkm2" (UID: "cad943ff-ebd7-4dca-aac3-600408e2153a") : object "kube-system"/"original-pull-secret" not registered Apr 17 09:11:24.822481 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:24.822446 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd" Apr 17 09:11:24.822982 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:24.822593 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:24.822982 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:24.822666 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs podName:5029b845-d556-4306-b1bb-4c6373b7e4be nodeName:}" failed. No retries permitted until 2026-04-17 09:11:40.822646717 +0000 UTC m=+34.147464671 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs") pod "network-metrics-daemon-qpnfd" (UID: "5029b845-d556-4306-b1bb-4c6373b7e4be") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:11:24.922928 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:24.922891 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzqk\" (UniqueName: \"kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk\") pod \"network-check-target-qqb8r\" (UID: \"0b55fe37-b595-4cdf-a226-39f50d91d206\") " pod="openshift-network-diagnostics/network-check-target-qqb8r" Apr 17 09:11:24.923131 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:24.923092 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:11:24.923131 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:24.923123 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:11:24.923348 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:24.923151 2567 projected.go:194] Error preparing data for projected volume kube-api-access-qtzqk for pod openshift-network-diagnostics/network-check-target-qqb8r: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:24.923348 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:24.923218 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk podName:0b55fe37-b595-4cdf-a226-39f50d91d206 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:11:40.923199089 +0000 UTC m=+34.248017046 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qtzqk" (UniqueName: "kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk") pod "network-check-target-qqb8r" (UID: "0b55fe37-b595-4cdf-a226-39f50d91d206") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:11:25.248661 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:25.248627 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r" Apr 17 09:11:25.248853 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:25.248627 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd" Apr 17 09:11:25.248853 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:25.248760 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206" Apr 17 09:11:25.248943 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:25.248866 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be" Apr 17 09:11:25.248943 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:25.248635 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2" Apr 17 09:11:25.249026 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:25.248955 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a" Apr 17 09:11:27.249548 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:27.249308 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r" Apr 17 09:11:27.249548 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:27.249512 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd" Apr 17 09:11:27.249548 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:27.249482 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2" Apr 17 09:11:27.249916 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:27.249630 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be" Apr 17 09:11:27.249916 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:27.249721 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a" Apr 17 09:11:27.249916 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:27.249830 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206" Apr 17 09:11:27.320916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:27.320884 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xx56q" event={"ID":"8fda692b-3c65-4165-b88c-ab992a58a369","Type":"ContainerStarted","Data":"ff1a0fb8fd56e8577b630ac7d941fc675063a6874bd47e5cfa8dfdaabddd198e"} Apr 17 09:11:27.322031 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:27.322007 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" event={"ID":"ff136071-cfdc-4409-a0d9-ed959a609894","Type":"ContainerStarted","Data":"d95b61d14b23b2febbcf75ed7d4125c007ca4e7eb887424a63d33ca725d51279"} Apr 17 09:11:27.323124 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:27.323096 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8xtb4" 
event={"ID":"4f93c6af-eabd-453e-9966-3199a8d4a534","Type":"ContainerStarted","Data":"8edc55b6d9cec50ed4e98c350b4d2d94a0119c8d6353735e90d11792aff6607e"} Apr 17 09:11:27.324235 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:27.324213 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" event={"ID":"52e471a9-821a-4d18-b636-4a8e2b41a8bc","Type":"ContainerStarted","Data":"ec9105167560de5f9ae36cf89a4aaf76814b76f78bde78e718d6483c48595fea"} Apr 17 09:11:27.325443 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:27.325424 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gcm2d" event={"ID":"3ff0b766-7823-4f42-b84c-f3c1ac91941c","Type":"ContainerStarted","Data":"69e517ff51481be11b7dd60cb16d48fe63daa2f1ecc5f3d89dfbaef00559a497"} Apr 17 09:11:27.340578 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:27.340528 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xx56q" podStartSLOduration=3.307872606 podStartE2EDuration="20.340511392s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:11:09.929416987 +0000 UTC m=+3.254234941" lastFinishedPulling="2026-04-17 09:11:26.962055765 +0000 UTC m=+20.286873727" observedRunningTime="2026-04-17 09:11:27.339713779 +0000 UTC m=+20.664531755" watchObservedRunningTime="2026-04-17 09:11:27.340511392 +0000 UTC m=+20.665329369" Apr 17 09:11:27.361264 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:27.361226 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-x8vtz" podStartSLOduration=3.351827863 podStartE2EDuration="20.361213205s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:11:09.931429782 +0000 UTC m=+3.256247740" lastFinishedPulling="2026-04-17 09:11:26.940815125 +0000 UTC m=+20.265633082" observedRunningTime="2026-04-17 
09:11:27.361178593 +0000 UTC m=+20.685996568" watchObservedRunningTime="2026-04-17 09:11:27.361213205 +0000 UTC m=+20.686031230" Apr 17 09:11:27.377833 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:27.377801 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8xtb4" podStartSLOduration=3.342618895 podStartE2EDuration="20.377790063s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:11:09.926876132 +0000 UTC m=+3.251694103" lastFinishedPulling="2026-04-17 09:11:26.962047313 +0000 UTC m=+20.286865271" observedRunningTime="2026-04-17 09:11:27.377764027 +0000 UTC m=+20.702582004" watchObservedRunningTime="2026-04-17 09:11:27.377790063 +0000 UTC m=+20.702608039" Apr 17 09:11:28.329016 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.328719 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6nv4m" event={"ID":"528b8329-d2cd-4d99-8a3e-62f7af17b361","Type":"ContainerStarted","Data":"264bfdbd024b29baaa5a2dbf23e3fcbb05731b0128674ca28150c26d16e0ea37"} Apr 17 09:11:28.331715 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.331685 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" event={"ID":"5f467c32-67b8-4e0a-b835-8b0933d2cc02","Type":"ContainerStarted","Data":"e552c4a8a52cca9a83944b09bb0d9695cba385ebad122f7974963ac6d102d11d"} Apr 17 09:11:28.331845 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.331722 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" event={"ID":"5f467c32-67b8-4e0a-b835-8b0933d2cc02","Type":"ContainerStarted","Data":"e59b131fcf3ec60ff49e946338668eed13759f651f0c60db649045e860fabae2"} Apr 17 09:11:28.331845 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.331735 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" 
event={"ID":"5f467c32-67b8-4e0a-b835-8b0933d2cc02","Type":"ContainerStarted","Data":"e896a0f9e69d60acf0ba32b70013f512c88b1d07675585eca5f8d3c8a7fa878f"} Apr 17 09:11:28.331845 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.331746 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" event={"ID":"5f467c32-67b8-4e0a-b835-8b0933d2cc02","Type":"ContainerStarted","Data":"e2738c177820454f00255fd06f02a1d594189fe546c09b7ea18f612f0e9e7e89"} Apr 17 09:11:28.331845 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.331756 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" event={"ID":"5f467c32-67b8-4e0a-b835-8b0933d2cc02","Type":"ContainerStarted","Data":"c2f0d3eb9f8ac178db4f34d804ebece3fc9c760c9e9b2e14ac00f7907744f3d1"} Apr 17 09:11:28.331845 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.331770 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" event={"ID":"5f467c32-67b8-4e0a-b835-8b0933d2cc02","Type":"ContainerStarted","Data":"651bfc327c5af2192c9fbcc09786f41d58662df4a925ba94b1a9789a71d405f9"} Apr 17 09:11:28.333167 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.333119 2567 generic.go:358] "Generic (PLEG): container finished" podID="a0d17817-9cb4-4adc-9cb1-ace0055c7639" containerID="df34ec9ec8d0fa391f6f9bbddf85aab25c2805a398e10481d3bf0ff7f3810240" exitCode=0 Apr 17 09:11:28.333279 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.333168 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xswsv" event={"ID":"a0d17817-9cb4-4adc-9cb1-ace0055c7639","Type":"ContainerDied","Data":"df34ec9ec8d0fa391f6f9bbddf85aab25c2805a398e10481d3bf0ff7f3810240"} Apr 17 09:11:28.350222 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.350184 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-gcm2d" 
podStartSLOduration=4.324152306 podStartE2EDuration="21.350172102s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:11:09.935938705 +0000 UTC m=+3.260756672" lastFinishedPulling="2026-04-17 09:11:26.961958507 +0000 UTC m=+20.286776468" observedRunningTime="2026-04-17 09:11:27.401458266 +0000 UTC m=+20.726276241" watchObservedRunningTime="2026-04-17 09:11:28.350172102 +0000 UTC m=+21.674990077" Apr 17 09:11:28.350436 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.350416 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6nv4m" podStartSLOduration=4.029478402 podStartE2EDuration="21.350411798s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:11:09.933976826 +0000 UTC m=+3.258794779" lastFinishedPulling="2026-04-17 09:11:27.254910207 +0000 UTC m=+20.579728175" observedRunningTime="2026-04-17 09:11:28.348515392 +0000 UTC m=+21.673333368" watchObservedRunningTime="2026-04-17 09:11:28.350411798 +0000 UTC m=+21.675229773" Apr 17 09:11:28.582441 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:28.582390 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 09:11:29.153381 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:29.153274 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T09:11:28.582409099Z","UUID":"85f44299-c529-47b7-ae7d-016e43729753","Handler":null,"Name":"","Endpoint":""} Apr 17 09:11:29.158553 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:29.157098 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 09:11:29.158553 ip-10-0-128-212 kubenswrapper[2567]: I0417 
09:11:29.157130 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 09:11:29.248915 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:29.248828 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd" Apr 17 09:11:29.248915 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:29.248855 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2" Apr 17 09:11:29.248915 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:29.248886 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r" Apr 17 09:11:29.249192 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:29.248956 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a" Apr 17 09:11:29.249192 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:29.249009 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206" Apr 17 09:11:29.249192 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:29.249104 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be" Apr 17 09:11:29.337180 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:29.337104 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2jtdp" event={"ID":"39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a","Type":"ContainerStarted","Data":"3a13ee22a92b267dd3baf92f4af44db1b9c29865a6b184b4bb167d43210b733a"} Apr 17 09:11:29.339074 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:29.339044 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" event={"ID":"52e471a9-821a-4d18-b636-4a8e2b41a8bc","Type":"ContainerStarted","Data":"b8b7bd92dc0875c4b9cb7fcbeb07b70c5c380da6bbfef05a6fe3887c52e4b7a2"} Apr 17 09:11:29.353357 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:29.353307 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2jtdp" podStartSLOduration=5.314009368 podStartE2EDuration="22.353292367s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:11:09.927979281 +0000 UTC m=+3.252797239" lastFinishedPulling="2026-04-17 09:11:26.967262281 +0000 UTC m=+20.292080238" observedRunningTime="2026-04-17 09:11:29.352995184 +0000 UTC m=+22.677813160" watchObservedRunningTime="2026-04-17 09:11:29.353292367 +0000 UTC m=+22.678110346" Apr 17 09:11:30.344187 ip-10-0-128-212 kubenswrapper[2567]: 
I0417 09:11:30.343949 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" event={"ID":"5f467c32-67b8-4e0a-b835-8b0933d2cc02","Type":"ContainerStarted","Data":"c600852b6d2e0069f0bb1f35e4c15f6fa2098a1d257dcc9ed6afcd41234076af"} Apr 17 09:11:30.346032 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:30.346001 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" event={"ID":"52e471a9-821a-4d18-b636-4a8e2b41a8bc","Type":"ContainerStarted","Data":"484415b0dbae09e7b67af7e82621e28bde1f275b8c740c318ab2f1bc525975d0"} Apr 17 09:11:30.365909 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:30.365833 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jfrbg" podStartSLOduration=3.879818184 podStartE2EDuration="23.365819583s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:11:09.925740896 +0000 UTC m=+3.250558862" lastFinishedPulling="2026-04-17 09:11:29.411742305 +0000 UTC m=+22.736560261" observedRunningTime="2026-04-17 09:11:30.365283977 +0000 UTC m=+23.690101953" watchObservedRunningTime="2026-04-17 09:11:30.365819583 +0000 UTC m=+23.690637558" Apr 17 09:11:30.598948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:30.598920 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gcm2d" Apr 17 09:11:30.599531 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:30.599512 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gcm2d" Apr 17 09:11:31.248149 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:31.248060 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2" Apr 17 09:11:31.248149 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:31.248079 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd" Apr 17 09:11:31.248376 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:31.248060 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r" Apr 17 09:11:31.248376 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:31.248203 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a" Apr 17 09:11:31.248376 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:31.248359 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206" Apr 17 09:11:31.248505 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:31.248469 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be" Apr 17 09:11:31.675187 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:31.675083 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2" Apr 17 09:11:31.675820 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:31.675288 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 09:11:31.675820 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:31.675360 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret podName:cad943ff-ebd7-4dca-aac3-600408e2153a nodeName:}" failed. No retries permitted until 2026-04-17 09:11:47.675338235 +0000 UTC m=+41.000156188 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret") pod "global-pull-secret-syncer-vqkm2" (UID: "cad943ff-ebd7-4dca-aac3-600408e2153a") : object "kube-system"/"original-pull-secret" not registered Apr 17 09:11:32.916896 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:32.916700 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gcm2d" Apr 17 09:11:32.917521 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:32.917010 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 09:11:32.917521 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:32.917366 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gcm2d" Apr 17 09:11:33.247617 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:33.247590 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r" Apr 17 09:11:33.247764 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:33.247704 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206" Apr 17 09:11:33.247803 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:33.247782 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:33.247897 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:33.247879 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:33.247929 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:33.247922 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:33.248017 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:33.247990 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a"
Apr 17 09:11:33.355098 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:33.355065 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" event={"ID":"5f467c32-67b8-4e0a-b835-8b0933d2cc02","Type":"ContainerStarted","Data":"b5ad681bee27f1958564c90ae5d220764ce32cdd08b4e0c1cc210cda8c09043c"}
Apr 17 09:11:33.355560 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:33.355527 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm"
Apr 17 09:11:33.355560 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:33.355563 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm"
Apr 17 09:11:33.357596 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:33.357573 2567 generic.go:358] "Generic (PLEG): container finished" podID="a0d17817-9cb4-4adc-9cb1-ace0055c7639" containerID="742fbf6de9d9c3496822b0ef440dd7540485d0c38d12aee43d836bf0b69b79fd" exitCode=0
Apr 17 09:11:33.357790 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:33.357717 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xswsv" event={"ID":"a0d17817-9cb4-4adc-9cb1-ace0055c7639","Type":"ContainerDied","Data":"742fbf6de9d9c3496822b0ef440dd7540485d0c38d12aee43d836bf0b69b79fd"}
Apr 17 09:11:33.371212 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:33.371192 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm"
Apr 17 09:11:33.371294 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:33.371256 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm"
Apr 17 09:11:33.382808 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:33.382771 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" podStartSLOduration=8.581718932 podStartE2EDuration="26.382760202s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:11:09.936757459 +0000 UTC m=+3.261575427" lastFinishedPulling="2026-04-17 09:11:27.737798739 +0000 UTC m=+21.062616697" observedRunningTime="2026-04-17 09:11:33.38227275 +0000 UTC m=+26.707090726" watchObservedRunningTime="2026-04-17 09:11:33.382760202 +0000 UTC m=+26.707578181"
Apr 17 09:11:34.360806 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:34.360541 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qpnfd"]
Apr 17 09:11:34.361277 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:34.360859 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:34.361277 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:34.360967 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:34.361277 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:34.361207 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vqkm2"]
Apr 17 09:11:34.361455 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:34.361307 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:34.361455 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:34.361402 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a"
Apr 17 09:11:34.362842 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:34.362365 2567 generic.go:358] "Generic (PLEG): container finished" podID="a0d17817-9cb4-4adc-9cb1-ace0055c7639" containerID="5aa298ef97d31c86e0525547f173d7a2f7a7f5ad3a2a7213477d38098bdb0825" exitCode=0
Apr 17 09:11:34.362842 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:34.362442 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xswsv" event={"ID":"a0d17817-9cb4-4adc-9cb1-ace0055c7639","Type":"ContainerDied","Data":"5aa298ef97d31c86e0525547f173d7a2f7a7f5ad3a2a7213477d38098bdb0825"}
Apr 17 09:11:34.362842 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:34.362598 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 09:11:34.363061 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:34.362877 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qqb8r"]
Apr 17 09:11:34.363061 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:34.363013 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:34.363617 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:34.363107 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206"
Apr 17 09:11:35.365966 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:35.365886 2567 generic.go:358] "Generic (PLEG): container finished" podID="a0d17817-9cb4-4adc-9cb1-ace0055c7639" containerID="fe2087e0ff062e16cf579bc4ae7593dd3c3554877493a865bd894cb058c5f00a" exitCode=0
Apr 17 09:11:35.366284 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:35.365975 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xswsv" event={"ID":"a0d17817-9cb4-4adc-9cb1-ace0055c7639","Type":"ContainerDied","Data":"fe2087e0ff062e16cf579bc4ae7593dd3c3554877493a865bd894cb058c5f00a"}
Apr 17 09:11:35.366284 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:35.366206 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 09:11:36.248339 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:36.248236 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:36.248339 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:36.248269 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:36.248339 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:36.248269 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:36.248780 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:36.248744 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a"
Apr 17 09:11:36.248987 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:36.248968 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206"
Apr 17 09:11:36.251459 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:36.249168 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:36.861999 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:36.861969 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm"
Apr 17 09:11:38.248238 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:38.248208 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:38.248238 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:38.248231 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:38.248932 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:38.248266 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:38.248932 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:38.248377 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:11:38.248932 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:38.248433 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqkm2" podUID="cad943ff-ebd7-4dca-aac3-600408e2153a"
Apr 17 09:11:38.248932 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:38.248473 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqb8r" podUID="0b55fe37-b595-4cdf-a226-39f50d91d206"
Apr 17 09:11:40.022698 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.022622 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-212.ec2.internal" event="NodeReady"
Apr 17 09:11:40.023126 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.022792 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 09:11:40.063677 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.063645 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x7pf8"]
Apr 17 09:11:40.093738 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.093710 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4m6ml"]
Apr 17 09:11:40.093901 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.093884 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.096939 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.096911 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 09:11:40.097065 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.096979 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 09:11:40.097065 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.097016 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hv2j9\""
Apr 17 09:11:40.114402 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.114384 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x7pf8"]
Apr 17 09:11:40.114526 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.114419 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4m6ml"]
Apr 17 09:11:40.114526 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.114433 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4m6ml"
Apr 17 09:11:40.117465 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.117447 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hdtpc\""
Apr 17 09:11:40.117551 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.117499 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 09:11:40.117551 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.117541 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 09:11:40.117696 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.117680 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 09:11:40.245453 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.245423 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03aa1efb-f86a-42c4-b326-2a97e8287120-tmp-dir\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.245453 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.245462 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03aa1efb-f86a-42c4-b326-2a97e8287120-config-volume\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.245674 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.245488 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6sgs\" (UniqueName: \"kubernetes.io/projected/f7e6e539-32e6-4df0-9447-66f765e64434-kube-api-access-q6sgs\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml"
Apr 17 09:11:40.245674 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.245579 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.245674 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.245615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml"
Apr 17 09:11:40.245793 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.245694 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r244\" (UniqueName: \"kubernetes.io/projected/03aa1efb-f86a-42c4-b326-2a97e8287120-kube-api-access-2r244\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.248478 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.248447 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:40.248478 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.248462 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:40.248654 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.248449 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:40.251917 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.251860 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 09:11:40.252126 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.252107 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 09:11:40.252126 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.252107 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wbqzc\""
Apr 17 09:11:40.252313 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.252115 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 09:11:40.252313 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.252228 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 09:11:40.252454 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.252440 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7jx4l\""
Apr 17 09:11:40.346479 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.346397 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2r244\" (UniqueName: \"kubernetes.io/projected/03aa1efb-f86a-42c4-b326-2a97e8287120-kube-api-access-2r244\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.346630 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.346481 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03aa1efb-f86a-42c4-b326-2a97e8287120-tmp-dir\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.346630 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.346515 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03aa1efb-f86a-42c4-b326-2a97e8287120-config-volume\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.346630 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.346543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6sgs\" (UniqueName: \"kubernetes.io/projected/f7e6e539-32e6-4df0-9447-66f765e64434-kube-api-access-q6sgs\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml"
Apr 17 09:11:40.346775 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.346654 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.346775 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.346702 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml"
Apr 17 09:11:40.346872 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.346769 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/03aa1efb-f86a-42c4-b326-2a97e8287120-tmp-dir\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.346872 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:40.346795 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:40.346872 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:40.346823 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:40.346872 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:40.346868 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls podName:03aa1efb-f86a-42c4-b326-2a97e8287120 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:40.846846503 +0000 UTC m=+34.171664470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls") pod "dns-default-x7pf8" (UID: "03aa1efb-f86a-42c4-b326-2a97e8287120") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:40.347044 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:40.346888 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert podName:f7e6e539-32e6-4df0-9447-66f765e64434 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:40.846878446 +0000 UTC m=+34.171696411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert") pod "ingress-canary-4m6ml" (UID: "f7e6e539-32e6-4df0-9447-66f765e64434") : secret "canary-serving-cert" not found
Apr 17 09:11:40.357558 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.357535 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r244\" (UniqueName: \"kubernetes.io/projected/03aa1efb-f86a-42c4-b326-2a97e8287120-kube-api-access-2r244\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.357766 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.357739 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6sgs\" (UniqueName: \"kubernetes.io/projected/f7e6e539-32e6-4df0-9447-66f765e64434-kube-api-access-q6sgs\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml"
Apr 17 09:11:40.357885 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.357854 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03aa1efb-f86a-42c4-b326-2a97e8287120-config-volume\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.850585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.850548 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:40.850585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.850589 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml"
Apr 17 09:11:40.850882 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.850648 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:11:40.850882 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:40.850713 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:40.850882 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:40.850764 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 09:11:40.850882 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:40.850769 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:40.850882 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:40.850792 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls podName:03aa1efb-f86a-42c4-b326-2a97e8287120 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:41.850771808 +0000 UTC m=+35.175589768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls") pod "dns-default-x7pf8" (UID: "03aa1efb-f86a-42c4-b326-2a97e8287120") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:40.850882 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:40.850815 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs podName:5029b845-d556-4306-b1bb-4c6373b7e4be nodeName:}" failed. No retries permitted until 2026-04-17 09:12:12.850805502 +0000 UTC m=+66.175623457 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs") pod "network-metrics-daemon-qpnfd" (UID: "5029b845-d556-4306-b1bb-4c6373b7e4be") : secret "metrics-daemon-secret" not found
Apr 17 09:11:40.850882 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:40.850830 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert podName:f7e6e539-32e6-4df0-9447-66f765e64434 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:41.850822293 +0000 UTC m=+35.175640261 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert") pod "ingress-canary-4m6ml" (UID: "f7e6e539-32e6-4df0-9447-66f765e64434") : secret "canary-serving-cert" not found
Apr 17 09:11:40.951543 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.951505 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzqk\" (UniqueName: \"kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk\") pod \"network-check-target-qqb8r\" (UID: \"0b55fe37-b595-4cdf-a226-39f50d91d206\") " pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:40.954403 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:40.954380 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtzqk\" (UniqueName: \"kubernetes.io/projected/0b55fe37-b595-4cdf-a226-39f50d91d206-kube-api-access-qtzqk\") pod \"network-check-target-qqb8r\" (UID: \"0b55fe37-b595-4cdf-a226-39f50d91d206\") " pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:41.166364 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:41.166284 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:41.697126 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:41.696955 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qqb8r"]
Apr 17 09:11:41.701042 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:41.701011 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b55fe37_b595_4cdf_a226_39f50d91d206.slice/crio-e9f9332d30b0fde7767a356f1ac9c900cbd7dca2a624cfe8c7e916c78ceab08e WatchSource:0}: Error finding container e9f9332d30b0fde7767a356f1ac9c900cbd7dca2a624cfe8c7e916c78ceab08e: Status 404 returned error can't find the container with id e9f9332d30b0fde7767a356f1ac9c900cbd7dca2a624cfe8c7e916c78ceab08e
Apr 17 09:11:41.859361 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:41.859339 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:41.859454 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:41.859369 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml"
Apr 17 09:11:41.859519 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:41.859459 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:41.859519 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:41.859479 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:41.859519 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:41.859500 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert podName:f7e6e539-32e6-4df0-9447-66f765e64434 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:43.859488058 +0000 UTC m=+37.184306012 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert") pod "ingress-canary-4m6ml" (UID: "f7e6e539-32e6-4df0-9447-66f765e64434") : secret "canary-serving-cert" not found
Apr 17 09:11:41.859683 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:41.859534 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls podName:03aa1efb-f86a-42c4-b326-2a97e8287120 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:43.85951646 +0000 UTC m=+37.184334430 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls") pod "dns-default-x7pf8" (UID: "03aa1efb-f86a-42c4-b326-2a97e8287120") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:42.381565 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:42.381473 2567 generic.go:358] "Generic (PLEG): container finished" podID="a0d17817-9cb4-4adc-9cb1-ace0055c7639" containerID="b1e524aa43044ca7def517d33d96c6a97108964e84eafd54fee36d2a45bb4136" exitCode=0
Apr 17 09:11:42.382050 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:42.381567 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xswsv" event={"ID":"a0d17817-9cb4-4adc-9cb1-ace0055c7639","Type":"ContainerDied","Data":"b1e524aa43044ca7def517d33d96c6a97108964e84eafd54fee36d2a45bb4136"}
Apr 17 09:11:42.383265 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:42.383241 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qqb8r" event={"ID":"0b55fe37-b595-4cdf-a226-39f50d91d206","Type":"ContainerStarted","Data":"e9f9332d30b0fde7767a356f1ac9c900cbd7dca2a624cfe8c7e916c78ceab08e"}
Apr 17 09:11:43.388183 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:43.388126 2567 generic.go:358] "Generic (PLEG): container finished" podID="a0d17817-9cb4-4adc-9cb1-ace0055c7639" containerID="86de7e9b8659a73fd51b8990eb3d68722dbabef90d36f90e654410fb07346e68" exitCode=0
Apr 17 09:11:43.388570 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:43.388178 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xswsv" event={"ID":"a0d17817-9cb4-4adc-9cb1-ace0055c7639","Type":"ContainerDied","Data":"86de7e9b8659a73fd51b8990eb3d68722dbabef90d36f90e654410fb07346e68"}
Apr 17 09:11:43.876571 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:43.876361 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:11:43.876728 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:43.876596 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml"
Apr 17 09:11:43.876728 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:43.876477 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:11:43.876728 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:43.876701 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls podName:03aa1efb-f86a-42c4-b326-2a97e8287120 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:47.876682581 +0000 UTC m=+41.201500537 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls") pod "dns-default-x7pf8" (UID: "03aa1efb-f86a-42c4-b326-2a97e8287120") : secret "dns-default-metrics-tls" not found
Apr 17 09:11:43.876879 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:43.876734 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:11:43.876879 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:43.876776 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert podName:f7e6e539-32e6-4df0-9447-66f765e64434 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:47.876765004 +0000 UTC m=+41.201582957 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert") pod "ingress-canary-4m6ml" (UID: "f7e6e539-32e6-4df0-9447-66f765e64434") : secret "canary-serving-cert" not found
Apr 17 09:11:44.392709 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:44.392678 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xswsv" event={"ID":"a0d17817-9cb4-4adc-9cb1-ace0055c7639","Type":"ContainerStarted","Data":"d4d62b941358285fa983562165903ac98f76d6f81d3e8db20aae2b67199a9cd9"}
Apr 17 09:11:44.418182 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:44.418115 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xswsv" podStartSLOduration=5.499327942 podStartE2EDuration="37.418101412s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:11:09.932387734 +0000 UTC m=+3.257205688" lastFinishedPulling="2026-04-17 09:11:41.851161201 +0000 UTC m=+35.175979158" observedRunningTime="2026-04-17 09:11:44.4168067 +0000 UTC m=+37.741624676" watchObservedRunningTime="2026-04-17 09:11:44.418101412 +0000 UTC m=+37.742919387"
Apr 17 09:11:45.395681 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:45.395648 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qqb8r" event={"ID":"0b55fe37-b595-4cdf-a226-39f50d91d206","Type":"ContainerStarted","Data":"b16f1de7d4ea3ba74261c1bcdec1bee69f563c0ca6b4535d19a3d05d2126acc4"}
Apr 17 09:11:45.396120 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:45.395996 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qqb8r"
Apr 17 09:11:45.411118 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:45.411075 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qqb8r" podStartSLOduration=35.335823605 podStartE2EDuration="38.411061603s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:11:41.703148112 +0000 UTC m=+35.027966079" lastFinishedPulling="2026-04-17 09:11:44.778386115 +0000 UTC m=+38.103204077" observedRunningTime="2026-04-17 09:11:45.410916839 +0000 UTC m=+38.735734851" watchObservedRunningTime="2026-04-17 09:11:45.411061603 +0000 UTC m=+38.735879578"
Apr 17 09:11:47.704524 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:47.704483 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:47.708405 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:47.708379 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cad943ff-ebd7-4dca-aac3-600408e2153a-original-pull-secret\") pod \"global-pull-secret-syncer-vqkm2\" (UID: \"cad943ff-ebd7-4dca-aac3-600408e2153a\") " pod="kube-system/global-pull-secret-syncer-vqkm2"
Apr 17 09:11:47.759302 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:47.759273 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqkm2" Apr 17 09:11:47.877844 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:47.877813 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vqkm2"] Apr 17 09:11:47.887510 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:11:47.887476 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad943ff_ebd7_4dca_aac3_600408e2153a.slice/crio-aead9c40cd8e12c35dec97888c8db35bcda1906a44c3303dfc3ca97e2136d79b WatchSource:0}: Error finding container aead9c40cd8e12c35dec97888c8db35bcda1906a44c3303dfc3ca97e2136d79b: Status 404 returned error can't find the container with id aead9c40cd8e12c35dec97888c8db35bcda1906a44c3303dfc3ca97e2136d79b Apr 17 09:11:47.906362 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:47.906337 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8" Apr 17 09:11:47.906449 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:47.906368 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml" Apr 17 09:11:47.906501 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:47.906453 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:11:47.906501 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:47.906477 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found 
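The `durationBeforeRetry` values in the repeated `MountVolume.SetUp` failures above grow as 4s, 8s, 16s, 32s, and later 1m4s — the kubelet's volume manager backing off exponentially between mount attempts while the secrets remain missing. A minimal sketch of that doubling-with-a-cap retry schedule (the initial delay and cap here are illustrative assumptions, not values taken from kubelet source):

```python
from datetime import timedelta

def backoff_schedule(initial: timedelta, cap: timedelta, attempts: int) -> list[timedelta]:
    """Doubling backoff with an upper cap: the pattern visible in the
    durationBeforeRetry values logged above (4s, 8s, 16s, 32s, 1m4s).
    `initial` and `cap` are assumed parameters for illustration."""
    delays = []
    delay = initial
    for _ in range(attempts):
        delays.append(delay)
        # double the wait each failed attempt, but never exceed the cap
        delay = min(delay * 2, cap)
    return delays

# Reproduces the delays observed in the log, in seconds:
schedule = backoff_schedule(timedelta(seconds=4), timedelta(seconds=128), 5)
print([int(d.total_seconds()) for d in schedule])  # [4, 8, 16, 32, 64]
```

Under this policy the next retry after 1m4s would wait roughly twice as long again (subject to the cap), which is consistent with the later 09:12:43 and 09:13:47 retry deadlines in the entries that follow.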
Apr 17 09:11:47.906501 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:47.906495 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert podName:f7e6e539-32e6-4df0-9447-66f765e64434 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:55.906481718 +0000 UTC m=+49.231299672 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert") pod "ingress-canary-4m6ml" (UID: "f7e6e539-32e6-4df0-9447-66f765e64434") : secret "canary-serving-cert" not found Apr 17 09:11:47.906605 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:47.906520 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls podName:03aa1efb-f86a-42c4-b326-2a97e8287120 nodeName:}" failed. No retries permitted until 2026-04-17 09:11:55.906507554 +0000 UTC m=+49.231325508 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls") pod "dns-default-x7pf8" (UID: "03aa1efb-f86a-42c4-b326-2a97e8287120") : secret "dns-default-metrics-tls" not found Apr 17 09:11:48.401354 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:48.401310 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vqkm2" event={"ID":"cad943ff-ebd7-4dca-aac3-600408e2153a","Type":"ContainerStarted","Data":"aead9c40cd8e12c35dec97888c8db35bcda1906a44c3303dfc3ca97e2136d79b"} Apr 17 09:11:52.410185 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:52.410079 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vqkm2" event={"ID":"cad943ff-ebd7-4dca-aac3-600408e2153a","Type":"ContainerStarted","Data":"ee73391253c0c942b64d2699dc9103b39d6b8f29737088a80c05c65f8fa61dfb"} Apr 17 09:11:52.425416 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:52.425364 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vqkm2" podStartSLOduration=33.265332728 podStartE2EDuration="37.425347122s" podCreationTimestamp="2026-04-17 09:11:15 +0000 UTC" firstStartedPulling="2026-04-17 09:11:47.889171343 +0000 UTC m=+41.213989297" lastFinishedPulling="2026-04-17 09:11:52.049185733 +0000 UTC m=+45.374003691" observedRunningTime="2026-04-17 09:11:52.42488494 +0000 UTC m=+45.749702918" watchObservedRunningTime="2026-04-17 09:11:52.425347122 +0000 UTC m=+45.750165077" Apr 17 09:11:55.964569 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:11:55.964530 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8" Apr 17 09:11:55.964569 ip-10-0-128-212 
kubenswrapper[2567]: I0417 09:11:55.964571 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml" Apr 17 09:11:55.964963 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:55.964675 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 09:11:55.964963 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:55.964741 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls podName:03aa1efb-f86a-42c4-b326-2a97e8287120 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:11.964726954 +0000 UTC m=+65.289544907 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls") pod "dns-default-x7pf8" (UID: "03aa1efb-f86a-42c4-b326-2a97e8287120") : secret "dns-default-metrics-tls" not found Apr 17 09:11:55.964963 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:55.964678 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:11:55.964963 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:11:55.964806 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert podName:f7e6e539-32e6-4df0-9447-66f765e64434 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:11.96479403 +0000 UTC m=+65.289611983 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert") pod "ingress-canary-4m6ml" (UID: "f7e6e539-32e6-4df0-9447-66f765e64434") : secret "canary-serving-cert" not found Apr 17 09:12:06.872632 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:12:06.872603 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmpm" Apr 17 09:12:11.971802 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:12:11.971763 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8" Apr 17 09:12:11.971802 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:12:11.971814 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml" Apr 17 09:12:11.972314 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:12:11.971921 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 09:12:11.972314 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:12:11.972010 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls podName:03aa1efb-f86a-42c4-b326-2a97e8287120 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:43.971993496 +0000 UTC m=+97.296811450 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls") pod "dns-default-x7pf8" (UID: "03aa1efb-f86a-42c4-b326-2a97e8287120") : secret "dns-default-metrics-tls" not found Apr 17 09:12:11.972314 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:12:11.971922 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:12:11.972314 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:12:11.972089 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert podName:f7e6e539-32e6-4df0-9447-66f765e64434 nodeName:}" failed. No retries permitted until 2026-04-17 09:12:43.972071128 +0000 UTC m=+97.296889101 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert") pod "ingress-canary-4m6ml" (UID: "f7e6e539-32e6-4df0-9447-66f765e64434") : secret "canary-serving-cert" not found Apr 17 09:12:12.877049 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:12:12.877011 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd" Apr 17 09:12:12.877259 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:12:12.877200 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 09:12:12.877335 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:12:12.877278 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs podName:5029b845-d556-4306-b1bb-4c6373b7e4be nodeName:}" failed. 
No retries permitted until 2026-04-17 09:13:16.87725832 +0000 UTC m=+130.202076277 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs") pod "network-metrics-daemon-qpnfd" (UID: "5029b845-d556-4306-b1bb-4c6373b7e4be") : secret "metrics-daemon-secret" not found Apr 17 09:12:17.401778 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:12:17.401750 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qqb8r" Apr 17 09:12:43.980429 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:12:43.980399 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8" Apr 17 09:12:43.980775 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:12:43.980439 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml" Apr 17 09:12:43.980775 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:12:43.980530 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:12:43.980775 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:12:43.980539 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 09:12:43.980775 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:12:43.980594 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert 
podName:f7e6e539-32e6-4df0-9447-66f765e64434 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:47.980576865 +0000 UTC m=+161.305394822 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert") pod "ingress-canary-4m6ml" (UID: "f7e6e539-32e6-4df0-9447-66f765e64434") : secret "canary-serving-cert" not found Apr 17 09:12:43.980775 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:12:43.980607 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls podName:03aa1efb-f86a-42c4-b326-2a97e8287120 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:47.980601462 +0000 UTC m=+161.305419416 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls") pod "dns-default-x7pf8" (UID: "03aa1efb-f86a-42c4-b326-2a97e8287120") : secret "dns-default-metrics-tls" not found Apr 17 09:13:13.109065 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.109031 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-njz5p"] Apr 17 09:13:13.111796 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.111777 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-njz5p" Apr 17 09:13:13.114628 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.114610 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 09:13:13.114695 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.114612 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 09:13:13.115764 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.115750 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-6q9qg\"" Apr 17 09:13:13.121091 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.121071 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-njz5p"] Apr 17 09:13:13.223129 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.223106 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wgk7d"] Apr 17 09:13:13.225892 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.225874 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.236808 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.236783 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 09:13:13.236808 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.236801 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 09:13:13.236964 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.236802 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-gnjvz\"" Apr 17 09:13:13.237511 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.237494 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7f87cdbc66-wpc5d"] Apr 17 09:13:13.238351 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.238335 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 09:13:13.240070 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.240056 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.243077 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.243052 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 09:13:13.243177 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.243094 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-z5bmb\"" Apr 17 09:13:13.243230 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.243183 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 09:13:13.243568 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.243556 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 09:13:13.243729 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.243717 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 09:13:13.243776 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.243761 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 09:13:13.243836 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.243820 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 09:13:13.252916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.252897 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl"] Apr 17 09:13:13.255678 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.255661 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:13.257922 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.257898 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wgk7d"] Apr 17 09:13:13.259539 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.259522 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-dxfjf\"" Apr 17 09:13:13.259632 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.259545 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 09:13:13.261070 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.261049 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 09:13:13.262922 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.262904 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 09:13:13.269569 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.269551 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tq54\" (UniqueName: \"kubernetes.io/projected/9b0e3808-bbdf-40aa-aa58-f60d8dae2657-kube-api-access-7tq54\") pod \"volume-data-source-validator-7c6cbb6c87-njz5p\" (UID: \"9b0e3808-bbdf-40aa-aa58-f60d8dae2657\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-njz5p" Apr 17 09:13:13.269842 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.269827 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 09:13:13.271870 ip-10-0-128-212 
kubenswrapper[2567]: I0417 09:13:13.271849 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7f87cdbc66-wpc5d"] Apr 17 09:13:13.280276 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.280256 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 09:13:13.291612 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.291591 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl"] Apr 17 09:13:13.370553 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.370491 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.370553 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.370521 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx6mb\" (UniqueName: \"kubernetes.io/projected/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-kube-api-access-gx6mb\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.370713 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.370554 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02e41852-3045-415d-bbb7-fc08dc3cfe5f-trusted-ca\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.370713 ip-10-0-128-212 kubenswrapper[2567]: I0417 
09:13:13.370571 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.370713 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.370587 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5cbj\" (UniqueName: \"kubernetes.io/projected/02e41852-3045-415d-bbb7-fc08dc3cfe5f-kube-api-access-d5cbj\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.370713 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.370605 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:13.370713 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.370643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tq54\" (UniqueName: \"kubernetes.io/projected/9b0e3808-bbdf-40aa-aa58-f60d8dae2657-kube-api-access-7tq54\") pod \"volume-data-source-validator-7c6cbb6c87-njz5p\" (UID: \"9b0e3808-bbdf-40aa-aa58-f60d8dae2657\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-njz5p" Apr 17 09:13:13.370713 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.370670 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/02e41852-3045-415d-bbb7-fc08dc3cfe5f-config\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.370976 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.370710 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwk7l\" (UniqueName: \"kubernetes.io/projected/70002831-91b1-409a-932e-a0ca4d141e25-kube-api-access-qwk7l\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:13.370976 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.370793 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02e41852-3045-415d-bbb7-fc08dc3cfe5f-serving-cert\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.370976 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.370863 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-default-certificate\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.370976 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.370892 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-stats-auth\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: 
\"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.378517 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.378499 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tq54\" (UniqueName: \"kubernetes.io/projected/9b0e3808-bbdf-40aa-aa58-f60d8dae2657-kube-api-access-7tq54\") pod \"volume-data-source-validator-7c6cbb6c87-njz5p\" (UID: \"9b0e3808-bbdf-40aa-aa58-f60d8dae2657\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-njz5p" Apr 17 09:13:13.420559 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.420536 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-njz5p" Apr 17 09:13:13.471478 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.471451 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-default-certificate\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.471598 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.471488 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-stats-auth\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.471598 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.471536 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: 
\"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.471598 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.471563 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gx6mb\" (UniqueName: \"kubernetes.io/projected/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-kube-api-access-gx6mb\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.471757 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.471617 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02e41852-3045-415d-bbb7-fc08dc3cfe5f-trusted-ca\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.471757 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.471647 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.471757 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.471672 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5cbj\" (UniqueName: \"kubernetes.io/projected/02e41852-3045-415d-bbb7-fc08dc3cfe5f-kube-api-access-d5cbj\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.471757 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.471705 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:13.471757 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.471734 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e41852-3045-415d-bbb7-fc08dc3cfe5f-config\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.471977 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.471761 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwk7l\" (UniqueName: \"kubernetes.io/projected/70002831-91b1-409a-932e-a0ca4d141e25-kube-api-access-qwk7l\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:13.471977 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.471790 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02e41852-3045-415d-bbb7-fc08dc3cfe5f-serving-cert\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.472233 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:13.472206 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 09:13:13.472321 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:13.472290 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:13.972265228 +0000 UTC m=+127.297083199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : secret "router-metrics-certs-default" not found Apr 17 09:13:13.473158 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:13.472722 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 09:13:13.473158 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:13.472796 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls podName:70002831-91b1-409a-932e-a0ca4d141e25 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:13.972778102 +0000 UTC m=+127.297596061 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4wscl" (UID: "70002831-91b1-409a-932e-a0ca4d141e25") : secret "samples-operator-tls" not found Apr 17 09:13:13.473158 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:13.473112 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:13.973097565 +0000 UTC m=+127.297915543 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : configmap references non-existent config key: service-ca.crt Apr 17 09:13:13.473357 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.473162 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02e41852-3045-415d-bbb7-fc08dc3cfe5f-trusted-ca\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.473357 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.473272 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e41852-3045-415d-bbb7-fc08dc3cfe5f-config\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.475250 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.475226 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-stats-auth\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.476493 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.476467 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-default-certificate\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.478337 ip-10-0-128-212 
kubenswrapper[2567]: I0417 09:13:13.478318 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02e41852-3045-415d-bbb7-fc08dc3cfe5f-serving-cert\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.483253 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.483229 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5cbj\" (UniqueName: \"kubernetes.io/projected/02e41852-3045-415d-bbb7-fc08dc3cfe5f-kube-api-access-d5cbj\") pod \"console-operator-9d4b6777b-wgk7d\" (UID: \"02e41852-3045-415d-bbb7-fc08dc3cfe5f\") " pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.483952 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.483932 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx6mb\" (UniqueName: \"kubernetes.io/projected/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-kube-api-access-gx6mb\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.484870 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.484850 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwk7l\" (UniqueName: \"kubernetes.io/projected/70002831-91b1-409a-932e-a0ca4d141e25-kube-api-access-qwk7l\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:13.531466 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.531406 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-njz5p"] Apr 17 09:13:13.533944 ip-10-0-128-212 kubenswrapper[2567]: 
I0417 09:13:13.533923 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" Apr 17 09:13:13.535014 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:13.534992 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b0e3808_bbdf_40aa_aa58_f60d8dae2657.slice/crio-61479f1537b4846ae14222721f08a6f7a09099c3aa342a3fa8e31905423c525d WatchSource:0}: Error finding container 61479f1537b4846ae14222721f08a6f7a09099c3aa342a3fa8e31905423c525d: Status 404 returned error can't find the container with id 61479f1537b4846ae14222721f08a6f7a09099c3aa342a3fa8e31905423c525d Apr 17 09:13:13.561154 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.561107 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-njz5p" event={"ID":"9b0e3808-bbdf-40aa-aa58-f60d8dae2657","Type":"ContainerStarted","Data":"61479f1537b4846ae14222721f08a6f7a09099c3aa342a3fa8e31905423c525d"} Apr 17 09:13:13.646441 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.646375 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-wgk7d"] Apr 17 09:13:13.649351 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:13.649325 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02e41852_3045_415d_bbb7_fc08dc3cfe5f.slice/crio-f8a8246548338444437fa29b47d6c846869d232dffa802e328f1a4c109094a7a WatchSource:0}: Error finding container f8a8246548338444437fa29b47d6c846869d232dffa802e328f1a4c109094a7a: Status 404 returned error can't find the container with id f8a8246548338444437fa29b47d6c846869d232dffa802e328f1a4c109094a7a Apr 17 09:13:13.975972 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.975939 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.976171 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.975978 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:13.976171 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:13.976093 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 09:13:13.976171 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:13.976107 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:14.97608918 +0000 UTC m=+128.300907134 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : configmap references non-existent config key: service-ca.crt Apr 17 09:13:13.976171 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:13.976169 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls podName:70002831-91b1-409a-932e-a0ca4d141e25 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:13:14.976153119 +0000 UTC m=+128.300971087 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4wscl" (UID: "70002831-91b1-409a-932e-a0ca4d141e25") : secret "samples-operator-tls" not found Apr 17 09:13:13.976358 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:13.976196 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:13.976358 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:13.976290 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 09:13:13.976358 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:13.976320 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:14.976312022 +0000 UTC m=+128.301129977 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : secret "router-metrics-certs-default" not found Apr 17 09:13:14.564813 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:14.564776 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" event={"ID":"02e41852-3045-415d-bbb7-fc08dc3cfe5f","Type":"ContainerStarted","Data":"f8a8246548338444437fa29b47d6c846869d232dffa802e328f1a4c109094a7a"} Apr 17 09:13:14.985433 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:14.985391 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:14.985601 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:14.985451 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:14.985601 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:14.985543 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:14.985601 ip-10-0-128-212 
kubenswrapper[2567]: E0417 09:13:14.985556 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:16.985538418 +0000 UTC m=+130.310356396 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : configmap references non-existent config key: service-ca.crt Apr 17 09:13:14.985793 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:14.985611 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 09:13:14.985793 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:14.985625 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 09:13:14.985793 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:14.985679 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:16.985663403 +0000 UTC m=+130.310481361 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : secret "router-metrics-certs-default" not found Apr 17 09:13:14.985793 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:14.985696 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls podName:70002831-91b1-409a-932e-a0ca4d141e25 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:16.985687576 +0000 UTC m=+130.310505529 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4wscl" (UID: "70002831-91b1-409a-932e-a0ca4d141e25") : secret "samples-operator-tls" not found Apr 17 09:13:15.568413 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:15.568390 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/0.log" Apr 17 09:13:15.568813 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:15.568429 2567 generic.go:358] "Generic (PLEG): container finished" podID="02e41852-3045-415d-bbb7-fc08dc3cfe5f" containerID="1f51cd0c3656d6c37a1a1c2a268e3bfc8e52b1f25a43714ffdacdb32484ef71d" exitCode=255 Apr 17 09:13:15.568813 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:15.568496 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" event={"ID":"02e41852-3045-415d-bbb7-fc08dc3cfe5f","Type":"ContainerDied","Data":"1f51cd0c3656d6c37a1a1c2a268e3bfc8e52b1f25a43714ffdacdb32484ef71d"} Apr 17 09:13:15.568813 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:15.568757 2567 scope.go:117] "RemoveContainer" 
containerID="1f51cd0c3656d6c37a1a1c2a268e3bfc8e52b1f25a43714ffdacdb32484ef71d" Apr 17 09:13:15.569803 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:15.569782 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-njz5p" event={"ID":"9b0e3808-bbdf-40aa-aa58-f60d8dae2657","Type":"ContainerStarted","Data":"646b2cf572795e8ea6c2ab2073c7183c6dbb6a7b354c56a5b7b4229097fdb966"} Apr 17 09:13:15.598967 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:15.598884 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-njz5p" podStartSLOduration=0.725717416 podStartE2EDuration="2.598869882s" podCreationTimestamp="2026-04-17 09:13:13 +0000 UTC" firstStartedPulling="2026-04-17 09:13:13.537493104 +0000 UTC m=+126.862311057" lastFinishedPulling="2026-04-17 09:13:15.410645568 +0000 UTC m=+128.735463523" observedRunningTime="2026-04-17 09:13:15.598657116 +0000 UTC m=+128.923475093" watchObservedRunningTime="2026-04-17 09:13:15.598869882 +0000 UTC m=+128.923687857" Apr 17 09:13:16.573169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:16.573124 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/1.log" Apr 17 09:13:16.573524 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:16.573492 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/0.log" Apr 17 09:13:16.573584 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:16.573525 2567 generic.go:358] "Generic (PLEG): container finished" podID="02e41852-3045-415d-bbb7-fc08dc3cfe5f" containerID="4a461a0833fea3bbc0a0579b9630932d2cf692f93f44e9272177534f7d15b0bd" exitCode=255 Apr 17 09:13:16.573640 ip-10-0-128-212 
kubenswrapper[2567]: I0417 09:13:16.573616 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" event={"ID":"02e41852-3045-415d-bbb7-fc08dc3cfe5f","Type":"ContainerDied","Data":"4a461a0833fea3bbc0a0579b9630932d2cf692f93f44e9272177534f7d15b0bd"} Apr 17 09:13:16.573691 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:16.573666 2567 scope.go:117] "RemoveContainer" containerID="1f51cd0c3656d6c37a1a1c2a268e3bfc8e52b1f25a43714ffdacdb32484ef71d" Apr 17 09:13:16.573945 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:16.573921 2567 scope.go:117] "RemoveContainer" containerID="4a461a0833fea3bbc0a0579b9630932d2cf692f93f44e9272177534f7d15b0bd" Apr 17 09:13:16.574105 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:16.574084 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wgk7d_openshift-console-operator(02e41852-3045-415d-bbb7-fc08dc3cfe5f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" podUID="02e41852-3045-415d-bbb7-fc08dc3cfe5f" Apr 17 09:13:16.897889 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:16.897795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd" Apr 17 09:13:16.898037 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:16.897916 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 09:13:16.898037 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:16.897974 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs podName:5029b845-d556-4306-b1bb-4c6373b7e4be nodeName:}" failed. No retries permitted until 2026-04-17 09:15:18.897958285 +0000 UTC m=+252.222776244 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs") pod "network-metrics-daemon-qpnfd" (UID: "5029b845-d556-4306-b1bb-4c6373b7e4be") : secret "metrics-daemon-secret" not found Apr 17 09:13:16.998573 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:16.998532 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:16.998573 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:16.998577 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:16.998815 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:16.998634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:16.998815 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:16.998706 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:20.998685766 +0000 UTC m=+134.323503746 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : configmap references non-existent config key: service-ca.crt Apr 17 09:13:16.998815 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:16.998728 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 09:13:16.998815 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:16.998773 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:20.998762173 +0000 UTC m=+134.323580128 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : secret "router-metrics-certs-default" not found Apr 17 09:13:16.998815 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:16.998728 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 09:13:16.998815 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:16.998803 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls podName:70002831-91b1-409a-932e-a0ca4d141e25 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:13:20.998794635 +0000 UTC m=+134.323612612 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4wscl" (UID: "70002831-91b1-409a-932e-a0ca4d141e25") : secret "samples-operator-tls" not found
Apr 17 09:13:17.026615 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.026586 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"]
Apr 17 09:13:17.029579 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.029563 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"
Apr 17 09:13:17.033116 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.033097 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 09:13:17.033217 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.033196 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 09:13:17.033217 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.033209 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 09:13:17.033418 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.033196 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 09:13:17.033958 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.033941 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-s8gpx\""
Apr 17 09:13:17.045091 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.045066 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"]
Apr 17 09:13:17.098983 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.098959 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6tld\" (UniqueName: \"kubernetes.io/projected/fb2625d8-aa03-43bc-a98b-f09bcdd8bf65-kube-api-access-f6tld\") pod \"kube-storage-version-migrator-operator-6769c5d45-bjjz6\" (UID: \"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"
Apr 17 09:13:17.099072 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.098987 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb2625d8-aa03-43bc-a98b-f09bcdd8bf65-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-bjjz6\" (UID: \"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"
Apr 17 09:13:17.099072 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.099009 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb2625d8-aa03-43bc-a98b-f09bcdd8bf65-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-bjjz6\" (UID: \"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"
Apr 17 09:13:17.199556 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.199530 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb2625d8-aa03-43bc-a98b-f09bcdd8bf65-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-bjjz6\" (UID: \"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"
Apr 17 09:13:17.199655 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.199561 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb2625d8-aa03-43bc-a98b-f09bcdd8bf65-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-bjjz6\" (UID: \"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"
Apr 17 09:13:17.199655 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.199641 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6tld\" (UniqueName: \"kubernetes.io/projected/fb2625d8-aa03-43bc-a98b-f09bcdd8bf65-kube-api-access-f6tld\") pod \"kube-storage-version-migrator-operator-6769c5d45-bjjz6\" (UID: \"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"
Apr 17 09:13:17.200051 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.200034 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb2625d8-aa03-43bc-a98b-f09bcdd8bf65-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-bjjz6\" (UID: \"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"
Apr 17 09:13:17.201689 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.201668 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb2625d8-aa03-43bc-a98b-f09bcdd8bf65-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-bjjz6\" (UID: \"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"
Apr 17 09:13:17.208990 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.208969 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6tld\" (UniqueName: \"kubernetes.io/projected/fb2625d8-aa03-43bc-a98b-f09bcdd8bf65-kube-api-access-f6tld\") pod \"kube-storage-version-migrator-operator-6769c5d45-bjjz6\" (UID: \"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"
Apr 17 09:13:17.337636 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.337611 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"
Apr 17 09:13:17.449314 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.449288 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6"]
Apr 17 09:13:17.452147 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:17.452094 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2625d8_aa03_43bc_a98b_f09bcdd8bf65.slice/crio-c9d94aab81662f8fc77fafed31f06a1d737a3a907096acdc994f4c9940a34190 WatchSource:0}: Error finding container c9d94aab81662f8fc77fafed31f06a1d737a3a907096acdc994f4c9940a34190: Status 404 returned error can't find the container with id c9d94aab81662f8fc77fafed31f06a1d737a3a907096acdc994f4c9940a34190
Apr 17 09:13:17.576983 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.576959 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/1.log"
Apr 17 09:13:17.577455 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.577307 2567 scope.go:117] "RemoveContainer" containerID="4a461a0833fea3bbc0a0579b9630932d2cf692f93f44e9272177534f7d15b0bd"
Apr 17 09:13:17.577512 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:17.577472 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wgk7d_openshift-console-operator(02e41852-3045-415d-bbb7-fc08dc3cfe5f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" podUID="02e41852-3045-415d-bbb7-fc08dc3cfe5f"
Apr 17 09:13:17.578060 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:17.578029 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6" event={"ID":"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65","Type":"ContainerStarted","Data":"c9d94aab81662f8fc77fafed31f06a1d737a3a907096acdc994f4c9940a34190"}
Apr 17 09:13:18.307645 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:18.307609 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jxg5q"]
Apr 17 09:13:18.310670 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:18.310647 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jxg5q"
Apr 17 09:13:18.313075 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:18.313047 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-lnhr5\""
Apr 17 09:13:18.320189 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:18.320170 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jxg5q"]
Apr 17 09:13:18.408773 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:18.408733 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tlr8\" (UniqueName: \"kubernetes.io/projected/014b9751-7912-4f44-a712-4b927def575d-kube-api-access-9tlr8\") pod \"network-check-source-8894fc9bd-jxg5q\" (UID: \"014b9751-7912-4f44-a712-4b927def575d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jxg5q"
Apr 17 09:13:18.509230 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:18.509200 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tlr8\" (UniqueName: \"kubernetes.io/projected/014b9751-7912-4f44-a712-4b927def575d-kube-api-access-9tlr8\") pod \"network-check-source-8894fc9bd-jxg5q\" (UID: \"014b9751-7912-4f44-a712-4b927def575d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jxg5q"
Apr 17 09:13:18.517838 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:18.517810 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tlr8\" (UniqueName: \"kubernetes.io/projected/014b9751-7912-4f44-a712-4b927def575d-kube-api-access-9tlr8\") pod \"network-check-source-8894fc9bd-jxg5q\" (UID: \"014b9751-7912-4f44-a712-4b927def575d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jxg5q"
Apr 17 09:13:18.622451 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:18.622384 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jxg5q"
Apr 17 09:13:18.731562 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:18.731536 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jxg5q"]
Apr 17 09:13:18.734579 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:18.734549 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod014b9751_7912_4f44_a712_4b927def575d.slice/crio-c20814fa628bc13cef7187a3d039c008b04e2ae202011336d70b4a63865c5571 WatchSource:0}: Error finding container c20814fa628bc13cef7187a3d039c008b04e2ae202011336d70b4a63865c5571: Status 404 returned error can't find the container with id c20814fa628bc13cef7187a3d039c008b04e2ae202011336d70b4a63865c5571
Apr 17 09:13:19.083831 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:19.083802 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8xtb4_4f93c6af-eabd-453e-9966-3199a8d4a534/dns-node-resolver/0.log"
Apr 17 09:13:19.587413 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:19.587376 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jxg5q" event={"ID":"014b9751-7912-4f44-a712-4b927def575d","Type":"ContainerStarted","Data":"d73cce86d17305eff9f10bfdea7807a829af9dbfdca3df3c95d54f7955a563a7"}
Apr 17 09:13:19.587413 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:19.587414 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jxg5q" event={"ID":"014b9751-7912-4f44-a712-4b927def575d","Type":"ContainerStarted","Data":"c20814fa628bc13cef7187a3d039c008b04e2ae202011336d70b4a63865c5571"}
Apr 17 09:13:19.588721 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:19.588695 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6" event={"ID":"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65","Type":"ContainerStarted","Data":"f0e27f83f6c3234747b9cac3bbfd420b6a84b9a7fbc907df3b0bdb4359279e9e"}
Apr 17 09:13:19.620984 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:19.620947 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6" podStartSLOduration=0.664366192 podStartE2EDuration="2.620936966s" podCreationTimestamp="2026-04-17 09:13:17 +0000 UTC" firstStartedPulling="2026-04-17 09:13:17.453973563 +0000 UTC m=+130.778791517" lastFinishedPulling="2026-04-17 09:13:19.410544332 +0000 UTC m=+132.735362291" observedRunningTime="2026-04-17 09:13:19.620106843 +0000 UTC m=+132.944924824" watchObservedRunningTime="2026-04-17 09:13:19.620936966 +0000 UTC m=+132.945754942"
Apr 17 09:13:19.621098 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:19.621014 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jxg5q" podStartSLOduration=1.621009676 podStartE2EDuration="1.621009676s" podCreationTimestamp="2026-04-17 09:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:13:19.60280882 +0000 UTC m=+132.927626796" watchObservedRunningTime="2026-04-17 09:13:19.621009676 +0000 UTC m=+132.945827660"
Apr 17 09:13:20.083956 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:20.083930 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xx56q_8fda692b-3c65-4165-b88c-ab992a58a369/node-ca/0.log"
Apr 17 09:13:21.029476 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:21.029439 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d"
Apr 17 09:13:21.029631 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:21.029518 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d"
Apr 17 09:13:21.029631 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:21.029539 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl"
Apr 17 09:13:21.029631 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:21.029581 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 09:13:21.029772 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:21.029633 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 09:13:21.029772 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:21.029646 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:29.029630574 +0000 UTC m=+142.354448528 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : secret "router-metrics-certs-default" not found
Apr 17 09:13:21.029772 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:21.029671 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls podName:70002831-91b1-409a-932e-a0ca4d141e25 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:29.029660396 +0000 UTC m=+142.354478350 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4wscl" (UID: "70002831-91b1-409a-932e-a0ca4d141e25") : secret "samples-operator-tls" not found
Apr 17 09:13:21.029772 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:21.029684 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:29.029677789 +0000 UTC m=+142.354495742 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : configmap references non-existent config key: service-ca.crt
Apr 17 09:13:23.534563 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:23.534530 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d"
Apr 17 09:13:23.534563 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:23.534572 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d"
Apr 17 09:13:23.535061 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:23.534884 2567 scope.go:117] "RemoveContainer" containerID="4a461a0833fea3bbc0a0579b9630932d2cf692f93f44e9272177534f7d15b0bd"
Apr 17 09:13:23.535061 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:23.535031 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-wgk7d_openshift-console-operator(02e41852-3045-415d-bbb7-fc08dc3cfe5f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" podUID="02e41852-3045-415d-bbb7-fc08dc3cfe5f"
Apr 17 09:13:29.090946 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:29.090908 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d"
Apr 17 09:13:29.091423 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:29.090964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d"
Apr 17 09:13:29.091423 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:29.090987 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl"
Apr 17 09:13:29.091423 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:29.091085 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 09:13:29.091423 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:29.091150 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:45.091118879 +0000 UTC m=+158.415936837 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : configmap references non-existent config key: service-ca.crt
Apr 17 09:13:29.091423 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:29.091176 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs podName:1530bfd9-b6fd-487c-a92b-e509c50b4f9c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:45.091168026 +0000 UTC m=+158.415985981 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs") pod "router-default-7f87cdbc66-wpc5d" (UID: "1530bfd9-b6fd-487c-a92b-e509c50b4f9c") : secret "router-metrics-certs-default" not found
Apr 17 09:13:29.091423 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:29.091093 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 09:13:29.091423 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:29.091205 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls podName:70002831-91b1-409a-932e-a0ca4d141e25 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:45.091200538 +0000 UTC m=+158.416018492 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4wscl" (UID: "70002831-91b1-409a-932e-a0ca4d141e25") : secret "samples-operator-tls" not found
Apr 17 09:13:37.249127 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:37.249095 2567 scope.go:117] "RemoveContainer" containerID="4a461a0833fea3bbc0a0579b9630932d2cf692f93f44e9272177534f7d15b0bd"
Apr 17 09:13:37.633967 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:37.633898 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/2.log"
Apr 17 09:13:37.634282 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:37.634266 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/1.log"
Apr 17 09:13:37.634333 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:37.634299 2567 generic.go:358] "Generic (PLEG): container finished" podID="02e41852-3045-415d-bbb7-fc08dc3cfe5f" containerID="ee2d8bc12d1cf8940dfd348ecbc5514aa023711aa839dfe4000912b2d071ae27" exitCode=255
Apr 17 09:13:37.634374 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:37.634359 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" event={"ID":"02e41852-3045-415d-bbb7-fc08dc3cfe5f","Type":"ContainerDied","Data":"ee2d8bc12d1cf8940dfd348ecbc5514aa023711aa839dfe4000912b2d071ae27"}
Apr 17 09:13:37.634409 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:37.634393 2567 scope.go:117] "RemoveContainer" containerID="4a461a0833fea3bbc0a0579b9630932d2cf692f93f44e9272177534f7d15b0bd"
Apr 17 09:13:37.634707 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:37.634687 2567 scope.go:117] "RemoveContainer" containerID="ee2d8bc12d1cf8940dfd348ecbc5514aa023711aa839dfe4000912b2d071ae27"
Apr 17 09:13:37.634898 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:37.634879 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-wgk7d_openshift-console-operator(02e41852-3045-415d-bbb7-fc08dc3cfe5f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" podUID="02e41852-3045-415d-bbb7-fc08dc3cfe5f"
Apr 17 09:13:38.637560 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:38.637536 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/2.log"
Apr 17 09:13:43.105279 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:43.105222 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-x7pf8" podUID="03aa1efb-f86a-42c4-b326-2a97e8287120"
Apr 17 09:13:43.123189 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:43.123154 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4m6ml" podUID="f7e6e539-32e6-4df0-9447-66f765e64434"
Apr 17 09:13:43.271788 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:43.271755 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qpnfd" podUID="5029b845-d556-4306-b1bb-4c6373b7e4be"
Apr 17 09:13:43.534816 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:43.534785 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d"
Apr 17 09:13:43.534816 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:43.534823 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d"
Apr 17 09:13:43.535166 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:43.535153 2567 scope.go:117] "RemoveContainer" containerID="ee2d8bc12d1cf8940dfd348ecbc5514aa023711aa839dfe4000912b2d071ae27"
Apr 17 09:13:43.535343 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:43.535327 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-wgk7d_openshift-console-operator(02e41852-3045-415d-bbb7-fc08dc3cfe5f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" podUID="02e41852-3045-415d-bbb7-fc08dc3cfe5f"
Apr 17 09:13:43.650755 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:43.650723 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4m6ml"
Apr 17 09:13:43.650916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:43.650723 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:13:44.632547 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.632507 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4m2cs"]
Apr 17 09:13:44.637184 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.637162 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4m2cs"
Apr 17 09:13:44.642500 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.642481 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 09:13:44.642617 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.642484 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 09:13:44.642617 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.642524 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 09:13:44.642617 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.642545 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 09:13:44.642617 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.642544 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-f5jjb\""
Apr 17 09:13:44.647703 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.647685 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-54d7d6bc55-jchg9"]
Apr 17 09:13:44.652602 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.652489 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4m2cs"]
Apr 17 09:13:44.652716 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.652635 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.655533 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.655510 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 09:13:44.655632 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.655575 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v9xfs\""
Apr 17 09:13:44.655632 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.655576 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 09:13:44.655632 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.655626 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 09:13:44.661423 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.661404 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 09:13:44.670853 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.670832 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54d7d6bc55-jchg9"]
Apr 17 09:13:44.820640 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.820617 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/69e56ba4-66ad-49fe-9d81-f96eb043eac7-data-volume\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs"
Apr 17 09:13:44.820804 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.820661 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chndf\" (UniqueName: \"kubernetes.io/projected/69e56ba4-66ad-49fe-9d81-f96eb043eac7-kube-api-access-chndf\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs"
Apr 17 09:13:44.820804 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.820691 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86236ac0-ed67-423e-be76-0a7f6bcba48b-bound-sa-token\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.820804 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.820771 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/86236ac0-ed67-423e-be76-0a7f6bcba48b-image-registry-private-configuration\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.820976 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.820810 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86236ac0-ed67-423e-be76-0a7f6bcba48b-installation-pull-secrets\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.820976 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.820838 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcn8x\" (UniqueName: \"kubernetes.io/projected/86236ac0-ed67-423e-be76-0a7f6bcba48b-kube-api-access-fcn8x\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.820976 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.820881 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86236ac0-ed67-423e-be76-0a7f6bcba48b-registry-tls\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.820976 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.820911 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/69e56ba4-66ad-49fe-9d81-f96eb043eac7-crio-socket\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs"
Apr 17 09:13:44.820976 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.820955 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/69e56ba4-66ad-49fe-9d81-f96eb043eac7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs"
Apr 17 09:13:44.821233 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.820988 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86236ac0-ed67-423e-be76-0a7f6bcba48b-trusted-ca\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.821233 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.821019 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86236ac0-ed67-423e-be76-0a7f6bcba48b-ca-trust-extracted\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.821233 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.821043 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86236ac0-ed67-423e-be76-0a7f6bcba48b-registry-certificates\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.821233 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.821087 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/69e56ba4-66ad-49fe-9d81-f96eb043eac7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs"
Apr 17 09:13:44.921688 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.921617 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86236ac0-ed67-423e-be76-0a7f6bcba48b-registry-tls\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.921688 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.921652 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/69e56ba4-66ad-49fe-9d81-f96eb043eac7-crio-socket\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs"
Apr 17 09:13:44.921839 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.921732 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/69e56ba4-66ad-49fe-9d81-f96eb043eac7-crio-socket\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs"
Apr 17 09:13:44.921839 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.921771 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/69e56ba4-66ad-49fe-9d81-f96eb043eac7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs"
Apr 17 09:13:44.921839 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.921808 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86236ac0-ed67-423e-be76-0a7f6bcba48b-trusted-ca\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.921839 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.921827 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86236ac0-ed67-423e-be76-0a7f6bcba48b-ca-trust-extracted\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:13:44.922039 ip-10-0-128-212 kubenswrapper[2567]: I0417
09:13:44.921843 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86236ac0-ed67-423e-be76-0a7f6bcba48b-registry-certificates\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.922039 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.921969 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/69e56ba4-66ad-49fe-9d81-f96eb043eac7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs" Apr 17 09:13:44.922160 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.922064 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/69e56ba4-66ad-49fe-9d81-f96eb043eac7-data-volume\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs" Apr 17 09:13:44.922160 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.922114 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chndf\" (UniqueName: \"kubernetes.io/projected/69e56ba4-66ad-49fe-9d81-f96eb043eac7-kube-api-access-chndf\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs" Apr 17 09:13:44.922268 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.922174 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86236ac0-ed67-423e-be76-0a7f6bcba48b-bound-sa-token\") pod 
\"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.922268 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.922206 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86236ac0-ed67-423e-be76-0a7f6bcba48b-ca-trust-extracted\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.922268 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.922214 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/86236ac0-ed67-423e-be76-0a7f6bcba48b-image-registry-private-configuration\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.922421 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.922380 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/69e56ba4-66ad-49fe-9d81-f96eb043eac7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs" Apr 17 09:13:44.922421 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.922396 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86236ac0-ed67-423e-be76-0a7f6bcba48b-installation-pull-secrets\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.922525 ip-10-0-128-212 kubenswrapper[2567]: I0417 
09:13:44.922428 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcn8x\" (UniqueName: \"kubernetes.io/projected/86236ac0-ed67-423e-be76-0a7f6bcba48b-kube-api-access-fcn8x\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.922795 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.922774 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86236ac0-ed67-423e-be76-0a7f6bcba48b-registry-certificates\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.922899 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.922827 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86236ac0-ed67-423e-be76-0a7f6bcba48b-trusted-ca\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.922899 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.922791 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/69e56ba4-66ad-49fe-9d81-f96eb043eac7-data-volume\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs" Apr 17 09:13:44.924533 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.924508 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86236ac0-ed67-423e-be76-0a7f6bcba48b-registry-tls\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " 
pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.924533 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.924529 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/86236ac0-ed67-423e-be76-0a7f6bcba48b-image-registry-private-configuration\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.924692 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.924570 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/69e56ba4-66ad-49fe-9d81-f96eb043eac7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs" Apr 17 09:13:44.924989 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.924973 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86236ac0-ed67-423e-be76-0a7f6bcba48b-installation-pull-secrets\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.930912 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.930884 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcn8x\" (UniqueName: \"kubernetes.io/projected/86236ac0-ed67-423e-be76-0a7f6bcba48b-kube-api-access-fcn8x\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.931206 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.931182 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86236ac0-ed67-423e-be76-0a7f6bcba48b-bound-sa-token\") pod \"image-registry-54d7d6bc55-jchg9\" (UID: \"86236ac0-ed67-423e-be76-0a7f6bcba48b\") " pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:44.931457 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.931441 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chndf\" (UniqueName: \"kubernetes.io/projected/69e56ba4-66ad-49fe-9d81-f96eb043eac7-kube-api-access-chndf\") pod \"insights-runtime-extractor-4m2cs\" (UID: \"69e56ba4-66ad-49fe-9d81-f96eb043eac7\") " pod="openshift-insights/insights-runtime-extractor-4m2cs" Apr 17 09:13:44.945431 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.945412 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4m2cs" Apr 17 09:13:44.962104 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:44.962085 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:45.070249 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.070215 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4m2cs"] Apr 17 09:13:45.072788 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:45.072762 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e56ba4_66ad_49fe_9d81_f96eb043eac7.slice/crio-bffa3bd7cadfa3f1a19c3c268b0086869cbe3c7358f74b40546af423aedace26 WatchSource:0}: Error finding container bffa3bd7cadfa3f1a19c3c268b0086869cbe3c7358f74b40546af423aedace26: Status 404 returned error can't find the container with id bffa3bd7cadfa3f1a19c3c268b0086869cbe3c7358f74b40546af423aedace26 Apr 17 09:13:45.086323 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.086303 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54d7d6bc55-jchg9"] Apr 17 09:13:45.088985 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:45.088956 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86236ac0_ed67_423e_be76_0a7f6bcba48b.slice/crio-0878fabd10a0121d58169c706171e4f0c319fd00f758b678db6d927eee9a9f27 WatchSource:0}: Error finding container 0878fabd10a0121d58169c706171e4f0c319fd00f758b678db6d927eee9a9f27: Status 404 returned error can't find the container with id 0878fabd10a0121d58169c706171e4f0c319fd00f758b678db6d927eee9a9f27 Apr 17 09:13:45.124079 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.124059 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " 
pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:45.124194 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.124091 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:45.124194 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.124123 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:45.124721 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.124696 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-service-ca-bundle\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:45.126241 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.126223 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1530bfd9-b6fd-487c-a92b-e509c50b4f9c-metrics-certs\") pod \"router-default-7f87cdbc66-wpc5d\" (UID: \"1530bfd9-b6fd-487c-a92b-e509c50b4f9c\") " pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:45.126325 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.126284 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/70002831-91b1-409a-932e-a0ca4d141e25-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4wscl\" (UID: \"70002831-91b1-409a-932e-a0ca4d141e25\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:45.347652 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.347617 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:45.364439 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.364363 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" Apr 17 09:13:45.499761 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.499715 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7f87cdbc66-wpc5d"] Apr 17 09:13:45.502047 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:45.502022 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1530bfd9_b6fd_487c_a92b_e509c50b4f9c.slice/crio-0dc4c9aaa2a4fff276507e270d5847f4aa1eb1469d584698d272f749ee06959f WatchSource:0}: Error finding container 0dc4c9aaa2a4fff276507e270d5847f4aa1eb1469d584698d272f749ee06959f: Status 404 returned error can't find the container with id 0dc4c9aaa2a4fff276507e270d5847f4aa1eb1469d584698d272f749ee06959f Apr 17 09:13:45.511626 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.511601 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl"] Apr 17 09:13:45.658263 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.658176 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" 
event={"ID":"70002831-91b1-409a-932e-a0ca4d141e25","Type":"ContainerStarted","Data":"b58008f49be0293092296b2d5a1b3750d903ac969f09e56018a25c2827980421"} Apr 17 09:13:45.659789 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.659724 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" event={"ID":"1530bfd9-b6fd-487c-a92b-e509c50b4f9c","Type":"ContainerStarted","Data":"3471ab6bf8ef7f28f170cbb6cbbeaea5b2061c70b8df557605a3773827eacba2"} Apr 17 09:13:45.659789 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.659766 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" event={"ID":"1530bfd9-b6fd-487c-a92b-e509c50b4f9c","Type":"ContainerStarted","Data":"0dc4c9aaa2a4fff276507e270d5847f4aa1eb1469d584698d272f749ee06959f"} Apr 17 09:13:45.661321 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.661292 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4m2cs" event={"ID":"69e56ba4-66ad-49fe-9d81-f96eb043eac7","Type":"ContainerStarted","Data":"09225b3ff9c0334e3e70d0762046f3f50598157c777f2dcd0bc1608e10ef4166"} Apr 17 09:13:45.661458 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.661327 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4m2cs" event={"ID":"69e56ba4-66ad-49fe-9d81-f96eb043eac7","Type":"ContainerStarted","Data":"bffa3bd7cadfa3f1a19c3c268b0086869cbe3c7358f74b40546af423aedace26"} Apr 17 09:13:45.662724 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.662697 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" event={"ID":"86236ac0-ed67-423e-be76-0a7f6bcba48b","Type":"ContainerStarted","Data":"374d3332474badd12172cdd980289f1a018490945178a4861b189ac470b90f25"} Apr 17 09:13:45.662821 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.662730 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" event={"ID":"86236ac0-ed67-423e-be76-0a7f6bcba48b","Type":"ContainerStarted","Data":"0878fabd10a0121d58169c706171e4f0c319fd00f758b678db6d927eee9a9f27"} Apr 17 09:13:45.662821 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.662807 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" Apr 17 09:13:45.677000 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.676963 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" podStartSLOduration=32.676951812 podStartE2EDuration="32.676951812s" podCreationTimestamp="2026-04-17 09:13:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:13:45.676530657 +0000 UTC m=+159.001348633" watchObservedRunningTime="2026-04-17 09:13:45.676951812 +0000 UTC m=+159.001769787" Apr 17 09:13:45.694872 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:45.694826 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9" podStartSLOduration=1.694814943 podStartE2EDuration="1.694814943s" podCreationTimestamp="2026-04-17 09:13:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:13:45.693531169 +0000 UTC m=+159.018349178" watchObservedRunningTime="2026-04-17 09:13:45.694814943 +0000 UTC m=+159.019632919" Apr 17 09:13:46.348177 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:46.348124 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:46.350877 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:46.350849 2567 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:46.667262 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:46.667175 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4m2cs" event={"ID":"69e56ba4-66ad-49fe-9d81-f96eb043eac7","Type":"ContainerStarted","Data":"6bf9a1446d069736ae334b3a62b65e8a48a4b6eaafc965f05b25220e3ab03c58"} Apr 17 09:13:46.667930 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:46.667901 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:46.669061 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:46.669039 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7f87cdbc66-wpc5d" Apr 17 09:13:47.882618 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:47.882584 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t"] Apr 17 09:13:47.888349 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:47.888323 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t" Apr 17 09:13:47.891107 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:47.891087 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-6n499\"" Apr 17 09:13:47.891389 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:47.891372 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 09:13:47.894646 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:47.894620 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t"] Apr 17 09:13:47.945619 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:47.945583 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5dd39259-8f99-4b9d-b099-673b69e16722-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-b9j8t\" (UID: \"5dd39259-8f99-4b9d-b099-673b69e16722\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t" Apr 17 09:13:48.046671 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.046642 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml" Apr 17 09:13:48.046868 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.046714 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5dd39259-8f99-4b9d-b099-673b69e16722-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-57cf98b594-b9j8t\" (UID: \"5dd39259-8f99-4b9d-b099-673b69e16722\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t" Apr 17 09:13:48.046868 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:48.046793 2567 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 17 09:13:48.046868 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:48.046854 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dd39259-8f99-4b9d-b099-673b69e16722-tls-certificates podName:5dd39259-8f99-4b9d-b099-673b69e16722 nodeName:}" failed. No retries permitted until 2026-04-17 09:13:48.54683521 +0000 UTC m=+161.871653185 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/5dd39259-8f99-4b9d-b099-673b69e16722-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-b9j8t" (UID: "5dd39259-8f99-4b9d-b099-673b69e16722") : secret "prometheus-operator-admission-webhook-tls" not found Apr 17 09:13:48.047035 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.046889 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8" Apr 17 09:13:48.049282 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.049253 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03aa1efb-f86a-42c4-b326-2a97e8287120-metrics-tls\") pod \"dns-default-x7pf8\" (UID: \"03aa1efb-f86a-42c4-b326-2a97e8287120\") " pod="openshift-dns/dns-default-x7pf8" Apr 17 09:13:48.049420 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.049394 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7e6e539-32e6-4df0-9447-66f765e64434-cert\") pod \"ingress-canary-4m6ml\" (UID: \"f7e6e539-32e6-4df0-9447-66f765e64434\") " pod="openshift-ingress-canary/ingress-canary-4m6ml" Apr 17 09:13:48.154585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.154518 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hdtpc\"" Apr 17 09:13:48.154585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.154564 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hv2j9\"" Apr 17 09:13:48.162789 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.162770 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x7pf8" Apr 17 09:13:48.162898 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.162775 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4m6ml"
Apr 17 09:13:48.282264 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.282233 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x7pf8"]
Apr 17 09:13:48.285547 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:48.285522 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03aa1efb_f86a_42c4_b326_2a97e8287120.slice/crio-7d81e3dc700d8328c9df9c7f76795a2a8e7fec5af63af5974e3e9d1fc68d2437 WatchSource:0}: Error finding container 7d81e3dc700d8328c9df9c7f76795a2a8e7fec5af63af5974e3e9d1fc68d2437: Status 404 returned error can't find the container with id 7d81e3dc700d8328c9df9c7f76795a2a8e7fec5af63af5974e3e9d1fc68d2437
Apr 17 09:13:48.293979 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.293956 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4m6ml"]
Apr 17 09:13:48.296659 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:48.296636 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7e6e539_32e6_4df0_9447_66f765e64434.slice/crio-0d2110e9fcc7cbab8b5e968d53d00159534eb72c43a5d4bbb25d679d294fcdc2 WatchSource:0}: Error finding container 0d2110e9fcc7cbab8b5e968d53d00159534eb72c43a5d4bbb25d679d294fcdc2: Status 404 returned error can't find the container with id 0d2110e9fcc7cbab8b5e968d53d00159534eb72c43a5d4bbb25d679d294fcdc2
Apr 17 09:13:48.550479 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.550441 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5dd39259-8f99-4b9d-b099-673b69e16722-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-b9j8t\" (UID: \"5dd39259-8f99-4b9d-b099-673b69e16722\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t"
Apr 17 09:13:48.552758 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.552740 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5dd39259-8f99-4b9d-b099-673b69e16722-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-b9j8t\" (UID: \"5dd39259-8f99-4b9d-b099-673b69e16722\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t"
Apr 17 09:13:48.674814 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.674691 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" event={"ID":"70002831-91b1-409a-932e-a0ca4d141e25","Type":"ContainerStarted","Data":"610629a6639f1985a4fb6060f9bd5fe5789a8e7b83f03eb92f4065bfcfcb25cc"}
Apr 17 09:13:48.674814 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.674735 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" event={"ID":"70002831-91b1-409a-932e-a0ca4d141e25","Type":"ContainerStarted","Data":"7371b029c50eb13f2e0bbee021d3a6e964c659f4a5adb40cb3ece6a50f9a1fef"}
Apr 17 09:13:48.677193 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.677160 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4m2cs" event={"ID":"69e56ba4-66ad-49fe-9d81-f96eb043eac7","Type":"ContainerStarted","Data":"40c7045fe46651e69b29f1235f3843f68671271a3b50aaa7a542820577114b50"}
Apr 17 09:13:48.678659 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.678606 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x7pf8" event={"ID":"03aa1efb-f86a-42c4-b326-2a97e8287120","Type":"ContainerStarted","Data":"7d81e3dc700d8328c9df9c7f76795a2a8e7fec5af63af5974e3e9d1fc68d2437"}
Apr 17 09:13:48.679954 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.679899 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4m6ml" event={"ID":"f7e6e539-32e6-4df0-9447-66f765e64434","Type":"ContainerStarted","Data":"0d2110e9fcc7cbab8b5e968d53d00159534eb72c43a5d4bbb25d679d294fcdc2"}
Apr 17 09:13:48.691349 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.691252 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4wscl" podStartSLOduration=33.440375046 podStartE2EDuration="35.691235357s" podCreationTimestamp="2026-04-17 09:13:13 +0000 UTC" firstStartedPulling="2026-04-17 09:13:45.550528305 +0000 UTC m=+158.875346263" lastFinishedPulling="2026-04-17 09:13:47.80138861 +0000 UTC m=+161.126206574" observedRunningTime="2026-04-17 09:13:48.690859631 +0000 UTC m=+162.015677611" watchObservedRunningTime="2026-04-17 09:13:48.691235357 +0000 UTC m=+162.016053336"
Apr 17 09:13:48.709410 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.709372 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4m2cs" podStartSLOduration=2.035191872 podStartE2EDuration="4.709348336s" podCreationTimestamp="2026-04-17 09:13:44 +0000 UTC" firstStartedPulling="2026-04-17 09:13:45.129167501 +0000 UTC m=+158.453985454" lastFinishedPulling="2026-04-17 09:13:47.803323957 +0000 UTC m=+161.128141918" observedRunningTime="2026-04-17 09:13:48.708516577 +0000 UTC m=+162.033334552" watchObservedRunningTime="2026-04-17 09:13:48.709348336 +0000 UTC m=+162.034166352"
Apr 17 09:13:48.815149 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.814886 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t"
Apr 17 09:13:48.948321 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:48.948267 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t"]
Apr 17 09:13:48.952313 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:48.952280 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd39259_8f99_4b9d_b099_673b69e16722.slice/crio-2295fe77bb158fe377a992f9f00c54ecb770f4bb835d0cf0f62d657bef523745 WatchSource:0}: Error finding container 2295fe77bb158fe377a992f9f00c54ecb770f4bb835d0cf0f62d657bef523745: Status 404 returned error can't find the container with id 2295fe77bb158fe377a992f9f00c54ecb770f4bb835d0cf0f62d657bef523745
Apr 17 09:13:49.683514 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:49.683476 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t" event={"ID":"5dd39259-8f99-4b9d-b099-673b69e16722","Type":"ContainerStarted","Data":"2295fe77bb158fe377a992f9f00c54ecb770f4bb835d0cf0f62d657bef523745"}
Apr 17 09:13:50.688628 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:50.688552 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x7pf8" event={"ID":"03aa1efb-f86a-42c4-b326-2a97e8287120","Type":"ContainerStarted","Data":"d8bf7a3778d47cc0b9ede87f53191f852b4b0d66d1131b02bde2073b14addd95"}
Apr 17 09:13:50.688628 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:50.688592 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x7pf8" event={"ID":"03aa1efb-f86a-42c4-b326-2a97e8287120","Type":"ContainerStarted","Data":"4bc918953ab8ed59ea36b5e59a0793c5e9596b288f0363a2e257c816181762b9"}
Apr 17 09:13:50.689131 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:50.688653 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-x7pf8"
Apr 17 09:13:50.689948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:50.689922 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4m6ml" event={"ID":"f7e6e539-32e6-4df0-9447-66f765e64434","Type":"ContainerStarted","Data":"4543723a124dd13f08adaf27ea6bdcd5136c0155b26f6449eff3a5fabe2cc479"}
Apr 17 09:13:50.707249 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:50.707211 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x7pf8" podStartSLOduration=128.933124196 podStartE2EDuration="2m10.707197584s" podCreationTimestamp="2026-04-17 09:11:40 +0000 UTC" firstStartedPulling="2026-04-17 09:13:48.287371465 +0000 UTC m=+161.612189423" lastFinishedPulling="2026-04-17 09:13:50.061444857 +0000 UTC m=+163.386262811" observedRunningTime="2026-04-17 09:13:50.705401195 +0000 UTC m=+164.030219169" watchObservedRunningTime="2026-04-17 09:13:50.707197584 +0000 UTC m=+164.032015559"
Apr 17 09:13:50.722984 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:50.722948 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4m6ml" podStartSLOduration=128.95623294 podStartE2EDuration="2m10.722935846s" podCreationTimestamp="2026-04-17 09:11:40 +0000 UTC" firstStartedPulling="2026-04-17 09:13:48.298309091 +0000 UTC m=+161.623127054" lastFinishedPulling="2026-04-17 09:13:50.065012002 +0000 UTC m=+163.389829960" observedRunningTime="2026-04-17 09:13:50.72205743 +0000 UTC m=+164.046875406" watchObservedRunningTime="2026-04-17 09:13:50.722935846 +0000 UTC m=+164.047753816"
Apr 17 09:13:51.693468 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.693417 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t" event={"ID":"5dd39259-8f99-4b9d-b099-673b69e16722","Type":"ContainerStarted","Data":"09a434294bb7f0fedca45a261c335968012badcfa5fe028e4d97564a02fd3f7d"}
Apr 17 09:13:51.693848 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.693689 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t"
Apr 17 09:13:51.698211 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.698190 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t"
Apr 17 09:13:51.708902 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.708864 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-b9j8t" podStartSLOduration=2.910208107 podStartE2EDuration="4.708853339s" podCreationTimestamp="2026-04-17 09:13:47 +0000 UTC" firstStartedPulling="2026-04-17 09:13:48.955085709 +0000 UTC m=+162.279903669" lastFinishedPulling="2026-04-17 09:13:50.753730933 +0000 UTC m=+164.078548901" observedRunningTime="2026-04-17 09:13:51.708032122 +0000 UTC m=+165.032850098" watchObservedRunningTime="2026-04-17 09:13:51.708853339 +0000 UTC m=+165.033671315"
Apr 17 09:13:51.934974 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.934940 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"]
Apr 17 09:13:51.938291 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.938276 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:51.941044 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.941016 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 09:13:51.941044 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.941030 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 09:13:51.941240 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.941112 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 09:13:51.942360 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.942343 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-56vzb\""
Apr 17 09:13:51.942429 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.942352 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 09:13:51.942429 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.942385 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 09:13:51.948637 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.948616 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"]
Apr 17 09:13:51.977765 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.977746 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:51.977849 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.977780 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:51.977849 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.977802 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:51.977928 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:51.977849 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jhb8\" (UniqueName: \"kubernetes.io/projected/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-kube-api-access-6jhb8\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:52.078563 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:52.078534 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:52.078658 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:52.078575 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:52.078658 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:52.078601 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jhb8\" (UniqueName: \"kubernetes.io/projected/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-kube-api-access-6jhb8\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:52.078765 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:52.078686 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:52.079367 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:52.079348 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:52.081054 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:52.081032 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:52.081054 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:52.081040 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:52.086529 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:52.086508 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jhb8\" (UniqueName: \"kubernetes.io/projected/b2cd0079-2e4b-4644-b18a-532ddc80ab8f-kube-api-access-6jhb8\") pod \"prometheus-operator-5676c8c784-rfsj8\" (UID: \"b2cd0079-2e4b-4644-b18a-532ddc80ab8f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:52.246606 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:52.246570 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"
Apr 17 09:13:52.362460 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:52.362408 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rfsj8"]
Apr 17 09:13:52.365710 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:52.365681 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2cd0079_2e4b_4644_b18a_532ddc80ab8f.slice/crio-2c00cf388173dd6caff93e3b496e4724458f0a5421339604d127d426584eca81 WatchSource:0}: Error finding container 2c00cf388173dd6caff93e3b496e4724458f0a5421339604d127d426584eca81: Status 404 returned error can't find the container with id 2c00cf388173dd6caff93e3b496e4724458f0a5421339604d127d426584eca81
Apr 17 09:13:52.697392 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:52.697360 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8" event={"ID":"b2cd0079-2e4b-4644-b18a-532ddc80ab8f","Type":"ContainerStarted","Data":"2c00cf388173dd6caff93e3b496e4724458f0a5421339604d127d426584eca81"}
Apr 17 09:13:54.248402 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:54.248368 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:13:54.705754 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:54.705721 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8" event={"ID":"b2cd0079-2e4b-4644-b18a-532ddc80ab8f","Type":"ContainerStarted","Data":"89411d1084f773e4733e5fe30923a7cd744e02a2611b233ba92ebc9b8206abbb"}
Apr 17 09:13:54.705896 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:54.705762 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8" event={"ID":"b2cd0079-2e4b-4644-b18a-532ddc80ab8f","Type":"ContainerStarted","Data":"8e5a38aa98ac5fb46fa47b6d6b35c0c2f57813df56906c6f2c4b8f66d80068e0"}
Apr 17 09:13:54.723829 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:54.723778 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfsj8" podStartSLOduration=2.440882648 podStartE2EDuration="3.723760494s" podCreationTimestamp="2026-04-17 09:13:51 +0000 UTC" firstStartedPulling="2026-04-17 09:13:52.367563363 +0000 UTC m=+165.692381321" lastFinishedPulling="2026-04-17 09:13:53.650441204 +0000 UTC m=+166.975259167" observedRunningTime="2026-04-17 09:13:54.723703244 +0000 UTC m=+168.048521221" watchObservedRunningTime="2026-04-17 09:13:54.723760494 +0000 UTC m=+168.048578471"
Apr 17 09:13:55.248766 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:55.248737 2567 scope.go:117] "RemoveContainer" containerID="ee2d8bc12d1cf8940dfd348ecbc5514aa023711aa839dfe4000912b2d071ae27"
Apr 17 09:13:55.249102 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:55.248965 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-wgk7d_openshift-console-operator(02e41852-3045-415d-bbb7-fc08dc3cfe5f)\"" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" podUID="02e41852-3045-415d-bbb7-fc08dc3cfe5f"
Apr 17 09:13:56.294275 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.294196 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rjhw8"]
Apr 17 09:13:56.297420 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.297401 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.300378 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.300359 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 09:13:56.301714 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.301692 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xkdtg\""
Apr 17 09:13:56.301813 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.301778 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 09:13:56.301947 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.301928 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 09:13:56.411391 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.411367 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fba41d7-58a2-421f-89c9-5704fb7a073c-sys\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.411500 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.411412 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fba41d7-58a2-421f-89c9-5704fb7a073c-metrics-client-ca\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.411500 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.411460 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw58x\" (UniqueName: \"kubernetes.io/projected/0fba41d7-58a2-421f-89c9-5704fb7a073c-kube-api-access-vw58x\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.411639 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.411508 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-textfile\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.411639 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.411526 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-tls\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.411639 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.411548 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0fba41d7-58a2-421f-89c9-5704fb7a073c-root\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.411639 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.411605 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-accelerators-collector-config\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.411793 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.411640 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.411793 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.411673 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-wtmp\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512091 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0fba41d7-58a2-421f-89c9-5704fb7a073c-root\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512230 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512105 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-accelerators-collector-config\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512230 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512157 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512230 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512183 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0fba41d7-58a2-421f-89c9-5704fb7a073c-root\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512230 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-wtmp\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512432 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512252 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fba41d7-58a2-421f-89c9-5704fb7a073c-sys\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512432 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512294 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fba41d7-58a2-421f-89c9-5704fb7a073c-metrics-client-ca\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512432 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512326 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw58x\" (UniqueName: \"kubernetes.io/projected/0fba41d7-58a2-421f-89c9-5704fb7a073c-kube-api-access-vw58x\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512432 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512339 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fba41d7-58a2-421f-89c9-5704fb7a073c-sys\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512432 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512354 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-textfile\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512432 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512408 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-wtmp\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512709 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512574 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-tls\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512709 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512653 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-textfile\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512709 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:56.512683 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 09:13:56.512825 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512721 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-accelerators-collector-config\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.512825 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:13:56.512748 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-tls podName:0fba41d7-58a2-421f-89c9-5704fb7a073c nodeName:}" failed. No retries permitted until 2026-04-17 09:13:57.012730366 +0000 UTC m=+170.337548340 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-tls") pod "node-exporter-rjhw8" (UID: "0fba41d7-58a2-421f-89c9-5704fb7a073c") : secret "node-exporter-tls" not found
Apr 17 09:13:56.512903 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.512873 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fba41d7-58a2-421f-89c9-5704fb7a073c-metrics-client-ca\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.514603 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.514582 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:56.520600 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:56.520581 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw58x\" (UniqueName: \"kubernetes.io/projected/0fba41d7-58a2-421f-89c9-5704fb7a073c-kube-api-access-vw58x\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:57.017453 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:57.017416 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-tls\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:57.019822 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:57.019798 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0fba41d7-58a2-421f-89c9-5704fb7a073c-node-exporter-tls\") pod \"node-exporter-rjhw8\" (UID: \"0fba41d7-58a2-421f-89c9-5704fb7a073c\") " pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:57.206188 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:57.206164 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rjhw8"
Apr 17 09:13:57.214052 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:57.214025 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fba41d7_58a2_421f_89c9_5704fb7a073c.slice/crio-ede384168834a5e22b11a33f317317eba82bdeee7c36c01a5b6f358988ddb294 WatchSource:0}: Error finding container ede384168834a5e22b11a33f317317eba82bdeee7c36c01a5b6f358988ddb294: Status 404 returned error can't find the container with id ede384168834a5e22b11a33f317317eba82bdeee7c36c01a5b6f358988ddb294
Apr 17 09:13:57.714168 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:57.714115 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rjhw8" event={"ID":"0fba41d7-58a2-421f-89c9-5704fb7a073c","Type":"ContainerStarted","Data":"ede384168834a5e22b11a33f317317eba82bdeee7c36c01a5b6f358988ddb294"}
Apr 17 09:13:58.263087 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.263056 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6886c85cbb-w654p"]
Apr 17 09:13:58.266740 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.266715 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p"
Apr 17 09:13:58.269572 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.269536 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 09:13:58.269572 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.269541 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 09:13:58.269762 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.269591 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fandmf6gssku3\""
Apr 17 09:13:58.269871 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.269850 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 09:13:58.269965 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.269907 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 09:13:58.270108 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.270093 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 09:13:58.270108 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.270103 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-4t9nz\""
Apr 17 09:13:58.276641 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.276620 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6886c85cbb-w654p"]
Apr 17 09:13:58.328184 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.328161 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-tls\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p"
Apr 17 09:13:58.328281 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.328207 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p"
Apr 17 09:13:58.328281 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.328240 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rssx\" (UniqueName: \"kubernetes.io/projected/c0ca7312-1b2f-4414-baf4-0e7960c87909-kube-api-access-7rssx\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p"
Apr 17 09:13:58.328281 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.328272 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p"
Apr 17 09:13:58.328386 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.328363 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0ca7312-1b2f-4414-baf4-0e7960c87909-metrics-client-ca\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.328429 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.328409 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-grpc-tls\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.328460 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.328450 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.328493 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.328481 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.429538 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.429465 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.429538 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.429505 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.429538 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.429529 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-tls\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.429764 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.429554 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.429764 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.429672 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rssx\" (UniqueName: \"kubernetes.io/projected/c0ca7312-1b2f-4414-baf4-0e7960c87909-kube-api-access-7rssx\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: 
\"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.429764 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.429721 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.429899 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.429767 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0ca7312-1b2f-4414-baf4-0e7960c87909-metrics-client-ca\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.429963 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.429903 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-grpc-tls\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.430556 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.430528 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0ca7312-1b2f-4414-baf4-0e7960c87909-metrics-client-ca\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.432169 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.432098 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.432275 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.432248 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-tls\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.432413 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.432395 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.432467 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.432422 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.432658 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.432637 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-grpc-tls\") pod 
\"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.432710 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.432657 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c0ca7312-1b2f-4414-baf4-0e7960c87909-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.437578 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.437558 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rssx\" (UniqueName: \"kubernetes.io/projected/c0ca7312-1b2f-4414-baf4-0e7960c87909-kube-api-access-7rssx\") pod \"thanos-querier-6886c85cbb-w654p\" (UID: \"c0ca7312-1b2f-4414-baf4-0e7960c87909\") " pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.577205 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.577163 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:13:58.692353 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.692282 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6886c85cbb-w654p"] Apr 17 09:13:58.695046 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:13:58.695021 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0ca7312_1b2f_4414_baf4_0e7960c87909.slice/crio-3a6f5816171acae6e5f9e59c695f251653f800de0e4e3e5e4304948fffe84bc2 WatchSource:0}: Error finding container 3a6f5816171acae6e5f9e59c695f251653f800de0e4e3e5e4304948fffe84bc2: Status 404 returned error can't find the container with id 3a6f5816171acae6e5f9e59c695f251653f800de0e4e3e5e4304948fffe84bc2 Apr 17 09:13:58.723130 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.723099 2567 generic.go:358] "Generic (PLEG): container finished" podID="0fba41d7-58a2-421f-89c9-5704fb7a073c" containerID="d781d2a2db14940d58b655383a3c9e6cfde5cfc46122d236f965c1c2def66096" exitCode=0 Apr 17 09:13:58.723495 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.723184 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rjhw8" event={"ID":"0fba41d7-58a2-421f-89c9-5704fb7a073c","Type":"ContainerDied","Data":"d781d2a2db14940d58b655383a3c9e6cfde5cfc46122d236f965c1c2def66096"} Apr 17 09:13:58.724272 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:58.724255 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" event={"ID":"c0ca7312-1b2f-4414-baf4-0e7960c87909","Type":"ContainerStarted","Data":"3a6f5816171acae6e5f9e59c695f251653f800de0e4e3e5e4304948fffe84bc2"} Apr 17 09:13:59.729820 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:59.729782 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rjhw8" 
event={"ID":"0fba41d7-58a2-421f-89c9-5704fb7a073c","Type":"ContainerStarted","Data":"b6eceb1de7561342780652d14f629bc20b122ee71ecd5d81c85daf4a645455a2"} Apr 17 09:13:59.729820 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:59.729820 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rjhw8" event={"ID":"0fba41d7-58a2-421f-89c9-5704fb7a073c","Type":"ContainerStarted","Data":"708d13ac11cf7f2513f13a8cc91816c1a55b1de6f0cff6c5c98d99a085b5db22"} Apr 17 09:13:59.751429 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:13:59.751376 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rjhw8" podStartSLOduration=2.823688839 podStartE2EDuration="3.751357405s" podCreationTimestamp="2026-04-17 09:13:56 +0000 UTC" firstStartedPulling="2026-04-17 09:13:57.215537509 +0000 UTC m=+170.540355463" lastFinishedPulling="2026-04-17 09:13:58.143206065 +0000 UTC m=+171.468024029" observedRunningTime="2026-04-17 09:13:59.749166665 +0000 UTC m=+173.073984641" watchObservedRunningTime="2026-04-17 09:13:59.751357405 +0000 UTC m=+173.076175382" Apr 17 09:14:00.695777 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:00.695749 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x7pf8" Apr 17 09:14:00.735185 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:00.735153 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" event={"ID":"c0ca7312-1b2f-4414-baf4-0e7960c87909","Type":"ContainerStarted","Data":"da68a2f999afe54a4e5b31d9b9cf188be4c3128ddac589927ba7fa1819936364"} Apr 17 09:14:00.735185 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:00.735189 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" 
event={"ID":"c0ca7312-1b2f-4414-baf4-0e7960c87909","Type":"ContainerStarted","Data":"bdaea509dcc063d780cfe835c40597deb8972d9403085936fd45f96525f571d9"} Apr 17 09:14:00.735649 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:00.735198 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" event={"ID":"c0ca7312-1b2f-4414-baf4-0e7960c87909","Type":"ContainerStarted","Data":"bed85b54e28561d5342fdd9b3f75714824a501f4c806ed763bac47e6f27e9230"} Apr 17 09:14:01.740292 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:01.740252 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" event={"ID":"c0ca7312-1b2f-4414-baf4-0e7960c87909","Type":"ContainerStarted","Data":"e4b85e3c78cafa751adade4ea251e4fb9babe31474ab99dd964ea690174990e2"} Apr 17 09:14:01.740717 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:01.740300 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" event={"ID":"c0ca7312-1b2f-4414-baf4-0e7960c87909","Type":"ContainerStarted","Data":"1e9d49e90ec57b00df73bcbca939a1bcd3e20e5bddec29761ac65b2b80c26537"} Apr 17 09:14:01.740717 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:01.740315 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" event={"ID":"c0ca7312-1b2f-4414-baf4-0e7960c87909","Type":"ContainerStarted","Data":"8c0ed06a828d40356f177b25582dedfd8a4bd4e43290c2fddf65cc3825fc9511"} Apr 17 09:14:01.740717 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:01.740407 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" Apr 17 09:14:01.764475 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:01.764386 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p" 
podStartSLOduration=0.926682375 podStartE2EDuration="3.764369734s" podCreationTimestamp="2026-04-17 09:13:58 +0000 UTC" firstStartedPulling="2026-04-17 09:13:58.696865635 +0000 UTC m=+172.021683589" lastFinishedPulling="2026-04-17 09:14:01.534552995 +0000 UTC m=+174.859370948" observedRunningTime="2026-04-17 09:14:01.762626388 +0000 UTC m=+175.087444364" watchObservedRunningTime="2026-04-17 09:14:01.764369734 +0000 UTC m=+175.089187716" Apr 17 09:14:02.482784 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.482748 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 09:14:02.487217 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.487195 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.491943 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.491918 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 09:14:02.493164 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.492958 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 09:14:02.493164 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.492982 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 09:14:02.493164 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.493002 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-x4ffg\"" Apr 17 09:14:02.493164 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.493091 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 09:14:02.493164 ip-10-0-128-212 kubenswrapper[2567]: I0417 
09:14:02.493011 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 09:14:02.493458 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.493248 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 09:14:02.493458 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.493434 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 09:14:02.493558 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.493460 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-epevmuls5p3q7\"" Apr 17 09:14:02.493606 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.493593 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 09:14:02.493685 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.493670 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 09:14:02.493802 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.493787 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 09:14:02.493866 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.493801 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 09:14:02.498117 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.498096 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 09:14:02.498504 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.498485 2567 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 09:14:02.503604 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.503580 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 09:14:02.565093 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565055 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-config-out\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565305 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565102 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565305 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565154 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565305 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565242 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-config\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565305 ip-10-0-128-212 
kubenswrapper[2567]: I0417 09:14:02.565276 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565524 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565524 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565346 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565524 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565384 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565524 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565414 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565524 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565454 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565524 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565477 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565524 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565505 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5q2n\" (UniqueName: \"kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-kube-api-access-c5q2n\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565833 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565533 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:14:02.565833 ip-10-0-128-212 
kubenswrapper[2567]: I0417 09:14:02.565569 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.565833 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565591 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.565833 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565648 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.565833 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565680 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.565833 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.565717 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-web-config\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666330 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666296 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666510 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666333 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666510 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666358 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5q2n\" (UniqueName: \"kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-kube-api-access-c5q2n\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666510 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666510 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666399 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666510 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666454 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666744 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666533 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666744 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666574 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666744 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666615 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-web-config\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666744 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666647 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-config-out\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666744 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666744 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666696 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.666744 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666731 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-config\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.667069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666785 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.667069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666820 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.667069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666849 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.667069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666889 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.667069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.666924 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.667341 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.667192 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.669567 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.669538 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-web-config\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.669967 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.669942 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.672628 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.670114 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.672725 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.670458 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.672725 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.670820 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.672725 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.671811 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.672725 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.672312 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.672725 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.672684 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.672982 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.672482 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.672982 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.672861 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.673206 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.673187 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.673312 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.673277 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.673934 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.673831 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.674113 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.674091 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-config-out\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.674230 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.674212 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-config\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.675254 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.675236 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.678809 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.678763 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5q2n\" (UniqueName: \"kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-kube-api-access-c5q2n\") pod \"prometheus-k8s-0\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.800101 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.799553 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:02.940162 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:02.940117 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 09:14:02.941778 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:14:02.941751 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf91824af_921e_40e1_b026_fd08ab9b26d3.slice/crio-201474fc9e27b758391273a3b1552cb011ee788dfcd9651aeaf90f0635346f9d WatchSource:0}: Error finding container 201474fc9e27b758391273a3b1552cb011ee788dfcd9651aeaf90f0635346f9d: Status 404 returned error can't find the container with id 201474fc9e27b758391273a3b1552cb011ee788dfcd9651aeaf90f0635346f9d
Apr 17 09:14:03.746531 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:03.746493 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerStarted","Data":"201474fc9e27b758391273a3b1552cb011ee788dfcd9651aeaf90f0635346f9d"}
Apr 17 09:14:04.750736 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:04.750704 2567 generic.go:358] "Generic (PLEG): container finished" podID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerID="61ec79a9b457ac856b9fabeef41d4bcd86f40f63e81b47a8bd48e9f9ce6faf02" exitCode=0
Apr 17 09:14:04.751063 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:04.750752 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerDied","Data":"61ec79a9b457ac856b9fabeef41d4bcd86f40f63e81b47a8bd48e9f9ce6faf02"}
Apr 17 09:14:06.672186 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:06.672154 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-54d7d6bc55-jchg9"
Apr 17 09:14:07.749705 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:07.749625 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6886c85cbb-w654p"
Apr 17 09:14:07.762231 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:07.762158 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerStarted","Data":"382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3"}
Apr 17 09:14:07.762231 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:07.762201 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerStarted","Data":"e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f"}
Apr 17 09:14:08.767889 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:08.767855 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerStarted","Data":"7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a"}
Apr 17 09:14:08.767889 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:08.767890 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerStarted","Data":"591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824"}
Apr 17 09:14:08.767889 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:08.767899 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerStarted","Data":"a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee"}
Apr 17 09:14:08.768431 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:08.767908 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerStarted","Data":"3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82"}
Apr 17 09:14:08.794913 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:08.794854 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.1744581099999998 podStartE2EDuration="6.794828886s" podCreationTimestamp="2026-04-17 09:14:02 +0000 UTC" firstStartedPulling="2026-04-17 09:14:02.943685114 +0000 UTC m=+176.268503067" lastFinishedPulling="2026-04-17 09:14:07.564055874 +0000 UTC m=+180.888873843" observedRunningTime="2026-04-17 09:14:08.793215087 +0000 UTC m=+182.118033064" watchObservedRunningTime="2026-04-17 09:14:08.794828886 +0000 UTC m=+182.119646863"
Apr 17 09:14:10.248031 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:10.247997 2567 scope.go:117] "RemoveContainer" containerID="ee2d8bc12d1cf8940dfd348ecbc5514aa023711aa839dfe4000912b2d071ae27"
Apr 17 09:14:10.774674 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:10.774648 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/2.log"
Apr 17 09:14:10.774831 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:10.774755 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" event={"ID":"02e41852-3045-415d-bbb7-fc08dc3cfe5f","Type":"ContainerStarted","Data":"47bc87a01eed469d079122fc9721f9f5929f509a4624b36ff76d7371e877d1ab"}
Apr 17 09:14:10.775045 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:10.775020 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d"
Apr 17 09:14:10.792327 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:10.792281 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d" podStartSLOduration=56.028262117 podStartE2EDuration="57.79226749s" podCreationTimestamp="2026-04-17 09:13:13 +0000 UTC" firstStartedPulling="2026-04-17 09:13:13.65110623 +0000 UTC m=+126.975924193" lastFinishedPulling="2026-04-17 09:13:15.415111598 +0000 UTC m=+128.739929566" observedRunningTime="2026-04-17 09:14:10.791537891 +0000 UTC m=+184.116355867" watchObservedRunningTime="2026-04-17 09:14:10.79226749 +0000 UTC m=+184.117085460"
Apr 17 09:14:11.596662 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:11.596632 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-wgk7d"
Apr 17 09:14:12.800281 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:12.800256 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:14:30.830478 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:30.830446 2567 generic.go:358] "Generic (PLEG): container finished" podID="fb2625d8-aa03-43bc-a98b-f09bcdd8bf65" containerID="f0e27f83f6c3234747b9cac3bbfd420b6a84b9a7fbc907df3b0bdb4359279e9e" exitCode=0
Apr 17 09:14:30.830889 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:30.830486 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6" event={"ID":"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65","Type":"ContainerDied","Data":"f0e27f83f6c3234747b9cac3bbfd420b6a84b9a7fbc907df3b0bdb4359279e9e"}
Apr 17 09:14:30.830889 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:30.830790 2567 scope.go:117] "RemoveContainer" containerID="f0e27f83f6c3234747b9cac3bbfd420b6a84b9a7fbc907df3b0bdb4359279e9e"
Apr 17 09:14:31.836001 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:14:31.835969 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-bjjz6" event={"ID":"fb2625d8-aa03-43bc-a98b-f09bcdd8bf65","Type":"ContainerStarted","Data":"321f82f86883d61d6fc7b841e59365034f590236a1435fc2c36985b4bf54683e"}
Apr 17 09:15:02.800942 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:02.800909 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:15:02.816276 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:02.816251 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:15:02.940636 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:02.940599 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:15:18.910012 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:18.909976 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:15:18.912291 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:18.912271 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5029b845-d556-4306-b1bb-4c6373b7e4be-metrics-certs\") pod \"network-metrics-daemon-qpnfd\" (UID: \"5029b845-d556-4306-b1bb-4c6373b7e4be\") " pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:15:19.152186 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:19.152154 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wbqzc\""
Apr 17 09:15:19.159584 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:19.159565 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpnfd"
Apr 17 09:15:19.274484 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:19.274460 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qpnfd"]
Apr 17 09:15:19.276919 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:15:19.276891 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5029b845_d556_4306_b1bb_4c6373b7e4be.slice/crio-146304a07deb0d534536e1b5738760a3fae81a6173c8ce8c00490c64ab816cc1 WatchSource:0}: Error finding container 146304a07deb0d534536e1b5738760a3fae81a6173c8ce8c00490c64ab816cc1: Status 404 returned error can't find the container with id 146304a07deb0d534536e1b5738760a3fae81a6173c8ce8c00490c64ab816cc1
Apr 17 09:15:19.972073 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:19.972034 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qpnfd" event={"ID":"5029b845-d556-4306-b1bb-4c6373b7e4be","Type":"ContainerStarted","Data":"146304a07deb0d534536e1b5738760a3fae81a6173c8ce8c00490c64ab816cc1"}
Apr 17 09:15:20.880125 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:20.880023 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 09:15:20.880684 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:20.880649 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="prometheus" containerID="cri-o://e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f" gracePeriod=600
Apr 17 09:15:20.880826 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:20.880669 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy" containerID="cri-o://591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824" gracePeriod=600
Apr 17 09:15:20.880826 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:20.880792 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy-thanos" containerID="cri-o://7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a" gracePeriod=600
Apr 17 09:15:20.880935 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:20.880827 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="config-reloader" containerID="cri-o://382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3" gracePeriod=600
Apr 17 09:15:20.880935 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:20.880894 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy-web" containerID="cri-o://a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee" gracePeriod=600
Apr 17 09:15:20.880935 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:20.880916 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="thanos-sidecar" containerID="cri-o://3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82" gracePeriod=600
Apr 17 09:15:20.977955 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:20.977101 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qpnfd" event={"ID":"5029b845-d556-4306-b1bb-4c6373b7e4be","Type":"ContainerStarted","Data":"125f23a1ef3303da7a4935021bd0ad8d543495ba99435bf26ad9f734ae367c5c"}
Apr 17 09:15:20.977955 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:20.977176 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qpnfd" event={"ID":"5029b845-d556-4306-b1bb-4c6373b7e4be","Type":"ContainerStarted","Data":"e86d89413de95607706bf55aeb0de1d56b30376ffeca7868189dab742e144c72"}
Apr 17 09:15:20.996127 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:20.996072 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qpnfd" podStartSLOduration=252.901530044 podStartE2EDuration="4m13.996052748s" podCreationTimestamp="2026-04-17 09:11:07 +0000 UTC" firstStartedPulling="2026-04-17 09:15:19.278575808 +0000 UTC m=+252.603393763" lastFinishedPulling="2026-04-17 09:15:20.373098514 +0000 UTC m=+253.697916467" observedRunningTime="2026-04-17 09:15:20.994461886 +0000 UTC m=+254.319279887" watchObservedRunningTime="2026-04-17 09:15:20.996052748 +0000 UTC m=+254.320870726"
Apr 17 09:15:21.982033 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:21.982009 2567 generic.go:358] "Generic (PLEG): container finished" podID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerID="7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a" exitCode=0
Apr 17 09:15:21.982033 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:21.982030 2567 generic.go:358] "Generic (PLEG): container finished" podID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerID="591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824" exitCode=0
Apr 17 09:15:21.982033 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:21.982036 2567 generic.go:358] "Generic (PLEG): container finished" podID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerID="3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82" exitCode=0
Apr 17 09:15:21.982402 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:21.982041 2567 generic.go:358] "Generic (PLEG): container finished" podID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerID="382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3" exitCode=0
Apr 17 09:15:21.982402 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:21.982046 2567 generic.go:358] "Generic (PLEG): container finished" podID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerID="e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f" exitCode=0
Apr 17 09:15:21.982402 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:21.982084 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerDied","Data":"7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a"}
Apr 17 09:15:21.982402 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:21.982118 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerDied","Data":"591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824"}
Apr 17 09:15:21.982402 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:21.982129 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerDied","Data":"3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82"}
Apr 17 09:15:21.982402 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:21.982156 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerDied","Data":"382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3"}
Apr 17 09:15:21.982402 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:21.982165 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerDied","Data":"e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f"}
Apr 17 09:15:22.131931 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.131912 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:15:22.240316 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240235 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-trusted-ca-bundle\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240316 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240269 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240316 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240292 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-config\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240316 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240312 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-rulefiles-0\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240338 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-db\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240359 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-grpc-tls\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240379 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-tls\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240394 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-serving-certs-ca-bundle\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240410 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-tls-assets\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240443 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240470 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-metrics-client-ca\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240494 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-thanos-prometheus-http-client-file\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240545 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5q2n\" (UniqueName: \"kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-kube-api-access-c5q2n\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240583 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-web-config\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") "
Apr 17 09:15:22.240616 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240619 2567
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-kube-rbac-proxy\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " Apr 17 09:15:22.241172 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240646 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-metrics-client-certs\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " Apr 17 09:15:22.241172 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240723 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-config-out\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " Apr 17 09:15:22.241172 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240740 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:15:22.241172 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.240754 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-kubelet-serving-ca-bundle\") pod \"f91824af-921e-40e1-b026-fd08ab9b26d3\" (UID: \"f91824af-921e-40e1-b026-fd08ab9b26d3\") " Apr 17 09:15:22.241172 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.241033 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-trusted-ca-bundle\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.241434 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.241383 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 09:15:22.241501 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.241436 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:15:22.242219 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.241926 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:15:22.243466 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.243227 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-config" (OuterVolumeSpecName: "config") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:22.243466 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.243331 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:22.243800 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.243703 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:22.243800 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.243766 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:15:22.244014 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.243809 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:22.244014 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.243835 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:22.244152 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.244039 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:15:22.244326 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.244304 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:15:22.244796 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.244757 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-kube-api-access-c5q2n" (OuterVolumeSpecName: "kube-api-access-c5q2n") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "kube-api-access-c5q2n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 09:15:22.244878 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.244833 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:22.245702 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.245674 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:22.245973 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.245949 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:22.246042 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.245958 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-config-out" (OuterVolumeSpecName: "config-out") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 09:15:22.255473 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.255453 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-web-config" (OuterVolumeSpecName: "web-config") pod "f91824af-921e-40e1-b026-fd08ab9b26d3" (UID: "f91824af-921e-40e1-b026-fd08ab9b26d3"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 09:15:22.341755 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341724 2567 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-config-out\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341755 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341756 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341755 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341767 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341777 2567 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-config\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341786 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341795 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f91824af-921e-40e1-b026-fd08ab9b26d3-prometheus-k8s-db\") on node 
\"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341804 2567 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-grpc-tls\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341815 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-tls\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341825 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341834 2567 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-tls-assets\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341842 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341851 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f91824af-921e-40e1-b026-fd08ab9b26d3-configmap-metrics-client-ca\") on node 
\"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341860 2567 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-thanos-prometheus-http-client-file\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341868 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c5q2n\" (UniqueName: \"kubernetes.io/projected/f91824af-921e-40e1-b026-fd08ab9b26d3-kube-api-access-c5q2n\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341876 2567 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-web-config\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341886 2567 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-kube-rbac-proxy\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.341948 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.341896 2567 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f91824af-921e-40e1-b026-fd08ab9b26d3-secret-metrics-client-certs\") on node \"ip-10-0-128-212.ec2.internal\" DevicePath \"\"" Apr 17 09:15:22.987722 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.987684 2567 generic.go:358] "Generic (PLEG): container finished" podID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerID="a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee" exitCode=0 Apr 17 09:15:22.988094 ip-10-0-128-212 
kubenswrapper[2567]: I0417 09:15:22.987763 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerDied","Data":"a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee"} Apr 17 09:15:22.988094 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.987794 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f91824af-921e-40e1-b026-fd08ab9b26d3","Type":"ContainerDied","Data":"201474fc9e27b758391273a3b1552cb011ee788dfcd9651aeaf90f0635346f9d"} Apr 17 09:15:22.988094 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.987811 2567 scope.go:117] "RemoveContainer" containerID="7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a" Apr 17 09:15:22.988094 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.987822 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:22.995920 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:22.995600 2567 scope.go:117] "RemoveContainer" containerID="591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824" Apr 17 09:15:23.002325 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.002119 2567 scope.go:117] "RemoveContainer" containerID="a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee" Apr 17 09:15:23.009209 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.009190 2567 scope.go:117] "RemoveContainer" containerID="3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82" Apr 17 09:15:23.013080 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.013056 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 09:15:23.016554 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.016503 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 
09:15:23.016600 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.016580 2567 scope.go:117] "RemoveContainer" containerID="382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3" Apr 17 09:15:23.022664 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.022649 2567 scope.go:117] "RemoveContainer" containerID="e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f" Apr 17 09:15:23.030852 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.030836 2567 scope.go:117] "RemoveContainer" containerID="61ec79a9b457ac856b9fabeef41d4bcd86f40f63e81b47a8bd48e9f9ce6faf02" Apr 17 09:15:23.036877 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.036862 2567 scope.go:117] "RemoveContainer" containerID="7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a" Apr 17 09:15:23.037152 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:15:23.037101 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a\": container with ID starting with 7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a not found: ID does not exist" containerID="7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a" Apr 17 09:15:23.037241 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.037154 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a"} err="failed to get container status \"7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a\": rpc error: code = NotFound desc = could not find container \"7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a\": container with ID starting with 7565d1fd193d49f592776857e150ce063ba9d6dba17b1a363c8d6c70a7d4a06a not found: ID does not exist" Apr 17 09:15:23.037241 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.037197 2567 scope.go:117] "RemoveContainer" 
containerID="591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824" Apr 17 09:15:23.037457 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:15:23.037438 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824\": container with ID starting with 591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824 not found: ID does not exist" containerID="591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824" Apr 17 09:15:23.037506 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.037480 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824"} err="failed to get container status \"591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824\": rpc error: code = NotFound desc = could not find container \"591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824\": container with ID starting with 591cb313a8a6dd3452e9934439d9a1b32b42f4e676f69cad6f4e400d49c0d824 not found: ID does not exist" Apr 17 09:15:23.037506 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.037498 2567 scope.go:117] "RemoveContainer" containerID="a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee" Apr 17 09:15:23.037706 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:15:23.037691 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee\": container with ID starting with a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee not found: ID does not exist" containerID="a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee" Apr 17 09:15:23.037750 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.037709 2567 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee"} err="failed to get container status \"a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee\": rpc error: code = NotFound desc = could not find container \"a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee\": container with ID starting with a119e0c47a475adefea6f86766e497c914e28e34559b520a0f50aa5d89d01dee not found: ID does not exist" Apr 17 09:15:23.037750 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.037726 2567 scope.go:117] "RemoveContainer" containerID="3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82" Apr 17 09:15:23.037924 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:15:23.037909 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82\": container with ID starting with 3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82 not found: ID does not exist" containerID="3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82" Apr 17 09:15:23.037966 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.037930 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82"} err="failed to get container status \"3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82\": rpc error: code = NotFound desc = could not find container \"3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82\": container with ID starting with 3c2b9646844c1249b8de3a8467a57640f40b3ffb8a2328f95836a5420be3fa82 not found: ID does not exist" Apr 17 09:15:23.037966 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.037946 2567 scope.go:117] "RemoveContainer" containerID="382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3" Apr 17 09:15:23.038185 
ip-10-0-128-212 kubenswrapper[2567]: E0417 09:15:23.038170 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3\": container with ID starting with 382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3 not found: ID does not exist" containerID="382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3" Apr 17 09:15:23.038244 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.038187 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3"} err="failed to get container status \"382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3\": rpc error: code = NotFound desc = could not find container \"382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3\": container with ID starting with 382fdf19f7a456d5e6d495fbe38e398e0209ada8bdf566054ea7ef9d33c85ed3 not found: ID does not exist" Apr 17 09:15:23.038244 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.038199 2567 scope.go:117] "RemoveContainer" containerID="e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f" Apr 17 09:15:23.038421 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:15:23.038405 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f\": container with ID starting with e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f not found: ID does not exist" containerID="e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f" Apr 17 09:15:23.038465 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.038426 2567 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f"} err="failed to get container status \"e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f\": rpc error: code = NotFound desc = could not find container \"e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f\": container with ID starting with e6ac8678d8319e6d2472a69bed5eb0325cfe35118007bcc6b239b3f1eb97780f not found: ID does not exist" Apr 17 09:15:23.038465 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.038440 2567 scope.go:117] "RemoveContainer" containerID="61ec79a9b457ac856b9fabeef41d4bcd86f40f63e81b47a8bd48e9f9ce6faf02" Apr 17 09:15:23.038663 ip-10-0-128-212 kubenswrapper[2567]: E0417 09:15:23.038647 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61ec79a9b457ac856b9fabeef41d4bcd86f40f63e81b47a8bd48e9f9ce6faf02\": container with ID starting with 61ec79a9b457ac856b9fabeef41d4bcd86f40f63e81b47a8bd48e9f9ce6faf02 not found: ID does not exist" containerID="61ec79a9b457ac856b9fabeef41d4bcd86f40f63e81b47a8bd48e9f9ce6faf02" Apr 17 09:15:23.038704 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.038665 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ec79a9b457ac856b9fabeef41d4bcd86f40f63e81b47a8bd48e9f9ce6faf02"} err="failed to get container status \"61ec79a9b457ac856b9fabeef41d4bcd86f40f63e81b47a8bd48e9f9ce6faf02\": rpc error: code = NotFound desc = could not find container \"61ec79a9b457ac856b9fabeef41d4bcd86f40f63e81b47a8bd48e9f9ce6faf02\": container with ID starting with 61ec79a9b457ac856b9fabeef41d4bcd86f40f63e81b47a8bd48e9f9ce6faf02 not found: ID does not exist" Apr 17 09:15:23.042861 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.042841 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: 
I0417 09:15:23.043196 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="init-config-reloader" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043214 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="init-config-reloader" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043228 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy-web" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043237 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy-web" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043250 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="thanos-sidecar" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043259 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="thanos-sidecar" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043268 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy-thanos" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043276 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy-thanos" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043287 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="prometheus" 
Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043295 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="prometheus" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043304 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="config-reloader" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043311 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="config-reloader" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043338 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043346 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043414 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043425 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="config-reloader" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043436 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="prometheus" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043447 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy-web" Apr 17 
09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043455 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="thanos-sidecar" Apr 17 09:15:23.043916 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.043466 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" containerName="kube-rbac-proxy-thanos" Apr 17 09:15:23.049560 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.049540 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.052708 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.052690 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 09:15:23.053062 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.053045 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 09:15:23.053671 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.053373 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 09:15:23.053671 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.053424 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 09:15:23.053671 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.053424 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 09:15:23.053671 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.053430 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 09:15:23.053671 ip-10-0-128-212 kubenswrapper[2567]: I0417 
09:15:23.053550 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 09:15:23.053671 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.053484 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-epevmuls5p3q7\"" Apr 17 09:15:23.053992 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.053883 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 09:15:23.053992 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.053959 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 09:15:23.054078 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.053988 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-x4ffg\"" Apr 17 09:15:23.054078 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.053989 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 09:15:23.054456 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.054439 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 09:15:23.057548 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.057528 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 09:15:23.059659 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.059636 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 09:15:23.061052 ip-10-0-128-212 kubenswrapper[2567]: 
I0417 09:15:23.061028 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 09:15:23.147602 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147567 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-config\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.147758 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147610 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.147758 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147675 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.147758 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147727 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/786b653f-ad7d-4090-a4e7-8867fa11147a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.147758 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147747 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.147912 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147768 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.147912 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147816 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.147912 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147862 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.147912 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147890 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/786b653f-ad7d-4090-a4e7-8867fa11147a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.147912 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147905 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2f9p\" (UniqueName: \"kubernetes.io/projected/786b653f-ad7d-4090-a4e7-8867fa11147a-kube-api-access-w2f9p\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.148082 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147923 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.148082 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.147941 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-web-config\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.148082 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.148009 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/786b653f-ad7d-4090-a4e7-8867fa11147a-config-out\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.148082 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.148035 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.148082 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.148055 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.148495 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.148085 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.148495 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.148101 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.148495 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.148117 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249531 ip-10-0-128-212 
kubenswrapper[2567]: I0417 09:15:23.249443 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2f9p\" (UniqueName: \"kubernetes.io/projected/786b653f-ad7d-4090-a4e7-8867fa11147a-kube-api-access-w2f9p\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249531 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249483 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249531 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249503 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-web-config\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249531 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249527 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/786b653f-ad7d-4090-a4e7-8867fa11147a-config-out\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249829 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249548 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
09:15:23.249829 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249573 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249829 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249614 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249829 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249640 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249829 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249664 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249829 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249690 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-config\") pod \"prometheus-k8s-0\" (UID: 
\"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249829 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249720 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249829 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249756 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.249829 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249813 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/786b653f-ad7d-4090-a4e7-8867fa11147a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.250368 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249841 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.250368 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249874 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.250368 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249909 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.250368 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249944 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.250368 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.249977 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/786b653f-ad7d-4090-a4e7-8867fa11147a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.251334 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.251015 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.252554 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.252420 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/786b653f-ad7d-4090-a4e7-8867fa11147a-config-out\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.252667 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.252575 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-config\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.252745 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.252714 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.252847 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.252823 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-web-config\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.253332 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.253303 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.253580 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.253554 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.253658 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.253606 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.254048 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.253819 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/786b653f-ad7d-4090-a4e7-8867fa11147a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.254156 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.254081 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91824af-921e-40e1-b026-fd08ab9b26d3" path="/var/lib/kubelet/pods/f91824af-921e-40e1-b026-fd08ab9b26d3/volumes" Apr 17 09:15:23.254706 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.254468 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.254706 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.254628 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" 
(UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.255349 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.255326 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/786b653f-ad7d-4090-a4e7-8867fa11147a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.255758 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.255730 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.255866 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.255765 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.255983 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.255965 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:15:23.256330 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.256314 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/786b653f-ad7d-4090-a4e7-8867fa11147a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:15:23.256852 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.256834 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/786b653f-ad7d-4090-a4e7-8867fa11147a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:15:23.258640 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.258624 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2f9p\" (UniqueName: \"kubernetes.io/projected/786b653f-ad7d-4090-a4e7-8867fa11147a-kube-api-access-w2f9p\") pod \"prometheus-k8s-0\" (UID: \"786b653f-ad7d-4090-a4e7-8867fa11147a\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:15:23.360165 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.360121 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:15:23.484284 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.484258 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 09:15:23.486534 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:15:23.486496 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod786b653f_ad7d_4090_a4e7_8867fa11147a.slice/crio-ba0ab79961d31e93302999567df46f3eb2f03caaa73572ac159c584f3212357b WatchSource:0}: Error finding container ba0ab79961d31e93302999567df46f3eb2f03caaa73572ac159c584f3212357b: Status 404 returned error can't find the container with id ba0ab79961d31e93302999567df46f3eb2f03caaa73572ac159c584f3212357b
Apr 17 09:15:23.993446 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.993411 2567 generic.go:358] "Generic (PLEG): container finished" podID="786b653f-ad7d-4090-a4e7-8867fa11147a" containerID="d021031f66dbc6892ecbed498f1583f952ae5544912299ed1a4e8f98f1f6cf71" exitCode=0
Apr 17 09:15:23.993807 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.993473 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"786b653f-ad7d-4090-a4e7-8867fa11147a","Type":"ContainerDied","Data":"d021031f66dbc6892ecbed498f1583f952ae5544912299ed1a4e8f98f1f6cf71"}
Apr 17 09:15:23.993807 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:23.993498 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"786b653f-ad7d-4090-a4e7-8867fa11147a","Type":"ContainerStarted","Data":"ba0ab79961d31e93302999567df46f3eb2f03caaa73572ac159c584f3212357b"}
Apr 17 09:15:24.999101 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:24.999069 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"786b653f-ad7d-4090-a4e7-8867fa11147a","Type":"ContainerStarted","Data":"3c19d601b4954e488b490c9447694f9c867860ed5472d75c8ccd269b39754ed8"}
Apr 17 09:15:24.999101 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:24.999106 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"786b653f-ad7d-4090-a4e7-8867fa11147a","Type":"ContainerStarted","Data":"f31ea8627dc09afa7485f17a018024efb00ba4dadbb4ccb251bc876ed702fc29"}
Apr 17 09:15:24.999503 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:24.999118 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"786b653f-ad7d-4090-a4e7-8867fa11147a","Type":"ContainerStarted","Data":"bc0f916fb92b9f4b8b3d41ccb0bd6e78141f8a0d5d8cb59df04aa6554e37eee6"}
Apr 17 09:15:24.999503 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:24.999128 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"786b653f-ad7d-4090-a4e7-8867fa11147a","Type":"ContainerStarted","Data":"7646893f05933e5ec73a1ba80f11769340f4ce1f85e3e1778851247266d0ebf2"}
Apr 17 09:15:24.999503 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:24.999158 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"786b653f-ad7d-4090-a4e7-8867fa11147a","Type":"ContainerStarted","Data":"5a1be46794aa1e94a0122b74cedd10fde1a4aad4922f9a00a5ec4bf08d2dc78f"}
Apr 17 09:15:24.999503 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:24.999167 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"786b653f-ad7d-4090-a4e7-8867fa11147a","Type":"ContainerStarted","Data":"8970f9972af5ecb6170c30ec72e8cae84aed8fe1a82abfef96b33118ec274de4"}
Apr 17 09:15:25.028244 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:25.028123 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.02811032 podStartE2EDuration="2.02811032s" podCreationTimestamp="2026-04-17 09:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:15:25.026241111 +0000 UTC m=+258.351059086" watchObservedRunningTime="2026-04-17 09:15:25.02811032 +0000 UTC m=+258.352928296"
Apr 17 09:15:28.360289 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:15:28.360203 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:16:07.089566 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:16:07.089532 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/2.log"
Apr 17 09:16:07.092955 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:16:07.092923 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/2.log"
Apr 17 09:16:07.104269 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:16:07.104245 2567 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 09:16:23.360975 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:16:23.360943 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:16:23.375317 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:16:23.375296 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:16:24.197219 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:16:24.197188 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:18:55.674512 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.674475 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"]
Apr 17 09:18:55.677794 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.677776 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.682041 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.682015 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\""
Apr 17 09:18:55.682041 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.682034 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-nfqt6\""
Apr 17 09:18:55.682254 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.682044 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\""
Apr 17 09:18:55.682254 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.682070 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\""
Apr 17 09:18:55.682254 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.682034 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 17 09:18:55.682254 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.682098 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 17 09:18:55.688177 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.688154 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"]
Apr 17 09:18:55.859804 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.859776 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01f7ae8d-479d-40dd-8515-b3eb5de23c52-cert\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.859957 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.859815 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29lcp\" (UniqueName: \"kubernetes.io/projected/01f7ae8d-479d-40dd-8515-b3eb5de23c52-kube-api-access-29lcp\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.859957 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.859854 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/01f7ae8d-479d-40dd-8515-b3eb5de23c52-manager-config\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.859957 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.859877 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01f7ae8d-479d-40dd-8515-b3eb5de23c52-metrics-certs\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.960516 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.960483 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01f7ae8d-479d-40dd-8515-b3eb5de23c52-cert\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.960638 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.960527 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29lcp\" (UniqueName: \"kubernetes.io/projected/01f7ae8d-479d-40dd-8515-b3eb5de23c52-kube-api-access-29lcp\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.960638 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.960583 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/01f7ae8d-479d-40dd-8515-b3eb5de23c52-manager-config\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.960638 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.960617 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01f7ae8d-479d-40dd-8515-b3eb5de23c52-metrics-certs\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.961267 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.961243 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/01f7ae8d-479d-40dd-8515-b3eb5de23c52-manager-config\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.962984 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.962957 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01f7ae8d-479d-40dd-8515-b3eb5de23c52-metrics-certs\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.962984 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.962981 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01f7ae8d-479d-40dd-8515-b3eb5de23c52-cert\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.969103 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.969080 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29lcp\" (UniqueName: \"kubernetes.io/projected/01f7ae8d-479d-40dd-8515-b3eb5de23c52-kube-api-access-29lcp\") pod \"jobset-controller-manager-6854ddcbb-5p2bs\" (UID: \"01f7ae8d-479d-40dd-8515-b3eb5de23c52\") " pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:55.987962 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:55.987942 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:56.104117 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:56.104087 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"]
Apr 17 09:18:56.110931 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:18:56.110894 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f7ae8d_479d_40dd_8515_b3eb5de23c52.slice/crio-a1a425308c1f0ba6abc14110a0db2eb70cd647d0588985d347a2bb4525966658 WatchSource:0}: Error finding container a1a425308c1f0ba6abc14110a0db2eb70cd647d0588985d347a2bb4525966658: Status 404 returned error can't find the container with id a1a425308c1f0ba6abc14110a0db2eb70cd647d0588985d347a2bb4525966658
Apr 17 09:18:56.112673 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:56.112655 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 09:18:56.605310 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:56.605275 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs" event={"ID":"01f7ae8d-479d-40dd-8515-b3eb5de23c52","Type":"ContainerStarted","Data":"a1a425308c1f0ba6abc14110a0db2eb70cd647d0588985d347a2bb4525966658"}
Apr 17 09:18:59.616705 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:59.616661 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs" event={"ID":"01f7ae8d-479d-40dd-8515-b3eb5de23c52","Type":"ContainerStarted","Data":"49ac8ee6875fb50879c35be704017ddb003b432c8557cf337169787fc05c6845"}
Apr 17 09:18:59.617101 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:59.616804 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:18:59.634471 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:18:59.634429 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs" podStartSLOduration=1.988890639 podStartE2EDuration="4.634416288s" podCreationTimestamp="2026-04-17 09:18:55 +0000 UTC" firstStartedPulling="2026-04-17 09:18:56.112780684 +0000 UTC m=+469.437598638" lastFinishedPulling="2026-04-17 09:18:58.75830632 +0000 UTC m=+472.083124287" observedRunningTime="2026-04-17 09:18:59.632335445 +0000 UTC m=+472.957153420" watchObservedRunningTime="2026-04-17 09:18:59.634416288 +0000 UTC m=+472.959234263"
Apr 17 09:19:10.624578 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:19:10.624543 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-6854ddcbb-5p2bs"
Apr 17 09:20:28.391805 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:28.391777 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vqkm2_cad943ff-ebd7-4dca-aac3-600408e2153a/global-pull-secret-syncer/0.log"
Apr 17 09:20:28.508029 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:28.508003 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-gcm2d_3ff0b766-7823-4f42-b84c-f3c1ac91941c/konnectivity-agent/0.log"
Apr 17 09:20:28.566164 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:28.566122 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-212.ec2.internal_649f656ea529e6ecb6cb9249a18750c7/haproxy/0.log"
Apr 17 09:20:32.418332 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.418203 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rjhw8_0fba41d7-58a2-421f-89c9-5704fb7a073c/node-exporter/0.log"
Apr 17 09:20:32.442462 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.442440 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rjhw8_0fba41d7-58a2-421f-89c9-5704fb7a073c/kube-rbac-proxy/0.log"
Apr 17 09:20:32.466662 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.466640 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rjhw8_0fba41d7-58a2-421f-89c9-5704fb7a073c/init-textfile/0.log"
Apr 17 09:20:32.630285 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.630253 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_786b653f-ad7d-4090-a4e7-8867fa11147a/prometheus/0.log"
Apr 17 09:20:32.652547 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.652521 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_786b653f-ad7d-4090-a4e7-8867fa11147a/config-reloader/0.log"
Apr 17 09:20:32.673613 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.673515 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_786b653f-ad7d-4090-a4e7-8867fa11147a/thanos-sidecar/0.log"
Apr 17 09:20:32.696812 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.696763 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_786b653f-ad7d-4090-a4e7-8867fa11147a/kube-rbac-proxy-web/0.log"
Apr 17 09:20:32.720003 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.719983 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_786b653f-ad7d-4090-a4e7-8867fa11147a/kube-rbac-proxy/0.log"
Apr 17 09:20:32.744958 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.744936 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_786b653f-ad7d-4090-a4e7-8867fa11147a/kube-rbac-proxy-thanos/0.log"
Apr 17 09:20:32.768572 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.768556 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_786b653f-ad7d-4090-a4e7-8867fa11147a/init-config-reloader/0.log"
Apr 17 09:20:32.799702 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.799684 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rfsj8_b2cd0079-2e4b-4644-b18a-532ddc80ab8f/prometheus-operator/0.log"
Apr 17 09:20:32.820417 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.820399 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rfsj8_b2cd0079-2e4b-4644-b18a-532ddc80ab8f/kube-rbac-proxy/0.log"
Apr 17 09:20:32.847190 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:32.847169 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-b9j8t_5dd39259-8f99-4b9d-b099-673b69e16722/prometheus-operator-admission-webhook/0.log"
Apr 17 09:20:33.004840 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:33.004812 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6886c85cbb-w654p_c0ca7312-1b2f-4414-baf4-0e7960c87909/thanos-query/0.log"
Apr 17 09:20:33.034808 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:33.034783 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6886c85cbb-w654p_c0ca7312-1b2f-4414-baf4-0e7960c87909/kube-rbac-proxy-web/0.log"
Apr 17 09:20:33.060451 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:33.060426 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6886c85cbb-w654p_c0ca7312-1b2f-4414-baf4-0e7960c87909/kube-rbac-proxy/0.log"
Apr 17 09:20:33.088535 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:33.088516 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6886c85cbb-w654p_c0ca7312-1b2f-4414-baf4-0e7960c87909/prom-label-proxy/0.log"
Apr 17 09:20:33.116448 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:33.116415 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6886c85cbb-w654p_c0ca7312-1b2f-4414-baf4-0e7960c87909/kube-rbac-proxy-rules/0.log"
Apr 17 09:20:33.145025 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:33.145004 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6886c85cbb-w654p_c0ca7312-1b2f-4414-baf4-0e7960c87909/kube-rbac-proxy-metrics/0.log"
Apr 17 09:20:34.560045 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:34.560012 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/2.log"
Apr 17 09:20:34.563988 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:34.563970 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-wgk7d_02e41852-3045-415d-bbb7-fc08dc3cfe5f/console-operator/3.log"
Apr 17 09:20:35.102675 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.102638 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"]
Apr 17 09:20:35.105913 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.105890 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.108932 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.108909 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-v8842\"/\"openshift-service-ca.crt\""
Apr 17 09:20:35.109314 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.109299 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-v8842\"/\"kube-root-ca.crt\""
Apr 17 09:20:35.110335 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.110317 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-v8842\"/\"default-dockercfg-mxnlg\""
Apr 17 09:20:35.122217 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.122197 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"]
Apr 17 09:20:35.227654 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.227606 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-podres\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.227654 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.227656 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn9gc\" (UniqueName: \"kubernetes.io/projected/7c32ca9d-8e87-45b9-abb0-329417924681-kube-api-access-zn9gc\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.227896 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.227686 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-proc\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.227896 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.227752 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-sys\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.227896 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.227803 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-lib-modules\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.328834 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.328796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-podres\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.328834 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.328837 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zn9gc\" (UniqueName: \"kubernetes.io/projected/7c32ca9d-8e87-45b9-abb0-329417924681-kube-api-access-zn9gc\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.329069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.328862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-proc\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.329069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.328918 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-sys\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.329069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.328946 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-lib-modules\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.329069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.328983 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-podres\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.329069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.329018 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-proc\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.329069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.329024 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-sys\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.329069 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.329053 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c32ca9d-8e87-45b9-abb0-329417924681-lib-modules\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.338509 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.338481 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn9gc\" (UniqueName: \"kubernetes.io/projected/7c32ca9d-8e87-45b9-abb0-329417924681-kube-api-access-zn9gc\") pod \"perf-node-gather-daemonset-7phvv\" (UID: \"7c32ca9d-8e87-45b9-abb0-329417924681\") " pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.415913 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.415859 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.446357 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.446328 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-njz5p_9b0e3808-bbdf-40aa-aa58-f60d8dae2657/volume-data-source-validator/0.log"
Apr 17 09:20:35.537414 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.537360 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"]
Apr 17 09:20:35.540087 ip-10-0-128-212 kubenswrapper[2567]: W0417 09:20:35.540056 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7c32ca9d_8e87_45b9_abb0_329417924681.slice/crio-a8a71eb421e8e152d0e1c9cf37d3c6e8ed1b2ff85fc47da878a1796909b631d1 WatchSource:0}: Error finding container a8a71eb421e8e152d0e1c9cf37d3c6e8ed1b2ff85fc47da878a1796909b631d1: Status 404 returned error can't find the container with id a8a71eb421e8e152d0e1c9cf37d3c6e8ed1b2ff85fc47da878a1796909b631d1
Apr 17 09:20:35.896954 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.896922 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv" event={"ID":"7c32ca9d-8e87-45b9-abb0-329417924681","Type":"ContainerStarted","Data":"bb8f4a91e7475d3185ccc51ea7a50706040b49944203ba1c726291f005f06577"}
Apr 17 09:20:35.896954 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.896956 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv" event={"ID":"7c32ca9d-8e87-45b9-abb0-329417924681","Type":"ContainerStarted","Data":"a8a71eb421e8e152d0e1c9cf37d3c6e8ed1b2ff85fc47da878a1796909b631d1"}
Apr 17 09:20:35.897355 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.897092 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:35.916377 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:35.916334 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv" podStartSLOduration=0.916320302 podStartE2EDuration="916.320302ms" podCreationTimestamp="2026-04-17 09:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:20:35.914720224 +0000 UTC m=+569.239538211" watchObservedRunningTime="2026-04-17 09:20:35.916320302 +0000 UTC m=+569.241138278"
Apr 17 09:20:36.196388 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:36.196366 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x7pf8_03aa1efb-f86a-42c4-b326-2a97e8287120/dns/0.log"
Apr 17 09:20:36.220315 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:36.220296 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x7pf8_03aa1efb-f86a-42c4-b326-2a97e8287120/kube-rbac-proxy/0.log"
Apr 17 09:20:36.273048 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:36.273022 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8xtb4_4f93c6af-eabd-453e-9966-3199a8d4a534/dns-node-resolver/0.log"
Apr 17 09:20:36.686107 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:36.686036 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-54d7d6bc55-jchg9_86236ac0-ed67-423e-be76-0a7f6bcba48b/registry/0.log"
Apr 17 09:20:36.757501 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:36.757476 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xx56q_8fda692b-3c65-4165-b88c-ab992a58a369/node-ca/0.log"
Apr 17 09:20:37.459058 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:37.459031 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7f87cdbc66-wpc5d_1530bfd9-b6fd-487c-a92b-e509c50b4f9c/router/0.log"
Apr 17 09:20:37.785030 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:37.784957 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4m6ml_f7e6e539-32e6-4df0-9447-66f765e64434/serve-healthcheck-canary/0.log"
Apr 17 09:20:38.205315 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:38.205293 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4m2cs_69e56ba4-66ad-49fe-9d81-f96eb043eac7/kube-rbac-proxy/0.log"
Apr 17 09:20:38.227252 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:38.227226 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4m2cs_69e56ba4-66ad-49fe-9d81-f96eb043eac7/exporter/0.log"
Apr 17 09:20:38.249124 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:38.249103 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4m2cs_69e56ba4-66ad-49fe-9d81-f96eb043eac7/extractor/0.log"
Apr 17 09:20:39.992279 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:39.992253 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-6854ddcbb-5p2bs_01f7ae8d-479d-40dd-8515-b3eb5de23c52/manager/0.log"
Apr 17 09:20:41.909704 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:41.909677 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-v8842/perf-node-gather-daemonset-7phvv"
Apr 17 09:20:43.676622 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:43.676589 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-bjjz6_fb2625d8-aa03-43bc-a98b-f09bcdd8bf65/kube-storage-version-migrator-operator/1.log"
Apr 17 09:20:43.677499 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:43.677478 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-bjjz6_fb2625d8-aa03-43bc-a98b-f09bcdd8bf65/kube-storage-version-migrator-operator/0.log"
Apr 17 09:20:44.724447 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:44.724416 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6nv4m_528b8329-d2cd-4d99-8a3e-62f7af17b361/kube-multus/0.log"
Apr 17 09:20:45.110285 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:45.110209 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xswsv_a0d17817-9cb4-4adc-9cb1-ace0055c7639/kube-multus-additional-cni-plugins/0.log"
Apr 17 09:20:45.135131 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:45.135111 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xswsv_a0d17817-9cb4-4adc-9cb1-ace0055c7639/egress-router-binary-copy/0.log"
Apr 17 09:20:45.158469 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:45.158448 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xswsv_a0d17817-9cb4-4adc-9cb1-ace0055c7639/cni-plugins/0.log"
Apr 17 09:20:45.182115 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:45.182096 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xswsv_a0d17817-9cb4-4adc-9cb1-ace0055c7639/bond-cni-plugin/0.log"
Apr 17 09:20:45.205079 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:45.205063 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xswsv_a0d17817-9cb4-4adc-9cb1-ace0055c7639/routeoverride-cni/0.log"
Apr 17 09:20:45.229266 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:45.229251 2567 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xswsv_a0d17817-9cb4-4adc-9cb1-ace0055c7639/whereabouts-cni-bincopy/0.log" Apr 17 09:20:45.251319 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:45.251294 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xswsv_a0d17817-9cb4-4adc-9cb1-ace0055c7639/whereabouts-cni/0.log" Apr 17 09:20:45.399968 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:45.399899 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qpnfd_5029b845-d556-4306-b1bb-4c6373b7e4be/network-metrics-daemon/0.log" Apr 17 09:20:45.421733 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:45.421712 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qpnfd_5029b845-d556-4306-b1bb-4c6373b7e4be/kube-rbac-proxy/0.log" Apr 17 09:20:46.226964 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:46.226936 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmpm_5f467c32-67b8-4e0a-b835-8b0933d2cc02/ovn-controller/0.log" Apr 17 09:20:46.250971 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:46.250898 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmpm_5f467c32-67b8-4e0a-b835-8b0933d2cc02/ovn-acl-logging/0.log" Apr 17 09:20:46.278302 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:46.278285 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmpm_5f467c32-67b8-4e0a-b835-8b0933d2cc02/kube-rbac-proxy-node/0.log" Apr 17 09:20:46.305466 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:46.305439 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmpm_5f467c32-67b8-4e0a-b835-8b0933d2cc02/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 09:20:46.324903 ip-10-0-128-212 kubenswrapper[2567]: I0417 
09:20:46.324887 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmpm_5f467c32-67b8-4e0a-b835-8b0933d2cc02/northd/0.log" Apr 17 09:20:46.347484 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:46.347467 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmpm_5f467c32-67b8-4e0a-b835-8b0933d2cc02/nbdb/0.log" Apr 17 09:20:46.369581 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:46.369563 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmpm_5f467c32-67b8-4e0a-b835-8b0933d2cc02/sbdb/0.log" Apr 17 09:20:46.456585 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:46.456558 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmpm_5f467c32-67b8-4e0a-b835-8b0933d2cc02/ovnkube-controller/0.log" Apr 17 09:20:48.044524 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:48.044495 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-jxg5q_014b9751-7912-4f44-a712-4b927def575d/check-endpoints/0.log" Apr 17 09:20:48.097784 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:48.097757 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qqb8r_0b55fe37-b595-4cdf-a226-39f50d91d206/network-check-target-container/0.log" Apr 17 09:20:49.006251 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:49.006223 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2jtdp_39de49a9-8aec-4d6b-a2e3-71a3ddfa6b0a/iptables-alerter/0.log" Apr 17 09:20:49.708090 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:49.708062 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-x8vtz_ff136071-cfdc-4409-a0d9-ed959a609894/tuned/0.log" Apr 17 09:20:51.369031 ip-10-0-128-212 
kubenswrapper[2567]: I0417 09:20:51.369002 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-4wscl_70002831-91b1-409a-932e-a0ca4d141e25/cluster-samples-operator/0.log" Apr 17 09:20:51.385925 ip-10-0-128-212 kubenswrapper[2567]: I0417 09:20:51.385901 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-4wscl_70002831-91b1-409a-932e-a0ca4d141e25/cluster-samples-operator-watch/0.log"