Apr 17 07:59:59.742063 ip-10-0-138-54 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 07:59:59.742076 ip-10-0-138-54 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 07:59:59.742085 ip-10-0-138-54 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 07:59:59.742339 ip-10-0-138-54 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 08:00:09.978886 ip-10-0-138-54 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 08:00:09.978908 ip-10-0-138-54 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 269d484b106442fea44341f9596fdb92 --
Apr 17 08:02:41.794338 ip-10-0-138-54 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 08:02:42.259090 ip-10-0-138-54 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 08:02:42.259090 ip-10-0-138-54 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 08:02:42.259090 ip-10-0-138-54 kubenswrapper[2583]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 08:02:42.259090 ip-10-0-138-54 kubenswrapper[2583]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 08:02:42.259090 ip-10-0-138-54 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 08:02:42.261576 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.261484 2583 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 08:02:42.266123 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266097 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 08:02:42.266123 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266119 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 08:02:42.266123 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266124 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 08:02:42.266123 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266129 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 08:02:42.266123 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266133 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266136 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266140 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266142 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266145 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266148 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266151 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266154 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266157 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266159 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266162 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266165 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266168 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266171 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266176 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266180 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266183 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266186 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266189 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 08:02:42.266339 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266192 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266195 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266206 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266209 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266212 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266215 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266217 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266220 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266222 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266225 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266228 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266230 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266233 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266235 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266238 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266241 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266244 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266247 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266250 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266253 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 08:02:42.266798 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266256 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266260 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266263 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266267 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266270 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266273 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266276 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266279 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266282 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266284 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266287 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266289 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266292 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266295 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266297 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266300 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266302 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266305 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266308 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 08:02:42.267300 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266311 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266314 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266317 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266319 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266322 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266324 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266327 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266330 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266334 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266337 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266340 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266343 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266346 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266348 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266351 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266354 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266358 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266360 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266363 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266366 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 08:02:42.267795 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266368 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266371 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266373 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266376 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266775 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266779 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266782 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266785 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266788 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266791 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266794 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266796 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266799 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266802 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266805 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266808 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266811 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266814 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266817 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266820 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 08:02:42.268312 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266823 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266826 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266828 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266831 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266834 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266837 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266841 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266843 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266846 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266849 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266852 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266854 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266857 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266859 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266862 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266864 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266868 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266872 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266875 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 08:02:42.268791 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266878 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266881 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266883 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266886 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266888 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266891 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266894 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266896 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266899 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266901 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266904 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266907 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266909 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266912 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266915 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266917 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266920 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266922 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266925 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266928 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 08:02:42.269287 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266930 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266933 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266936 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266938 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266941 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266943 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266946 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266948 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266951 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266953 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266956 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266958 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266961 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266964 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266966 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266969 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266972 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266976 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266978 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266981 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 08:02:42.269776 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266984 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266987 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266989 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266992 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266994 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.266998 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.267000 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.267003 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.267005 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.267008 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.267011 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268884 2583 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268895 2583 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268903 2583 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268908 2583 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268913 2583 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268917 2583 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268921 2583 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268926 2583 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268937 2583 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268941 2583 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 08:02:42.270272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268945 2583 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268948 2583 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268951 2583 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268954 2583 flags.go:64] FLAG: --cgroup-root=""
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268957 2583 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268960 2583 flags.go:64] FLAG: --client-ca-file=""
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268964 2583 flags.go:64] FLAG: --cloud-config=""
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268967 2583 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268970 2583 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268974 2583 flags.go:64] FLAG: --cluster-domain=""
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268977 2583 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268980 2583 flags.go:64] FLAG: --config-dir=""
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268983 2583 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268986 2583 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268990 2583 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268994 2583 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.268997 2583 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269000 2583 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269004 2583 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269007 2583 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269010 2583 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269013 2583 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269016 2583 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269020 2583 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269023 2583 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 08:02:42.270790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269026 2583 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269029 2583 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269033 2583 flags.go:64] FLAG: --enable-server="true"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269047 2583 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269053 2583 flags.go:64] FLAG: --event-burst="100"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269056 2583 flags.go:64] FLAG: --event-qps="50"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269059 2583 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269062 2583 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269065 2583 flags.go:64] FLAG: --eviction-hard=""
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269069 2583 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269072 2583 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269076 2583 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269079 2583 flags.go:64] FLAG: --eviction-soft=""
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269082 2583 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269085 2583 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269088 2583 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269091 2583 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269094 2583 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269097 2583 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269100 2583 flags.go:64] FLAG: --feature-gates=""
Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417
08:02:42.269104 2583 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269107 2583 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269112 2583 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269115 2583 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269118 2583 flags.go:64] FLAG: --healthz-port="10248" Apr 17 08:02:42.271431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269121 2583 flags.go:64] FLAG: --help="false" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269124 2583 flags.go:64] FLAG: --hostname-override="ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269127 2583 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269130 2583 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269133 2583 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269137 2583 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269141 2583 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269144 2583 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269147 2583 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 08:02:42.272030 
ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269150 2583 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269153 2583 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269156 2583 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269160 2583 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269163 2583 flags.go:64] FLAG: --kube-reserved="" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269166 2583 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269169 2583 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269172 2583 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269175 2583 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269178 2583 flags.go:64] FLAG: --lock-file="" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269181 2583 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269184 2583 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269187 2583 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269192 2583 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 08:02:42.272030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269195 2583 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269198 2583 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269201 2583 flags.go:64] FLAG: --logging-format="text" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269204 2583 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269207 2583 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269211 2583 flags.go:64] FLAG: --manifest-url="" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269214 2583 flags.go:64] FLAG: --manifest-url-header="" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269219 2583 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269222 2583 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269226 2583 flags.go:64] FLAG: --max-pods="110" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269229 2583 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269232 2583 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269235 2583 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269238 2583 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269241 2583 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 08:02:42.272640 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:02:42.269244 2583 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269247 2583 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269255 2583 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269258 2583 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269261 2583 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269270 2583 flags.go:64] FLAG: --pod-cidr="" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269274 2583 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269280 2583 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269283 2583 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 08:02:42.272640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269286 2583 flags.go:64] FLAG: --pods-per-core="0" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269289 2583 flags.go:64] FLAG: --port="10250" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269292 2583 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269295 2583 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-028201275cf08aef8" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269299 2583 flags.go:64] FLAG: --qos-reserved="" Apr 17 08:02:42.273250 
ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269302 2583 flags.go:64] FLAG: --read-only-port="10255" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269305 2583 flags.go:64] FLAG: --register-node="true" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269308 2583 flags.go:64] FLAG: --register-schedulable="true" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269311 2583 flags.go:64] FLAG: --register-with-taints="" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269315 2583 flags.go:64] FLAG: --registry-burst="10" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269318 2583 flags.go:64] FLAG: --registry-qps="5" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269320 2583 flags.go:64] FLAG: --reserved-cpus="" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269323 2583 flags.go:64] FLAG: --reserved-memory="" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269327 2583 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269331 2583 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269334 2583 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269337 2583 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269340 2583 flags.go:64] FLAG: --runonce="false" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269343 2583 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269346 2583 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 08:02:42.273250 
ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269349 2583 flags.go:64] FLAG: --seccomp-default="false" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269352 2583 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269355 2583 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269358 2583 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269361 2583 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269364 2583 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 08:02:42.273250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269367 2583 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269370 2583 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269373 2583 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269377 2583 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269380 2583 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269383 2583 flags.go:64] FLAG: --system-cgroups="" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269385 2583 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269391 2583 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269399 
2583 flags.go:64] FLAG: --tls-cert-file="" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269401 2583 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269406 2583 flags.go:64] FLAG: --tls-min-version="" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269409 2583 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269412 2583 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269415 2583 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269418 2583 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269421 2583 flags.go:64] FLAG: --v="2" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269425 2583 flags.go:64] FLAG: --version="false" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269430 2583 flags.go:64] FLAG: --vmodule="" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269435 2583 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.269438 2583 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269541 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269545 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269548 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 
08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269551 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 08:02:42.273859 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269554 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269558 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269560 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269564 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269566 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269569 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269572 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269575 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269578 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269580 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269583 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269586 2583 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAzure Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269589 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269591 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269594 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269597 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269601 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269604 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269606 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 08:02:42.274436 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269609 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269611 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269614 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269617 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269619 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 08:02:42.274924 
ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269622 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269625 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269627 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269630 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269633 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269636 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269639 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269641 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269644 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269646 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269649 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269652 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269654 2583 feature_gate.go:328] unrecognized feature gate: 
ImageModeStatusReporting Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269657 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269659 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 08:02:42.274924 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269662 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269665 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269668 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269670 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269673 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269675 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269679 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269681 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269684 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269688 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269691 2583 feature_gate.go:328] 
unrecognized feature gate: MixedCPUsAllocation Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269694 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269697 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269700 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269703 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269705 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269708 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269712 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269716 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269719 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 08:02:42.275429 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269722 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269724 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269727 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269731 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269735 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269738 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269741 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269743 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269746 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269749 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: 
W0417 08:02:42.269751 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269754 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269757 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269759 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269762 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269764 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269767 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269769 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269772 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269775 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 08:02:42.276060 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269778 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 08:02:42.276566 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269781 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 08:02:42.276566 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.269785 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 
08:02:42.276566 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.270815 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 08:02:42.277388 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.277366 2583 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 08:02:42.277430 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.277389 2583 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 08:02:42.277457 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277441 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 08:02:42.277457 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277446 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 08:02:42.277457 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277450 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 08:02:42.277457 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277453 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 08:02:42.277457 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277457 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277460 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277464 2583 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277467 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277469 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277472 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277475 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277478 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277481 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277484 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277486 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277489 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277492 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277495 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277497 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 
08:02:42.277500 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277503 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277506 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277509 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277511 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 08:02:42.277584 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277514 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277517 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277520 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277523 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277526 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277528 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277531 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277534 2583 feature_gate.go:328] unrecognized feature gate: 
SetEIPForNLBIngressController Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277536 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277539 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277541 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277544 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277546 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277549 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277553 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277556 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277558 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277561 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277564 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277567 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 08:02:42.278092 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277570 2583 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277572 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277575 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277578 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277581 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277584 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277586 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277589 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277592 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277595 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277598 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277601 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277604 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 
08:02:42.277606 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277611 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277615 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277618 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277620 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277623 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277626 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 08:02:42.278583 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277628 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277632 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277635 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277638 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277641 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277644 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277647 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277650 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277653 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277656 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277658 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277661 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277664 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277666 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277669 2583 feature_gate.go:328] unrecognized 
feature gate: MachineConfigNodes Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277672 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277675 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277677 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277680 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 08:02:42.279091 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277682 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277685 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277687 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.277693 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277788 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277793 2583 feature_gate.go:328] 
unrecognized feature gate: VSphereMixedNodeEnv Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277796 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277799 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277803 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277806 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277809 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277811 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277814 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277817 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277820 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277822 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 08:02:42.279555 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277825 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277827 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 
08:02:42.277830 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277834 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277837 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277840 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277843 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277845 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277848 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277851 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277853 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277856 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277858 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277861 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277864 2583 feature_gate.go:328] unrecognized feature 
gate: IrreconcilableMachineConfig Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277866 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277868 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277871 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277873 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 08:02:42.279961 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277876 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277879 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277881 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277884 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277886 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277889 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277892 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277895 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277897 2583 
feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277900 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277902 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277905 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277908 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277910 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277913 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277915 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277919 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277922 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277924 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277927 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 08:02:42.280483 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277929 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 
08:02:42.277932 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277934 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277937 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277940 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277942 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277945 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277947 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277950 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277953 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277955 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277958 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277960 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277963 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 08:02:42.280984 
ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277965 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277968 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277970 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277973 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277975 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277979 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 08:02:42.280984 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277981 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277984 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277986 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277989 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277992 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277994 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.277998 2583 feature_gate.go:351] 
Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.278002 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.278005 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.278009 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.278012 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.278015 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.278018 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.278021 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:42.278024 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 08:02:42.281486 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.278029 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 
08:02:42.281854 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.278953 2583 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 08:02:42.282620 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.282606 2583 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 08:02:42.283606 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.283585 2583 server.go:1019] "Starting client certificate rotation" Apr 17 08:02:42.283706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.283688 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 08:02:42.283739 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.283730 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 08:02:42.310301 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.310280 2583 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 08:02:42.312873 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.312853 2583 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 08:02:42.328670 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.328647 2583 log.go:25] "Validated CRI v1 runtime API" Apr 17 08:02:42.334835 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.334819 2583 log.go:25] "Validated CRI v1 image API" Apr 17 08:02:42.336023 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.336004 2583 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 08:02:42.338332 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.338310 2583 fs.go:135] Filesystem UUIDs: map[528db7e4-8607-4d52-a5ed-f7ea15d15646:/dev/nvme0n1p4 70e52bad-2987-40c2-8f6a-c4638821e579:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 17 
08:02:42.338396 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.338333 2583 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 08:02:42.339906 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.339889 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 08:02:42.344826 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.344709 2583 manager.go:217] Machine: {Timestamp:2026-04-17 08:02:42.342520004 +0000 UTC m=+0.429302684 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099116 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24b4965a55fe9264d5fcd99311a6e3 SystemUUID:ec24b496-5a55-fe92-64d5-fcd99311a6e3 BootID:269d484b-1064-42fe-a443-41f9596fdb92 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} 
{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a2:81:3a:60:d9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a2:81:3a:60:d9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e2:cb:dc:b6:69:be Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 08:02:42.344826 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.344821 2583 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 08:02:42.344966 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.344954 2583 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 08:02:42.346056 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.346019 2583 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 08:02:42.346199 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.346059 2583 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-54.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 08:02:42.346242 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.346205 2583 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 08:02:42.346242 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.346214 2583 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 08:02:42.346242 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.346226 2583 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 08:02:42.347740 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.347729 2583 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 08:02:42.349190 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.349179 2583 state_mem.go:36] "Initialized new in-memory state store" Apr 17 08:02:42.349305 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.349296 2583 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 08:02:42.351962 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.351951 2583 kubelet.go:491] "Attempting to sync node with API server" Apr 17 08:02:42.352010 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.351965 2583 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 08:02:42.352010 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.351977 2583 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 08:02:42.352010 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.351986 2583 kubelet.go:397] "Adding apiserver pod source" Apr 17 08:02:42.352010 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.351995 2583 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 17 08:02:42.353100 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.353088 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 08:02:42.353141 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.353105 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 08:02:42.356225 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.356203 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cvngb" Apr 17 08:02:42.357084 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.357068 2583 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 08:02:42.361414 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.361381 2583 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 08:02:42.363065 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363031 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cvngb" Apr 17 08:02:42.363626 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363612 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 08:02:42.363707 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363633 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 08:02:42.363707 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363648 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 08:02:42.363707 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363654 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 08:02:42.363707 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:02:42.363661 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 08:02:42.363707 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363672 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 08:02:42.363707 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363679 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 08:02:42.363707 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363688 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 08:02:42.363707 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363699 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 08:02:42.363707 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363710 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 08:02:42.363995 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363737 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 08:02:42.363995 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363751 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 08:02:42.363995 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363787 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 08:02:42.363995 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.363797 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 08:02:42.363995 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.363941 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 08:02:42.364159 ip-10-0-138-54 kubenswrapper[2583]: E0417 
08:02:42.364021 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-54.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 08:02:42.367676 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.367661 2583 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 08:02:42.367739 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.367701 2583 server.go:1295] "Started kubelet" Apr 17 08:02:42.367838 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.367789 2583 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 08:02:42.367908 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.367851 2583 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 08:02:42.368023 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.367928 2583 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 08:02:42.368733 ip-10-0-138-54 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 08:02:42.368879 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.368807 2583 server.go:317] "Adding debug handlers to kubelet server" Apr 17 08:02:42.369410 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.369250 2583 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 08:02:42.373256 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.373234 2583 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-54.ec2.internal" not found Apr 17 08:02:42.374274 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.374257 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 08:02:42.374774 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.374754 2583 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 08:02:42.375474 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.375459 2583 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 08:02:42.375584 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.375567 2583 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 08:02:42.375674 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.375641 2583 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 08:02:42.375674 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.375638 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-54.ec2.internal\" not found" Apr 17 08:02:42.375764 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.375698 2583 factory.go:153] Registering CRI-O factory Apr 17 08:02:42.375764 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.375720 2583 reconstruct.go:97] "Volume reconstruction finished" Apr 17 08:02:42.375764 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.375730 2583 reconciler.go:26] "Reconciler: start to sync state" Apr 17 08:02:42.375764 ip-10-0-138-54 kubenswrapper[2583]: 
I0417 08:02:42.375760 2583 factory.go:223] Registration of the crio container factory successfully Apr 17 08:02:42.375962 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.375805 2583 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 08:02:42.375962 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.375812 2583 factory.go:55] Registering systemd factory Apr 17 08:02:42.375962 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.375818 2583 factory.go:223] Registration of the systemd container factory successfully Apr 17 08:02:42.375962 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.375834 2583 factory.go:103] Registering Raw factory Apr 17 08:02:42.375962 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.375846 2583 manager.go:1196] Started watching for new ooms in manager Apr 17 08:02:42.376220 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.376181 2583 manager.go:319] Starting recovery of all containers Apr 17 08:02:42.376683 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.376665 2583 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 08:02:42.377519 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.377503 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:42.380482 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.380459 2583 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-54.ec2.internal\" not found" node="ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.383557 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.383515 2583 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 17 08:02:42.385186 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.385171 2583 manager.go:324] Recovery completed Apr 17 08:02:42.388351 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.388331 2583 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-54.ec2.internal" not found Apr 17 08:02:42.391475 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.391461 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 08:02:42.394435 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.394420 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientMemory" Apr 17 08:02:42.394502 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.394448 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 08:02:42.394502 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.394459 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientPID" Apr 17 08:02:42.394951 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.394935 2583 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 08:02:42.394951 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.394948 2583 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 08:02:42.395073 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.394965 2583 state_mem.go:36] "Initialized new in-memory state store" Apr 17 08:02:42.397343 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.397331 2583 policy_none.go:49] "None policy: Start" Apr 17 08:02:42.397390 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.397349 2583 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 08:02:42.397390 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.397358 2583 state_mem.go:35] "Initializing new in-memory state store" Apr 17 
08:02:42.431636 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.431617 2583 manager.go:341] "Starting Device Plugin manager" Apr 17 08:02:42.446335 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.431653 2583 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 08:02:42.446335 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.431666 2583 server.go:85] "Starting device plugin registration server" Apr 17 08:02:42.446335 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.431953 2583 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 08:02:42.446335 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.431975 2583 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 08:02:42.446335 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.432147 2583 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 08:02:42.446335 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.432248 2583 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 08:02:42.446335 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.432264 2583 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 08:02:42.446335 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.432797 2583 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 08:02:42.446335 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.432832 2583 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-54.ec2.internal\" not found" Apr 17 08:02:42.446335 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.444058 2583 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-54.ec2.internal" not found Apr 17 08:02:42.509486 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.509404 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 08:02:42.509486 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.509439 2583 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 08:02:42.509486 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.509458 2583 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 08:02:42.509486 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.509464 2583 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 08:02:42.509707 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.509531 2583 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 08:02:42.513333 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.513309 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:42.532288 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.532268 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 08:02:42.533299 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.533281 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientMemory" Apr 17 08:02:42.533355 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.533317 
2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 08:02:42.533355 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.533329 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientPID" Apr 17 08:02:42.533355 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.533354 2583 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.541921 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.541904 2583 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.541969 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.541925 2583 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-54.ec2.internal\": node \"ip-10-0-138-54.ec2.internal\" not found" Apr 17 08:02:42.558815 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.558791 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-54.ec2.internal\" not found" Apr 17 08:02:42.610507 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.610462 2583 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-54.ec2.internal"] Apr 17 08:02:42.610588 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.610560 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 08:02:42.611509 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.611494 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientMemory" Apr 17 08:02:42.611618 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.611523 2583 kubelet_node_status.go:736] "Recording event message 
for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 08:02:42.611618 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.611535 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientPID" Apr 17 08:02:42.612772 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.612761 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 08:02:42.612907 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.612894 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.612948 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.612924 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 08:02:42.613455 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.613434 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientMemory" Apr 17 08:02:42.613567 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.613463 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 08:02:42.613567 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.613477 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientPID" Apr 17 08:02:42.613567 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.613520 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientMemory" Apr 17 08:02:42.613567 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.613545 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 08:02:42.613567 
ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.613555 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientPID" Apr 17 08:02:42.614577 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.614564 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.614630 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.614588 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 08:02:42.615223 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.615211 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientMemory" Apr 17 08:02:42.615279 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.615235 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 08:02:42.615279 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.615249 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeHasSufficientPID" Apr 17 08:02:42.640731 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.640709 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-54.ec2.internal\" not found" node="ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.645310 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.645290 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-54.ec2.internal\" not found" node="ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.659298 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.659262 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-54.ec2.internal\" not found" Apr 
17 08:02:42.678432 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.678400 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/51f9f233d352b5a7748f43e51cb52994-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal\" (UID: \"51f9f233d352b5a7748f43e51cb52994\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.678569 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.678461 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51f9f233d352b5a7748f43e51cb52994-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal\" (UID: \"51f9f233d352b5a7748f43e51cb52994\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.678569 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.678491 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/db15b079396bd5a2463864d9d840da9c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-54.ec2.internal\" (UID: \"db15b079396bd5a2463864d9d840da9c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.759707 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.759619 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-54.ec2.internal\" not found" Apr 17 08:02:42.779021 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.778992 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51f9f233d352b5a7748f43e51cb52994-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal\" (UID: \"51f9f233d352b5a7748f43e51cb52994\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.779139 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.779033 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/db15b079396bd5a2463864d9d840da9c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-54.ec2.internal\" (UID: \"db15b079396bd5a2463864d9d840da9c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.779139 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.779106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/51f9f233d352b5a7748f43e51cb52994-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal\" (UID: \"51f9f233d352b5a7748f43e51cb52994\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.779139 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.779111 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51f9f233d352b5a7748f43e51cb52994-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal\" (UID: \"51f9f233d352b5a7748f43e51cb52994\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.779256 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.779148 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/51f9f233d352b5a7748f43e51cb52994-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal\" (UID: \"51f9f233d352b5a7748f43e51cb52994\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.779256 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.779119 2583 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/db15b079396bd5a2463864d9d840da9c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-54.ec2.internal\" (UID: \"db15b079396bd5a2463864d9d840da9c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.860452 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.860419 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-54.ec2.internal\" not found" Apr 17 08:02:42.942989 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.942956 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.947655 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:42.947638 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-54.ec2.internal" Apr 17 08:02:42.961488 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:42.961341 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-54.ec2.internal\" not found" Apr 17 08:02:43.061976 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:43.061886 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-54.ec2.internal\" not found" Apr 17 08:02:43.162382 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:43.162355 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-54.ec2.internal\" not found" Apr 17 08:02:43.262935 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:43.262904 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-54.ec2.internal\" not found" Apr 17 08:02:43.283398 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.283370 2583 transport.go:147] "Certificate rotation detected, shutting down client connections to start using 
new credentials" Apr 17 08:02:43.283540 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.283514 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 08:02:43.283600 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.283556 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 08:02:43.364066 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:43.363980 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-54.ec2.internal\" not found" Apr 17 08:02:43.365090 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.365034 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:57:42 +0000 UTC" deadline="2027-11-24 04:34:44.295713768 +0000 UTC" Apr 17 08:02:43.365090 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.365085 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14060h32m0.930633318s" Apr 17 08:02:43.375211 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.375185 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 08:02:43.387102 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.387070 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:43.393209 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.393185 2583 reflector.go:430] "Caches populated" 
logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 08:02:43.415528 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.415499 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-cfm26" Apr 17 08:02:43.422733 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.422710 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-cfm26" Apr 17 08:02:43.474268 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:43.474231 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb15b079396bd5a2463864d9d840da9c.slice/crio-379d47295c3d7a4c1d2f023c8ea0216a1fa8b627393c54d97c550483f995c160 WatchSource:0}: Error finding container 379d47295c3d7a4c1d2f023c8ea0216a1fa8b627393c54d97c550483f995c160: Status 404 returned error can't find the container with id 379d47295c3d7a4c1d2f023c8ea0216a1fa8b627393c54d97c550483f995c160 Apr 17 08:02:43.474723 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:43.474704 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f9f233d352b5a7748f43e51cb52994.slice/crio-2e17f1bafa98510f0a8a6455d559c9f96226c3e80e9f572c35f99cbe8d3309f4 WatchSource:0}: Error finding container 2e17f1bafa98510f0a8a6455d559c9f96226c3e80e9f572c35f99cbe8d3309f4: Status 404 returned error can't find the container with id 2e17f1bafa98510f0a8a6455d559c9f96226c3e80e9f572c35f99cbe8d3309f4 Apr 17 08:02:43.475799 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.475783 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" Apr 17 08:02:43.479237 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.479220 2583 provider.go:93] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:02:43.487262 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.487244 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 08:02:43.489110 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.489094 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-54.ec2.internal" Apr 17 08:02:43.496382 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.496363 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 08:02:43.512345 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.512301 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" event={"ID":"51f9f233d352b5a7748f43e51cb52994","Type":"ContainerStarted","Data":"2e17f1bafa98510f0a8a6455d559c9f96226c3e80e9f572c35f99cbe8d3309f4"} Apr 17 08:02:43.513252 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.513232 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-54.ec2.internal" event={"ID":"db15b079396bd5a2463864d9d840da9c","Type":"ContainerStarted","Data":"379d47295c3d7a4c1d2f023c8ea0216a1fa8b627393c54d97c550483f995c160"} Apr 17 08:02:43.722056 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:43.722021 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:44.353478 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.353436 2583 apiserver.go:52] "Watching apiserver" Apr 17 08:02:44.358760 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.358722 2583 reflector.go:430] "Caches populated" type="*v1.Pod" 
reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 08:02:44.359175 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.359138 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-54.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6","openshift-cluster-node-tuning-operator/tuned-nzxn9","openshift-dns/node-resolver-q5zfz","openshift-image-registry/node-ca-pk77j","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal","openshift-multus/multus-gn8ng","openshift-multus/network-metrics-daemon-5zj7l","kube-system/konnectivity-agent-sptg6","openshift-multus/multus-additional-cni-plugins-s742h","openshift-network-diagnostics/network-check-target-52zv7","openshift-network-operator/iptables-alerter-lr452","openshift-ovn-kubernetes/ovnkube-node-qrq7z"] Apr 17 08:02:44.361393 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.361313 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.362616 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.362595 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.363729 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.363584 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 08:02:44.363729 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.363585 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 08:02:44.363903 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.363776 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 08:02:44.363974 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.363955 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 08:02:44.364027 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.364015 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-lv2z6\"" Apr 17 08:02:44.364403 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.364379 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 08:02:44.364505 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.364419 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 08:02:44.364618 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.364564 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p566c\"" Apr 17 08:02:44.364618 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.364592 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 08:02:44.365052 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.365016 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q5zfz" Apr 17 08:02:44.366441 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.366421 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 08:02:44.366549 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.366489 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.366685 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.366659 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-lx6b2\"" Apr 17 08:02:44.366925 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.366909 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 08:02:44.367701 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.367681 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pk77j" Apr 17 08:02:44.368029 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.368013 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 08:02:44.368122 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.368071 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 08:02:44.368356 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.368339 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bqp68\"" Apr 17 08:02:44.368904 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.368885 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:44.369008 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:44.368979 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:02:44.369285 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.369267 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 08:02:44.369693 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.369668 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 08:02:44.369770 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.369751 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 08:02:44.369809 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.369752 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xm7h6\"" Apr 17 08:02:44.370138 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.370083 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-sptg6" Apr 17 08:02:44.370251 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.370233 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.371485 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.371462 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:44.371555 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:44.371516 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:02:44.372534 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.371818 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ljsjb\"" Apr 17 08:02:44.372534 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.371877 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-d6vrr\"" Apr 17 08:02:44.372534 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.371936 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 08:02:44.372534 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.372067 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 08:02:44.372534 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.372070 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 08:02:44.372948 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.372847 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 08:02:44.374188 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.373099 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lr452" Apr 17 08:02:44.375418 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.375397 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 08:02:44.375518 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.375448 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.376763 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.376739 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6lrwt\"" Apr 17 08:02:44.376850 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.376765 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 08:02:44.376850 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.376798 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 08:02:44.377573 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.377301 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 08:02:44.377573 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.377373 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 08:02:44.377573 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.377474 2583 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 08:02:44.377573 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.377498 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 08:02:44.377812 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.377586 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 08:02:44.377812 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.377755 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 
08:02:44.378290 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.378272 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2wrzj\"" Apr 17 08:02:44.378379 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.378313 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 08:02:44.387350 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387325 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2rcb\" (UniqueName: \"kubernetes.io/projected/3a741811-1651-45b8-97c3-04e9be19b874-kube-api-access-s2rcb\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.387449 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387365 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt4kk\" (UniqueName: \"kubernetes.io/projected/5f3f7f47-4941-40cd-88d3-259605376e0e-kube-api-access-rt4kk\") pod \"node-ca-pk77j\" (UID: \"5f3f7f47-4941-40cd-88d3-259605376e0e\") " pod="openshift-image-registry/node-ca-pk77j" Apr 17 08:02:44.387449 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387394 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71e69b48-555a-432e-b12a-16dae2f11d3c-konnectivity-ca\") pod \"konnectivity-agent-sptg6\" (UID: \"71e69b48-555a-432e-b12a-16dae2f11d3c\") " pod="kube-system/konnectivity-agent-sptg6" Apr 17 08:02:44.387449 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387420 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:44.387449 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387443 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-var-lib-cni-multus\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.387657 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387488 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-cni-dir\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.387657 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387521 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-tuned\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.387657 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387548 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-run-netns\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.387657 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387571 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-run-multus-certs\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.387657 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387596 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsd7h\" (UniqueName: \"kubernetes.io/projected/d961e954-15d3-43c1-800e-ea0f8d3c806d-kube-api-access-vsd7h\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.387657 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387619 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f3f7f47-4941-40cd-88d3-259605376e0e-serviceca\") pod \"node-ca-pk77j\" (UID: \"5f3f7f47-4941-40cd-88d3-259605376e0e\") " pod="openshift-image-registry/node-ca-pk77j" Apr 17 08:02:44.387657 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387643 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/89794c28-aca4-44e3-899f-a486defd216d-iptables-alerter-script\") pod \"iptables-alerter-lr452\" (UID: \"89794c28-aca4-44e3-899f-a486defd216d\") " pod="openshift-network-operator/iptables-alerter-lr452" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387692 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-kubelet\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 
08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387727 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a9582e01-007d-4cf4-9287-1788560d38e1-env-overrides\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387753 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-systemd\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387778 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bt5l\" (UniqueName: \"kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l\") pod \"network-check-target-52zv7\" (UID: \"d7bf9c6a-1777-45f6-9f27-2ee3d09959d1\") " pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387804 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-os-release\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387826 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-var-lib-kubelet\") pod 
\"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387849 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f3f7f47-4941-40cd-88d3-259605376e0e-host\") pod \"node-ca-pk77j\" (UID: \"5f3f7f47-4941-40cd-88d3-259605376e0e\") " pod="openshift-image-registry/node-ca-pk77j" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387872 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-cni-netd\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387899 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-kubelet-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387923 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-socket-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387948 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-modprobe-d\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.387972 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-conf-dir\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388011 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-etc-kubernetes\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388058 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71e69b48-555a-432e-b12a-16dae2f11d3c-agent-certs\") pod \"konnectivity-agent-sptg6\" (UID: \"71e69b48-555a-432e-b12a-16dae2f11d3c\") " pod="kube-system/konnectivity-agent-sptg6" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388087 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-sysconfig\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.388113 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388109 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-kubernetes\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388130 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a9582e01-007d-4cf4-9287-1788560d38e1-ovnkube-config\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388155 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-sys-fs\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388181 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-slash\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388204 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-run-openvswitch\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" 
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388228 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-cnibin\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h"
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388253 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a741811-1651-45b8-97c3-04e9be19b874-cni-binary-copy\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h"
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388275 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h"
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388297 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-run-k8s-cni-cncf-io\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388322 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-cni-bin\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388346 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388392 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-sysctl-d\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388419 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-run\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388455 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-sys\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388499 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-lib-modules\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388531 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3a741811-1651-45b8-97c3-04e9be19b874-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h"
Apr 17 08:02:44.388706 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388557 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-run-systemd\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388577 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-run-ovn-kubernetes\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388597 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lhz5\" (UniqueName: \"kubernetes.io/projected/ad1b5fde-cf28-4a54-bb33-cb43f425421e-kube-api-access-6lhz5\") pod \"node-resolver-q5zfz\" (UID: \"ad1b5fde-cf28-4a54-bb33-cb43f425421e\") " pod="openshift-dns/node-resolver-q5zfz"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388620 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-var-lib-cni-bin\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388645 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-run-ovn\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388677 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-var-lib-kubelet\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388702 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-os-release\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388726 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-socket-dir-parent\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388753 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-sysctl-conf\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388797 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3a741811-1651-45b8-97c3-04e9be19b874-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388841 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-daemon-config\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388874 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhmnl\" (UniqueName: \"kubernetes.io/projected/e0b1c45e-6f25-419a-b20e-afc106313813-kube-api-access-fhmnl\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388900 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-run-netns\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388949 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-registration-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.388983 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-hostroot\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389005 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-log-socket\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.389529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389030 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfxpp\" (UniqueName: \"kubernetes.io/projected/a9582e01-007d-4cf4-9287-1788560d38e1-kube-api-access-mfxpp\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389067 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8a42681-cd32-4735-af7b-2c1ca0b2df67-tmp\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389120 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ad1b5fde-cf28-4a54-bb33-cb43f425421e-hosts-file\") pod \"node-resolver-q5zfz\" (UID: \"ad1b5fde-cf28-4a54-bb33-cb43f425421e\") " pod="openshift-dns/node-resolver-q5zfz"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389148 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-device-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389167 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-etc-selinux\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389184 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-system-cni-dir\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389198 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-cnibin\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389230 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d961e954-15d3-43c1-800e-ea0f8d3c806d-cni-binary-copy\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389249 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-etc-openvswitch\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389274 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89794c28-aca4-44e3-899f-a486defd216d-host-slash\") pod \"iptables-alerter-lr452\" (UID: \"89794c28-aca4-44e3-899f-a486defd216d\") " pod="openshift-network-operator/iptables-alerter-lr452"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389300 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtp97\" (UniqueName: \"kubernetes.io/projected/89794c28-aca4-44e3-899f-a486defd216d-kube-api-access-xtp97\") pod \"iptables-alerter-lr452\" (UID: \"89794c28-aca4-44e3-899f-a486defd216d\") " pod="openshift-network-operator/iptables-alerter-lr452"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389322 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-systemd-units\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389364 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-var-lib-openvswitch\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389392 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bwg6\" (UniqueName: \"kubernetes.io/projected/e8a42681-cd32-4735-af7b-2c1ca0b2df67-kube-api-access-4bwg6\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389416 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ad1b5fde-cf28-4a54-bb33-cb43f425421e-tmp-dir\") pod \"node-resolver-q5zfz\" (UID: \"ad1b5fde-cf28-4a54-bb33-cb43f425421e\") " pod="openshift-dns/node-resolver-q5zfz"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389440 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn6px\" (UniqueName: \"kubernetes.io/projected/a358126d-f138-41e0-b8a2-598652e544f5-kube-api-access-cn6px\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l"
Apr 17 08:02:44.390178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389463 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-node-log\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.390837 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389485 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a9582e01-007d-4cf4-9287-1788560d38e1-ovn-node-metrics-cert\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.390837 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389537 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a9582e01-007d-4cf4-9287-1788560d38e1-ovnkube-script-lib\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.390837 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389565 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-host\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.390837 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.389587 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-system-cni-dir\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h"
Apr 17 08:02:44.423393 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.423351 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:57:43 +0000 UTC" deadline="2027-12-27 20:10:58.35512608 +0000 UTC"
Apr 17 08:02:44.423393 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.423379 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14868h8m13.931750186s"
Apr 17 08:02:44.490373 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490338 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt4kk\" (UniqueName: \"kubernetes.io/projected/5f3f7f47-4941-40cd-88d3-259605376e0e-kube-api-access-rt4kk\") pod \"node-ca-pk77j\" (UID: \"5f3f7f47-4941-40cd-88d3-259605376e0e\") " pod="openshift-image-registry/node-ca-pk77j"
Apr 17 08:02:44.490373 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490372 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71e69b48-555a-432e-b12a-16dae2f11d3c-konnectivity-ca\") pod \"konnectivity-agent-sptg6\" (UID: \"71e69b48-555a-432e-b12a-16dae2f11d3c\") " pod="kube-system/konnectivity-agent-sptg6"
Apr 17 08:02:44.490622 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490390 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l"
Apr 17 08:02:44.490622 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490445 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-var-lib-cni-multus\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.490622 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490461 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-cni-dir\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.490622 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490481 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-tuned\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.490622 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490506 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-run-netns\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.490622 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490532 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-run-multus-certs\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.490622 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490538 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-var-lib-cni-multus\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.490622 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490582 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-cni-dir\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.490622 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:44.490594 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 08:02:44.490622 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490593 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-run-multus-certs\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490625 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsd7h\" (UniqueName: \"kubernetes.io/projected/d961e954-15d3-43c1-800e-ea0f8d3c806d-kube-api-access-vsd7h\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490663 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f3f7f47-4941-40cd-88d3-259605376e0e-serviceca\") pod \"node-ca-pk77j\" (UID: \"5f3f7f47-4941-40cd-88d3-259605376e0e\") " pod="openshift-image-registry/node-ca-pk77j"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490668 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-run-netns\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490701 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/89794c28-aca4-44e3-899f-a486defd216d-iptables-alerter-script\") pod \"iptables-alerter-lr452\" (UID: \"89794c28-aca4-44e3-899f-a486defd216d\") " pod="openshift-network-operator/iptables-alerter-lr452"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490727 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-kubelet\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:44.490757 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs podName:a358126d-f138-41e0-b8a2-598652e544f5 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:44.990721056 +0000 UTC m=+3.077503740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs") pod "network-metrics-daemon-5zj7l" (UID: "a358126d-f138-41e0-b8a2-598652e544f5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490893 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a9582e01-007d-4cf4-9287-1788560d38e1-env-overrides\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490924 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-systemd\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490950 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt5l\" (UniqueName: \"kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l\") pod \"network-check-target-52zv7\" (UID: \"d7bf9c6a-1777-45f6-9f27-2ee3d09959d1\") " pod="openshift-network-diagnostics/network-check-target-52zv7"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490977 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-os-release\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.490976 2583 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491001 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-var-lib-kubelet\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491025 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f3f7f47-4941-40cd-88d3-259605376e0e-host\") pod \"node-ca-pk77j\" (UID: \"5f3f7f47-4941-40cd-88d3-259605376e0e\") " pod="openshift-image-registry/node-ca-pk77j"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491057 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71e69b48-555a-432e-b12a-16dae2f11d3c-konnectivity-ca\") pod \"konnectivity-agent-sptg6\" (UID: \"71e69b48-555a-432e-b12a-16dae2f11d3c\") " pod="kube-system/konnectivity-agent-sptg6"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491068 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-cni-netd\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491092 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-kubelet-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6"
Apr 17 08:02:44.491120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491117 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-socket-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491142 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-modprobe-d\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491152 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-os-release\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491167 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-conf-dir\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491166 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f3f7f47-4941-40cd-88d3-259605376e0e-serviceca\") pod \"node-ca-pk77j\" (UID: \"5f3f7f47-4941-40cd-88d3-259605376e0e\") " pod="openshift-image-registry/node-ca-pk77j"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491192 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-etc-kubernetes\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491201 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-systemd\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491218 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71e69b48-555a-432e-b12a-16dae2f11d3c-agent-certs\") pod \"konnectivity-agent-sptg6\" (UID: \"71e69b48-555a-432e-b12a-16dae2f11d3c\") " pod="kube-system/konnectivity-agent-sptg6"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491231 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-kubelet\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491236 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/89794c28-aca4-44e3-899f-a486defd216d-iptables-alerter-script\") pod \"iptables-alerter-lr452\" (UID: \"89794c28-aca4-44e3-899f-a486defd216d\") " pod="openshift-network-operator/iptables-alerter-lr452"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491279 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-var-lib-kubelet\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491294 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-sysconfig\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491304 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f3f7f47-4941-40cd-88d3-259605376e0e-host\") pod \"node-ca-pk77j\" (UID: \"5f3f7f47-4941-40cd-88d3-259605376e0e\") " pod="openshift-image-registry/node-ca-pk77j"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491322 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-kubernetes\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491339 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-etc-kubernetes\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491343 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-conf-dir\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491345 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-modprobe-d\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491348 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a9582e01-007d-4cf4-9287-1788560d38e1-ovnkube-config\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.491968 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491379 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a9582e01-007d-4cf4-9287-1788560d38e1-env-overrides\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491387 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-sys-fs\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6"
Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491424 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-sys-fs\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6"
Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491434 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-slash\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491452 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-cni-netd\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z"
Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491453 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-socket-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6"
Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491461 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-run-openvswitch\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491488 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-sysconfig\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491497 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-cnibin\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491497 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-run-openvswitch\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491525 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a741811-1651-45b8-97c3-04e9be19b874-cni-binary-copy\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491574 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491594 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-cnibin\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491602 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-run-k8s-cni-cncf-io\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491620 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-cni-bin\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491640 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.492811 
ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491668 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-sysctl-d\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.492811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491693 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-run\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491694 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-kubelet-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491716 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-sys\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491741 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-lib-modules\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.493649 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:02:44.491768 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3a741811-1651-45b8-97c3-04e9be19b874-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491775 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-kubernetes\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491741 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-slash\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491781 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-run-k8s-cni-cncf-io\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491740 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " 
pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491803 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-run-systemd\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491854 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-sys\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491884 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-lib-modules\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491894 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-run\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491905 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-sysctl-d\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.493649 
ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491930 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-run-systemd\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491937 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-cni-bin\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491939 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-run-ovn-kubernetes\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.493649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491965 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-run-ovn-kubernetes\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491980 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lhz5\" (UniqueName: \"kubernetes.io/projected/ad1b5fde-cf28-4a54-bb33-cb43f425421e-kube-api-access-6lhz5\") pod \"node-resolver-q5zfz\" (UID: \"ad1b5fde-cf28-4a54-bb33-cb43f425421e\") " pod="openshift-dns/node-resolver-q5zfz" Apr 
17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.491990 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492009 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-var-lib-cni-bin\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492055 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-run-ovn\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492061 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a741811-1651-45b8-97c3-04e9be19b874-cni-binary-copy\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492081 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-var-lib-kubelet\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " 
pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-os-release\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492110 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-run-ovn\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492082 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-host-var-lib-cni-bin\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492131 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-socket-dir-parent\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492132 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-var-lib-kubelet\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " 
pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492154 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-sysctl-conf\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492177 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-socket-dir-parent\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492177 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-os-release\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492181 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3a741811-1651-45b8-97c3-04e9be19b874-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492208 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-daemon-config\") pod 
\"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.494471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492229 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhmnl\" (UniqueName: \"kubernetes.io/projected/e0b1c45e-6f25-419a-b20e-afc106313813-kube-api-access-fhmnl\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492234 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3a741811-1651-45b8-97c3-04e9be19b874-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492252 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-run-netns\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492282 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-sysctl-conf\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492292 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-registration-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492329 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-host-run-netns\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492319 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-hostroot\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492378 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-log-socket\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492386 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-registration-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492402 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mfxpp\" (UniqueName: \"kubernetes.io/projected/a9582e01-007d-4cf4-9287-1788560d38e1-kube-api-access-mfxpp\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492437 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-log-socket\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492440 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-hostroot\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492465 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8a42681-cd32-4735-af7b-2c1ca0b2df67-tmp\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492491 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ad1b5fde-cf28-4a54-bb33-cb43f425421e-hosts-file\") pod \"node-resolver-q5zfz\" (UID: \"ad1b5fde-cf28-4a54-bb33-cb43f425421e\") " pod="openshift-dns/node-resolver-q5zfz" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492515 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-device-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492540 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-etc-selinux\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492551 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ad1b5fde-cf28-4a54-bb33-cb43f425421e-hosts-file\") pod \"node-resolver-q5zfz\" (UID: \"ad1b5fde-cf28-4a54-bb33-cb43f425421e\") " pod="openshift-dns/node-resolver-q5zfz" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492574 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-system-cni-dir\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.494917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492599 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-cnibin\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492602 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-device-dir\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492606 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e0b1c45e-6f25-419a-b20e-afc106313813-etc-selinux\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492634 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d961e954-15d3-43c1-800e-ea0f8d3c806d-cni-binary-copy\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492652 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-cnibin\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492655 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d961e954-15d3-43c1-800e-ea0f8d3c806d-system-cni-dir\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492662 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-etc-openvswitch\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492693 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89794c28-aca4-44e3-899f-a486defd216d-host-slash\") pod \"iptables-alerter-lr452\" (UID: \"89794c28-aca4-44e3-899f-a486defd216d\") " pod="openshift-network-operator/iptables-alerter-lr452" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492694 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3a741811-1651-45b8-97c3-04e9be19b874-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492703 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-etc-openvswitch\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492737 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89794c28-aca4-44e3-899f-a486defd216d-host-slash\") pod \"iptables-alerter-lr452\" (UID: \"89794c28-aca4-44e3-899f-a486defd216d\") " pod="openshift-network-operator/iptables-alerter-lr452" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492747 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xtp97\" (UniqueName: \"kubernetes.io/projected/89794c28-aca4-44e3-899f-a486defd216d-kube-api-access-xtp97\") pod \"iptables-alerter-lr452\" (UID: \"89794c28-aca4-44e3-899f-a486defd216d\") " pod="openshift-network-operator/iptables-alerter-lr452" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492782 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d961e954-15d3-43c1-800e-ea0f8d3c806d-multus-daemon-config\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492851 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-systemd-units\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492885 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-var-lib-openvswitch\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492904 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bwg6\" (UniqueName: \"kubernetes.io/projected/e8a42681-cd32-4735-af7b-2c1ca0b2df67-kube-api-access-4bwg6\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492937 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ad1b5fde-cf28-4a54-bb33-cb43f425421e-tmp-dir\") pod \"node-resolver-q5zfz\" (UID: \"ad1b5fde-cf28-4a54-bb33-cb43f425421e\") " pod="openshift-dns/node-resolver-q5zfz" Apr 17 08:02:44.495653 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492946 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-systemd-units\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492953 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-var-lib-openvswitch\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492963 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cn6px\" (UniqueName: \"kubernetes.io/projected/a358126d-f138-41e0-b8a2-598652e544f5-kube-api-access-cn6px\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.492987 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-node-log\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493011 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a9582e01-007d-4cf4-9287-1788560d38e1-ovn-node-metrics-cert\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493050 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a9582e01-007d-4cf4-9287-1788560d38e1-ovnkube-script-lib\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493066 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a9582e01-007d-4cf4-9287-1788560d38e1-ovnkube-config\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493075 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-host\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493080 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d961e954-15d3-43c1-800e-ea0f8d3c806d-cni-binary-copy\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493109 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-system-cni-dir\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493132 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a9582e01-007d-4cf4-9287-1788560d38e1-node-log\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493133 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2rcb\" (UniqueName: \"kubernetes.io/projected/3a741811-1651-45b8-97c3-04e9be19b874-kube-api-access-s2rcb\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493260 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ad1b5fde-cf28-4a54-bb33-cb43f425421e-tmp-dir\") pod \"node-resolver-q5zfz\" (UID: \"ad1b5fde-cf28-4a54-bb33-cb43f425421e\") " pod="openshift-dns/node-resolver-q5zfz" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493327 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8a42681-cd32-4735-af7b-2c1ca0b2df67-host\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493346 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a741811-1651-45b8-97c3-04e9be19b874-system-cni-dir\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.493539 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a9582e01-007d-4cf4-9287-1788560d38e1-ovnkube-script-lib\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.495795 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e8a42681-cd32-4735-af7b-2c1ca0b2df67-tmp\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.496341 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.495927 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e8a42681-cd32-4735-af7b-2c1ca0b2df67-etc-tuned\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.496925 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.496548 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71e69b48-555a-432e-b12a-16dae2f11d3c-agent-certs\") pod \"konnectivity-agent-sptg6\" (UID: \"71e69b48-555a-432e-b12a-16dae2f11d3c\") " pod="kube-system/konnectivity-agent-sptg6" Apr 17 08:02:44.496925 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.496604 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a9582e01-007d-4cf4-9287-1788560d38e1-ovn-node-metrics-cert\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.501568 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:44.501538 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:02:44.501694 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:44.501566 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:02:44.501694 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:44.501591 2583 projected.go:194] Error preparing data for projected volume kube-api-access-8bt5l for pod openshift-network-diagnostics/network-check-target-52zv7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:44.501813 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:44.501717 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l podName:d7bf9c6a-1777-45f6-9f27-2ee3d09959d1 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:45.0016933 +0000 UTC m=+3.088475980 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8bt5l" (UniqueName: "kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l") pod "network-check-target-52zv7" (UID: "d7bf9c6a-1777-45f6-9f27-2ee3d09959d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:44.501813 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.501778 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsd7h\" (UniqueName: \"kubernetes.io/projected/d961e954-15d3-43c1-800e-ea0f8d3c806d-kube-api-access-vsd7h\") pod \"multus-gn8ng\" (UID: \"d961e954-15d3-43c1-800e-ea0f8d3c806d\") " pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.502554 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.502504 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lhz5\" (UniqueName: \"kubernetes.io/projected/ad1b5fde-cf28-4a54-bb33-cb43f425421e-kube-api-access-6lhz5\") pod \"node-resolver-q5zfz\" (UID: \"ad1b5fde-cf28-4a54-bb33-cb43f425421e\") " pod="openshift-dns/node-resolver-q5zfz" Apr 17 08:02:44.502892 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.502869 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt4kk\" (UniqueName: \"kubernetes.io/projected/5f3f7f47-4941-40cd-88d3-259605376e0e-kube-api-access-rt4kk\") pod \"node-ca-pk77j\" (UID: \"5f3f7f47-4941-40cd-88d3-259605376e0e\") " pod="openshift-image-registry/node-ca-pk77j" Apr 17 08:02:44.503171 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.503150 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bwg6\" (UniqueName: \"kubernetes.io/projected/e8a42681-cd32-4735-af7b-2c1ca0b2df67-kube-api-access-4bwg6\") pod \"tuned-nzxn9\" (UID: \"e8a42681-cd32-4735-af7b-2c1ca0b2df67\") " pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" 
Apr 17 08:02:44.503739 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.503717 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn6px\" (UniqueName: \"kubernetes.io/projected/a358126d-f138-41e0-b8a2-598652e544f5-kube-api-access-cn6px\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:44.504196 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.504176 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhmnl\" (UniqueName: \"kubernetes.io/projected/e0b1c45e-6f25-419a-b20e-afc106313813-kube-api-access-fhmnl\") pod \"aws-ebs-csi-driver-node-56fr6\" (UID: \"e0b1c45e-6f25-419a-b20e-afc106313813\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.504351 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.504328 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2rcb\" (UniqueName: \"kubernetes.io/projected/3a741811-1651-45b8-97c3-04e9be19b874-kube-api-access-s2rcb\") pod \"multus-additional-cni-plugins-s742h\" (UID: \"3a741811-1651-45b8-97c3-04e9be19b874\") " pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.504417 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.504375 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfxpp\" (UniqueName: \"kubernetes.io/projected/a9582e01-007d-4cf4-9287-1788560d38e1-kube-api-access-mfxpp\") pod \"ovnkube-node-qrq7z\" (UID: \"a9582e01-007d-4cf4-9287-1788560d38e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.504838 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.504824 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtp97\" (UniqueName: \"kubernetes.io/projected/89794c28-aca4-44e3-899f-a486defd216d-kube-api-access-xtp97\") pod 
\"iptables-alerter-lr452\" (UID: \"89794c28-aca4-44e3-899f-a486defd216d\") " pod="openshift-network-operator/iptables-alerter-lr452" Apr 17 08:02:44.565205 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.565174 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:44.589865 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.589834 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:44.676532 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.676503 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gn8ng" Apr 17 08:02:44.684321 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.684297 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" Apr 17 08:02:44.692721 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.692701 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q5zfz" Apr 17 08:02:44.699430 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.699411 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" Apr 17 08:02:44.707976 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.707954 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pk77j" Apr 17 08:02:44.713575 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.713554 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-sptg6" Apr 17 08:02:44.720132 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.720111 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s742h" Apr 17 08:02:44.726739 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.726711 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lr452" Apr 17 08:02:44.732351 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.732328 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:02:44.997263 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:44.997200 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:44.997431 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:44.997293 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:44.997431 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:44.997342 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs podName:a358126d-f138-41e0-b8a2-598652e544f5 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:45.997328637 +0000 UTC m=+4.084111301 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs") pod "network-metrics-daemon-5zj7l" (UID: "a358126d-f138-41e0-b8a2-598652e544f5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:45.054707 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:45.054677 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f3f7f47_4941_40cd_88d3_259605376e0e.slice/crio-657cbbfd8f196ac72a4b86380bf442a863b58d5ebd4db02b20e7bdecddf74196 WatchSource:0}: Error finding container 657cbbfd8f196ac72a4b86380bf442a863b58d5ebd4db02b20e7bdecddf74196: Status 404 returned error can't find the container with id 657cbbfd8f196ac72a4b86380bf442a863b58d5ebd4db02b20e7bdecddf74196 Apr 17 08:02:45.056371 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:45.056288 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd961e954_15d3_43c1_800e_ea0f8d3c806d.slice/crio-1bcf41e530baab554ed9c12019f178ee81b12482e15c5c6378a58d2d80ce065f WatchSource:0}: Error finding container 1bcf41e530baab554ed9c12019f178ee81b12482e15c5c6378a58d2d80ce065f: Status 404 returned error can't find the container with id 1bcf41e530baab554ed9c12019f178ee81b12482e15c5c6378a58d2d80ce065f Apr 17 08:02:45.059480 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:45.059456 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71e69b48_555a_432e_b12a_16dae2f11d3c.slice/crio-e699d589a6c08d287e1b25747b7ee84cfb86b081908221be202e17b5ee30f8b4 WatchSource:0}: Error finding container e699d589a6c08d287e1b25747b7ee84cfb86b081908221be202e17b5ee30f8b4: Status 404 returned error can't find the container with id e699d589a6c08d287e1b25747b7ee84cfb86b081908221be202e17b5ee30f8b4 Apr 17 08:02:45.060469 
ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:45.060441 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0b1c45e_6f25_419a_b20e_afc106313813.slice/crio-6f18ee8849444723f6bc0a0d53715ee0fce31a3430679b10ff876bca880b8011 WatchSource:0}: Error finding container 6f18ee8849444723f6bc0a0d53715ee0fce31a3430679b10ff876bca880b8011: Status 404 returned error can't find the container with id 6f18ee8849444723f6bc0a0d53715ee0fce31a3430679b10ff876bca880b8011 Apr 17 08:02:45.061236 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:45.061216 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad1b5fde_cf28_4a54_bb33_cb43f425421e.slice/crio-1637e57c3d85e7e3b7beeb564d0c69eef370636c7eb157a358bb7938076f8703 WatchSource:0}: Error finding container 1637e57c3d85e7e3b7beeb564d0c69eef370636c7eb157a358bb7938076f8703: Status 404 returned error can't find the container with id 1637e57c3d85e7e3b7beeb564d0c69eef370636c7eb157a358bb7938076f8703 Apr 17 08:02:45.062449 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:45.062413 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a741811_1651_45b8_97c3_04e9be19b874.slice/crio-024996e2215b55ef02393e145b0fc57913fefe299ce3cfc7ef1a0c1481e2174b WatchSource:0}: Error finding container 024996e2215b55ef02393e145b0fc57913fefe299ce3cfc7ef1a0c1481e2174b: Status 404 returned error can't find the container with id 024996e2215b55ef02393e145b0fc57913fefe299ce3cfc7ef1a0c1481e2174b Apr 17 08:02:45.063175 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:45.063116 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8a42681_cd32_4735_af7b_2c1ca0b2df67.slice/crio-1bc29b98e8af37722fb841690c8d0831ecb189ba5113dc0117675c1f9510193a WatchSource:0}: Error 
finding container 1bc29b98e8af37722fb841690c8d0831ecb189ba5113dc0117675c1f9510193a: Status 404 returned error can't find the container with id 1bc29b98e8af37722fb841690c8d0831ecb189ba5113dc0117675c1f9510193a Apr 17 08:02:45.064530 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:45.064507 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89794c28_aca4_44e3_899f_a486defd216d.slice/crio-f2b2b5ceefd8bce0bff52f102410d3e879545488b626a9e309b54fcc431abe0c WatchSource:0}: Error finding container f2b2b5ceefd8bce0bff52f102410d3e879545488b626a9e309b54fcc431abe0c: Status 404 returned error can't find the container with id f2b2b5ceefd8bce0bff52f102410d3e879545488b626a9e309b54fcc431abe0c Apr 17 08:02:45.065503 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:02:45.065475 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9582e01_007d_4cf4_9287_1788560d38e1.slice/crio-153c4992ce15c4cb922c4999808ef785edbead9ba3d5941eb19cb742eb6229df WatchSource:0}: Error finding container 153c4992ce15c4cb922c4999808ef785edbead9ba3d5941eb19cb742eb6229df: Status 404 returned error can't find the container with id 153c4992ce15c4cb922c4999808ef785edbead9ba3d5941eb19cb742eb6229df Apr 17 08:02:45.098053 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.098015 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt5l\" (UniqueName: \"kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l\") pod \"network-check-target-52zv7\" (UID: \"d7bf9c6a-1777-45f6-9f27-2ee3d09959d1\") " pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:45.098188 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:45.098168 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 17 08:02:45.098230 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:45.098196 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:02:45.098230 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:45.098208 2583 projected.go:194] Error preparing data for projected volume kube-api-access-8bt5l for pod openshift-network-diagnostics/network-check-target-52zv7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:45.098288 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:45.098264 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l podName:d7bf9c6a-1777-45f6-9f27-2ee3d09959d1 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:46.098246354 +0000 UTC m=+4.185029031 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8bt5l" (UniqueName: "kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l") pod "network-check-target-52zv7" (UID: "d7bf9c6a-1777-45f6-9f27-2ee3d09959d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:45.424131 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.423902 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:57:43 +0000 UTC" deadline="2027-10-17 13:31:46.915254356 +0000 UTC" Apr 17 08:02:45.424131 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.423947 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13157h29m1.491311568s" Apr 17 08:02:45.522343 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.522284 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" event={"ID":"a9582e01-007d-4cf4-9287-1788560d38e1","Type":"ContainerStarted","Data":"153c4992ce15c4cb922c4999808ef785edbead9ba3d5941eb19cb742eb6229df"} Apr 17 08:02:45.524539 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.524483 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" event={"ID":"e8a42681-cd32-4735-af7b-2c1ca0b2df67","Type":"ContainerStarted","Data":"1bc29b98e8af37722fb841690c8d0831ecb189ba5113dc0117675c1f9510193a"} Apr 17 08:02:45.531520 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.531481 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s742h" event={"ID":"3a741811-1651-45b8-97c3-04e9be19b874","Type":"ContainerStarted","Data":"024996e2215b55ef02393e145b0fc57913fefe299ce3cfc7ef1a0c1481e2174b"} Apr 17 08:02:45.533172 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:02:45.533123 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q5zfz" event={"ID":"ad1b5fde-cf28-4a54-bb33-cb43f425421e","Type":"ContainerStarted","Data":"1637e57c3d85e7e3b7beeb564d0c69eef370636c7eb157a358bb7938076f8703"} Apr 17 08:02:45.535206 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.535180 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pk77j" event={"ID":"5f3f7f47-4941-40cd-88d3-259605376e0e","Type":"ContainerStarted","Data":"657cbbfd8f196ac72a4b86380bf442a863b58d5ebd4db02b20e7bdecddf74196"} Apr 17 08:02:45.544988 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.544963 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-54.ec2.internal" event={"ID":"db15b079396bd5a2463864d9d840da9c","Type":"ContainerStarted","Data":"812878fc4decbb3f6be912149aa897bd1a01c845f350ed880bfb0f72bce6f60e"} Apr 17 08:02:45.553931 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.553901 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lr452" event={"ID":"89794c28-aca4-44e3-899f-a486defd216d","Type":"ContainerStarted","Data":"f2b2b5ceefd8bce0bff52f102410d3e879545488b626a9e309b54fcc431abe0c"} Apr 17 08:02:45.557947 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.557894 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-54.ec2.internal" podStartSLOduration=2.557879061 podStartE2EDuration="2.557879061s" podCreationTimestamp="2026-04-17 08:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:02:45.557707916 +0000 UTC m=+3.644490602" watchObservedRunningTime="2026-04-17 08:02:45.557879061 +0000 UTC m=+3.644661740" Apr 17 08:02:45.559247 ip-10-0-138-54 kubenswrapper[2583]: I0417 
08:02:45.559219 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" event={"ID":"e0b1c45e-6f25-419a-b20e-afc106313813","Type":"ContainerStarted","Data":"6f18ee8849444723f6bc0a0d53715ee0fce31a3430679b10ff876bca880b8011"} Apr 17 08:02:45.565470 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.565444 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-sptg6" event={"ID":"71e69b48-555a-432e-b12a-16dae2f11d3c","Type":"ContainerStarted","Data":"e699d589a6c08d287e1b25747b7ee84cfb86b081908221be202e17b5ee30f8b4"} Apr 17 08:02:45.568482 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:45.568442 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gn8ng" event={"ID":"d961e954-15d3-43c1-800e-ea0f8d3c806d","Type":"ContainerStarted","Data":"1bcf41e530baab554ed9c12019f178ee81b12482e15c5c6378a58d2d80ce065f"} Apr 17 08:02:46.004978 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:46.004935 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:46.005173 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:46.005134 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:46.005237 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:46.005202 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs podName:a358126d-f138-41e0-b8a2-598652e544f5 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:48.005180497 +0000 UTC m=+6.091963163 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs") pod "network-metrics-daemon-5zj7l" (UID: "a358126d-f138-41e0-b8a2-598652e544f5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:46.106064 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:46.105671 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt5l\" (UniqueName: \"kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l\") pod \"network-check-target-52zv7\" (UID: \"d7bf9c6a-1777-45f6-9f27-2ee3d09959d1\") " pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:46.106064 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:46.106022 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:02:46.106064 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:46.106060 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:02:46.106292 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:46.106074 2583 projected.go:194] Error preparing data for projected volume kube-api-access-8bt5l for pod openshift-network-diagnostics/network-check-target-52zv7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:46.106292 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:46.106127 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l podName:d7bf9c6a-1777-45f6-9f27-2ee3d09959d1 nodeName:}" failed. 
No retries permitted until 2026-04-17 08:02:48.106108769 +0000 UTC m=+6.192891447 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8bt5l" (UniqueName: "kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l") pod "network-check-target-52zv7" (UID: "d7bf9c6a-1777-45f6-9f27-2ee3d09959d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:46.511989 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:46.511441 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:46.511989 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:46.511565 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:02:46.512821 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:46.512601 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:46.512821 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:46.512716 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:02:46.591153 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:46.591115 2583 generic.go:358] "Generic (PLEG): container finished" podID="51f9f233d352b5a7748f43e51cb52994" containerID="6b28fcc16537ead201641d86c9e4914c9280d000eb09eec2162b3e342aa98b36" exitCode=0 Apr 17 08:02:46.592212 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:46.592146 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" event={"ID":"51f9f233d352b5a7748f43e51cb52994","Type":"ContainerDied","Data":"6b28fcc16537ead201641d86c9e4914c9280d000eb09eec2162b3e342aa98b36"} Apr 17 08:02:47.604601 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:47.604559 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" event={"ID":"51f9f233d352b5a7748f43e51cb52994","Type":"ContainerStarted","Data":"8bea9fb5dccb2186625e1bd547022f3af64dec6b1981da814bfeb322953b3269"} Apr 17 08:02:48.024133 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:48.024098 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:48.024324 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:48.024247 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:48.024324 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:48.024321 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs 
podName:a358126d-f138-41e0-b8a2-598652e544f5 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:52.024300229 +0000 UTC m=+10.111082898 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs") pod "network-metrics-daemon-5zj7l" (UID: "a358126d-f138-41e0-b8a2-598652e544f5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:48.125486 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:48.125443 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt5l\" (UniqueName: \"kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l\") pod \"network-check-target-52zv7\" (UID: \"d7bf9c6a-1777-45f6-9f27-2ee3d09959d1\") " pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:48.125638 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:48.125609 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:02:48.125638 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:48.125628 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:02:48.125733 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:48.125640 2583 projected.go:194] Error preparing data for projected volume kube-api-access-8bt5l for pod openshift-network-diagnostics/network-check-target-52zv7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:48.125733 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:48.125698 2583 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l podName:d7bf9c6a-1777-45f6-9f27-2ee3d09959d1 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:52.125678323 +0000 UTC m=+10.212461001 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8bt5l" (UniqueName: "kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l") pod "network-check-target-52zv7" (UID: "d7bf9c6a-1777-45f6-9f27-2ee3d09959d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:48.510666 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:48.510417 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:48.510666 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:48.510428 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:48.510666 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:48.510578 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:02:48.515073 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:48.511274 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:02:50.510679 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:50.510638 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:50.511157 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:50.510847 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:02:50.511309 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:50.511291 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:50.511433 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:50.511390 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:02:52.060565 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:52.060527 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:52.061116 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:52.060734 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:52.061116 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:52.060799 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs podName:a358126d-f138-41e0-b8a2-598652e544f5 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:00.060779577 +0000 UTC m=+18.147562255 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs") pod "network-metrics-daemon-5zj7l" (UID: "a358126d-f138-41e0-b8a2-598652e544f5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:52.161849 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:52.161678 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt5l\" (UniqueName: \"kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l\") pod \"network-check-target-52zv7\" (UID: \"d7bf9c6a-1777-45f6-9f27-2ee3d09959d1\") " pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:52.161849 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:52.161838 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:02:52.161849 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:52.161858 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:02:52.162164 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:52.161871 2583 projected.go:194] Error preparing data for projected volume kube-api-access-8bt5l for pod openshift-network-diagnostics/network-check-target-52zv7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:52.162164 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:52.161934 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l podName:d7bf9c6a-1777-45f6-9f27-2ee3d09959d1 nodeName:}" failed. 
No retries permitted until 2026-04-17 08:03:00.161915597 +0000 UTC m=+18.248698260 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8bt5l" (UniqueName: "kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l") pod "network-check-target-52zv7" (UID: "d7bf9c6a-1777-45f6-9f27-2ee3d09959d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:52.510829 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:52.510779 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:52.511029 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:52.510916 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:02:52.511306 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:52.510782 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:52.511306 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:52.511247 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:02:54.510493 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:54.510456 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:54.511060 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:54.510467 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:54.511060 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:54.510639 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:02:54.511060 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:54.510730 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:02:56.510558 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:56.510524 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:56.511022 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:56.510643 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:02:56.511022 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:56.510525 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:56.511133 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:56.511063 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:02:58.510428 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:58.510390 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:02:58.510844 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:02:58.510437 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:02:58.510844 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:58.510540 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:02:58.510844 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:02:58.510671 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:03:00.118722 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:00.118682 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:00.119211 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:00.118832 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:03:00.119211 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:00.118922 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs podName:a358126d-f138-41e0-b8a2-598652e544f5 nodeName:}" failed. 
No retries permitted until 2026-04-17 08:03:16.118900249 +0000 UTC m=+34.205682916 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs") pod "network-metrics-daemon-5zj7l" (UID: "a358126d-f138-41e0-b8a2-598652e544f5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:03:00.219097 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:00.219057 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt5l\" (UniqueName: \"kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l\") pod \"network-check-target-52zv7\" (UID: \"d7bf9c6a-1777-45f6-9f27-2ee3d09959d1\") " pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:00.219257 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:00.219204 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:03:00.219257 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:00.219222 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:03:00.219257 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:00.219233 2583 projected.go:194] Error preparing data for projected volume kube-api-access-8bt5l for pod openshift-network-diagnostics/network-check-target-52zv7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:03:00.219406 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:00.219297 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l 
podName:d7bf9c6a-1777-45f6-9f27-2ee3d09959d1 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:16.219278287 +0000 UTC m=+34.306060951 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8bt5l" (UniqueName: "kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l") pod "network-check-target-52zv7" (UID: "d7bf9c6a-1777-45f6-9f27-2ee3d09959d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:03:00.509804 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:00.509766 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:00.510089 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:00.509814 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:00.510089 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:00.509913 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:03:00.510089 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:00.510055 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:03:02.511201 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.510934 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:02.511201 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.511192 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:02.511823 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:02.511297 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:03:02.511823 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:02.511409 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:03:02.629599 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.629530 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/ovn-acl-logging/0.log" Apr 17 08:03:02.629854 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.629834 2583 generic.go:358] "Generic (PLEG): container finished" podID="a9582e01-007d-4cf4-9287-1788560d38e1" containerID="706543860d3ad25d9f1f418ae66afc1752353bcd23527306f2f94f20d3626020" exitCode=1 Apr 17 08:03:02.629937 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.629906 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" event={"ID":"a9582e01-007d-4cf4-9287-1788560d38e1","Type":"ContainerStarted","Data":"7440376854e791696ec2dceccb3958f37cc24f82fb3e685400c2738deebb03ab"} Apr 17 08:03:02.630021 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.629935 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" event={"ID":"a9582e01-007d-4cf4-9287-1788560d38e1","Type":"ContainerDied","Data":"706543860d3ad25d9f1f418ae66afc1752353bcd23527306f2f94f20d3626020"} Apr 17 08:03:02.630021 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.629949 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" event={"ID":"a9582e01-007d-4cf4-9287-1788560d38e1","Type":"ContainerStarted","Data":"faa75e93dc37e91bd1a601ad87c90b0c5b0530e3f01ffa3b9b335f7ff11eeb4b"} Apr 17 08:03:02.631136 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.631114 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" event={"ID":"e8a42681-cd32-4735-af7b-2c1ca0b2df67","Type":"ContainerStarted","Data":"2b2662b9b2df0c76bae84a8ced4a22abb4cf73373a2e4d3b39df2512ba651493"} Apr 17 08:03:02.632242 
ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.632222 2583 generic.go:358] "Generic (PLEG): container finished" podID="3a741811-1651-45b8-97c3-04e9be19b874" containerID="867cc66f9a8315f322364f9b1c2a36a1d0871ae516e52724df90b14c4f5af05f" exitCode=0 Apr 17 08:03:02.632322 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.632280 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s742h" event={"ID":"3a741811-1651-45b8-97c3-04e9be19b874","Type":"ContainerDied","Data":"867cc66f9a8315f322364f9b1c2a36a1d0871ae516e52724df90b14c4f5af05f"} Apr 17 08:03:02.633572 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.633548 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q5zfz" event={"ID":"ad1b5fde-cf28-4a54-bb33-cb43f425421e","Type":"ContainerStarted","Data":"ae588372b03b4fe5b94cb4edffba0c9f5227e6605b23cff2f175989181cb9efe"} Apr 17 08:03:02.635246 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.635216 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pk77j" event={"ID":"5f3f7f47-4941-40cd-88d3-259605376e0e","Type":"ContainerStarted","Data":"2357ca5d2ad29a3dabc60d1780662ef9ff7838ac32d0d936df24fb93b2fc4dfc"} Apr 17 08:03:02.640272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.640245 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" event={"ID":"e0b1c45e-6f25-419a-b20e-afc106313813","Type":"ContainerStarted","Data":"4274df930ce7906fa155efb5db50c6ed1d74f817cae892e29fada58de3d8be3c"} Apr 17 08:03:02.641992 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.641965 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-sptg6" event={"ID":"71e69b48-555a-432e-b12a-16dae2f11d3c","Type":"ContainerStarted","Data":"f10614e1986fbe435fe55bf0c7604b25f826671d7473fc6a891850247d46effe"} Apr 17 08:03:02.643192 ip-10-0-138-54 kubenswrapper[2583]: 
I0417 08:03:02.643170 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gn8ng" event={"ID":"d961e954-15d3-43c1-800e-ea0f8d3c806d","Type":"ContainerStarted","Data":"b9f35f98a568ec9b374b34bd4ee3217fdefcb24cbebbe1962fbdb1b574b00777"} Apr 17 08:03:02.653770 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.653733 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-54.ec2.internal" podStartSLOduration=19.653722366 podStartE2EDuration="19.653722366s" podCreationTimestamp="2026-04-17 08:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:02:47.617186778 +0000 UTC m=+5.703969453" watchObservedRunningTime="2026-04-17 08:03:02.653722366 +0000 UTC m=+20.740505051" Apr 17 08:03:02.666879 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.666657 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gn8ng" podStartSLOduration=3.800491886 podStartE2EDuration="20.666643994s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:02:45.058766724 +0000 UTC m=+3.145549390" lastFinishedPulling="2026-04-17 08:03:01.92491882 +0000 UTC m=+20.011701498" observedRunningTime="2026-04-17 08:03:02.666519869 +0000 UTC m=+20.753302553" watchObservedRunningTime="2026-04-17 08:03:02.666643994 +0000 UTC m=+20.753426680" Apr 17 08:03:02.679475 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.679432 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nzxn9" podStartSLOduration=3.85225023 podStartE2EDuration="20.67941925s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:02:45.06516365 +0000 UTC m=+3.151946331" lastFinishedPulling="2026-04-17 08:03:01.892332683 +0000 UTC 
m=+19.979115351" observedRunningTime="2026-04-17 08:03:02.679244174 +0000 UTC m=+20.766026859" watchObservedRunningTime="2026-04-17 08:03:02.67941925 +0000 UTC m=+20.766201935" Apr 17 08:03:02.690246 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.690199 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pk77j" podStartSLOduration=3.854502525 podStartE2EDuration="20.690182008s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:02:45.056644718 +0000 UTC m=+3.143427387" lastFinishedPulling="2026-04-17 08:03:01.892324201 +0000 UTC m=+19.979106870" observedRunningTime="2026-04-17 08:03:02.690014832 +0000 UTC m=+20.776797516" watchObservedRunningTime="2026-04-17 08:03:02.690182008 +0000 UTC m=+20.776964694" Apr 17 08:03:02.702493 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.702445 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-sptg6" podStartSLOduration=3.871667472 podStartE2EDuration="20.702428894s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:02:45.061554705 +0000 UTC m=+3.148337373" lastFinishedPulling="2026-04-17 08:03:01.892316117 +0000 UTC m=+19.979098795" observedRunningTime="2026-04-17 08:03:02.702401993 +0000 UTC m=+20.789184679" watchObservedRunningTime="2026-04-17 08:03:02.702428894 +0000 UTC m=+20.789211578" Apr 17 08:03:02.721253 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:02.721210 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-q5zfz" podStartSLOduration=3.891947047 podStartE2EDuration="20.721191464s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:02:45.06307868 +0000 UTC m=+3.149861354" lastFinishedPulling="2026-04-17 08:03:01.892323105 +0000 UTC m=+19.979105771" observedRunningTime="2026-04-17 08:03:02.720974037 +0000 UTC m=+20.807756722" 
watchObservedRunningTime="2026-04-17 08:03:02.721191464 +0000 UTC m=+20.807974152" Apr 17 08:03:03.503810 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:03.503768 2583 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 08:03:03.648652 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:03.648620 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/ovn-acl-logging/0.log" Apr 17 08:03:03.649361 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:03.649055 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" event={"ID":"a9582e01-007d-4cf4-9287-1788560d38e1","Type":"ContainerStarted","Data":"81fd9e6b238b05776a887101829b67d948d50eb040cc85b28ac5b8b1e9b1a1fb"} Apr 17 08:03:03.649361 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:03.649093 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" event={"ID":"a9582e01-007d-4cf4-9287-1788560d38e1","Type":"ContainerStarted","Data":"1ddb9c7d3ea932c4e8281fbb8d9622a523000b209deb4a5277f3342d824fb79f"} Apr 17 08:03:03.649361 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:03.649104 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" event={"ID":"a9582e01-007d-4cf4-9287-1788560d38e1","Type":"ContainerStarted","Data":"a395b80670e99d7a7d545a07d0b12a020bae41bc567f5f012916a1e02a78bfb9"} Apr 17 08:03:03.650437 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:03.650406 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lr452" event={"ID":"89794c28-aca4-44e3-899f-a486defd216d","Type":"ContainerStarted","Data":"7f0fba4e6f4564f750f76a38f5c0a83c018234dd3622e6249eec779017212fdc"} Apr 17 08:03:03.652235 ip-10-0-138-54 kubenswrapper[2583]: I0417 
08:03:03.652198 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" event={"ID":"e0b1c45e-6f25-419a-b20e-afc106313813","Type":"ContainerStarted","Data":"86e53d6c712b67b3a399a77641fd5d0cf7d63eae4eb3b75b4f44fb765039ba31"} Apr 17 08:03:03.663820 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:03.663772 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lr452" podStartSLOduration=4.800553492 podStartE2EDuration="21.663754836s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:02:45.066797836 +0000 UTC m=+3.153580516" lastFinishedPulling="2026-04-17 08:03:01.929999186 +0000 UTC m=+20.016781860" observedRunningTime="2026-04-17 08:03:03.663150083 +0000 UTC m=+21.749932770" watchObservedRunningTime="2026-04-17 08:03:03.663754836 +0000 UTC m=+21.750537524" Apr 17 08:03:04.445091 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:04.444970 2583 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T08:03:03.503788177Z","UUID":"3886c547-2edd-4aab-af89-9b3074ef9190","Handler":null,"Name":"","Endpoint":""} Apr 17 08:03:04.446697 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:04.446663 2583 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 08:03:04.446697 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:04.446699 2583 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 08:03:04.510596 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:04.510561 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:04.510757 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:04.510708 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:03:04.510832 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:04.510776 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:04.510884 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:04.510868 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:03:05.658327 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:05.658300 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/ovn-acl-logging/0.log" Apr 17 08:03:05.658929 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:05.658735 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" event={"ID":"a9582e01-007d-4cf4-9287-1788560d38e1","Type":"ContainerStarted","Data":"150e4cebc07c90b8b9f7249de8aa50ee5333e5d9458bd24f73763f58aee5b544"} Apr 17 08:03:05.660818 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:05.660791 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" event={"ID":"e0b1c45e-6f25-419a-b20e-afc106313813","Type":"ContainerStarted","Data":"2bb9fb36561e5afc7ad5b619160b6b0967073eec1a1401fd7bb8559b45dba5d5"} Apr 17 08:03:05.675694 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:05.675639 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-56fr6" podStartSLOduration=4.175311169 podStartE2EDuration="23.67562136s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:02:45.062435698 +0000 UTC m=+3.149218369" lastFinishedPulling="2026-04-17 08:03:04.562745893 +0000 UTC m=+22.649528560" observedRunningTime="2026-04-17 08:03:05.675301379 +0000 UTC m=+23.762084064" watchObservedRunningTime="2026-04-17 08:03:05.67562136 +0000 UTC m=+23.762404047" Apr 17 08:03:06.510466 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:06.510429 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:06.510628 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:06.510429 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:06.510628 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:06.510556 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:03:06.510721 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:06.510656 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:03:06.997848 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:06.997812 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-sptg6" Apr 17 08:03:06.998644 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:06.998629 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-sptg6" Apr 17 08:03:07.667617 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:07.667455 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/ovn-acl-logging/0.log" Apr 17 08:03:07.667969 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:07.667948 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" event={"ID":"a9582e01-007d-4cf4-9287-1788560d38e1","Type":"ContainerStarted","Data":"eb9b821839498dd8499b47c50cb6a774b2a80cf177add412e67a578b9a2cb114"} Apr 17 08:03:07.668304 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:07.668276 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:03:07.668449 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:07.668315 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:03:07.668576 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:07.668460 2583 scope.go:117] "RemoveContainer" containerID="706543860d3ad25d9f1f418ae66afc1752353bcd23527306f2f94f20d3626020" Apr 17 08:03:07.669803 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:07.669774 2583 generic.go:358] "Generic (PLEG): container finished" podID="3a741811-1651-45b8-97c3-04e9be19b874" containerID="52ceb57ca5795eea3f660c5cb26c25f6b6378f019c6d236f9cd82d43b3d3c2d9" exitCode=0 Apr 17 08:03:07.669924 
ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:07.669859 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s742h" event={"ID":"3a741811-1651-45b8-97c3-04e9be19b874","Type":"ContainerDied","Data":"52ceb57ca5795eea3f660c5cb26c25f6b6378f019c6d236f9cd82d43b3d3c2d9"} Apr 17 08:03:07.683779 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:07.683756 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:03:08.509718 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:08.509685 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:08.510225 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:08.509685 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:08.510225 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:08.509819 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:03:08.510225 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:08.509857 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:03:08.674163 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:08.674140 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/ovn-acl-logging/0.log" Apr 17 08:03:08.674458 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:08.674430 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" event={"ID":"a9582e01-007d-4cf4-9287-1788560d38e1","Type":"ContainerStarted","Data":"13b664197b929a3b90fa6db316e9a6ad54c133378b8d94b4f8fc90d892f41100"} Apr 17 08:03:08.674674 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:08.674656 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:03:08.688873 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:08.688841 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:03:08.698362 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:08.698321 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" podStartSLOduration=9.542230803 podStartE2EDuration="26.698306322s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:02:45.067619857 +0000 UTC m=+3.154402522" lastFinishedPulling="2026-04-17 08:03:02.223695362 +0000 UTC m=+20.310478041" observedRunningTime="2026-04-17 08:03:08.697030742 +0000 UTC m=+26.783813426" watchObservedRunningTime="2026-04-17 08:03:08.698306322 +0000 UTC m=+26.785089006" Apr 17 08:03:08.721748 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:08.721721 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-sptg6" Apr 17 08:03:08.721879 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:03:08.721864 2583 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 08:03:08.722285 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:08.722259 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-sptg6" Apr 17 08:03:09.543982 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:09.543823 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5zj7l"] Apr 17 08:03:09.544518 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:09.544070 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:09.544518 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:09.544161 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:03:09.554361 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:09.554331 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-52zv7"] Apr 17 08:03:09.554473 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:09.554454 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:09.554580 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:09.554559 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:03:09.681459 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:09.681428 2583 generic.go:358] "Generic (PLEG): container finished" podID="3a741811-1651-45b8-97c3-04e9be19b874" containerID="c8376e46372a4b32f9f36a1f409215638b9b0d0986d72b16c6fc23cd2415f464" exitCode=0 Apr 17 08:03:09.681634 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:09.681501 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s742h" event={"ID":"3a741811-1651-45b8-97c3-04e9be19b874","Type":"ContainerDied","Data":"c8376e46372a4b32f9f36a1f409215638b9b0d0986d72b16c6fc23cd2415f464"} Apr 17 08:03:11.509719 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:11.509638 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:11.510334 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:11.509641 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:11.510334 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:11.509767 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:03:11.510334 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:11.509820 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:03:11.687749 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:11.687720 2583 generic.go:358] "Generic (PLEG): container finished" podID="3a741811-1651-45b8-97c3-04e9be19b874" containerID="315212fe5bdff6406978d030794feff2e233c9e9bde0bd1255b1e6c33cf24d3e" exitCode=0 Apr 17 08:03:11.687924 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:11.687775 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s742h" event={"ID":"3a741811-1651-45b8-97c3-04e9be19b874","Type":"ContainerDied","Data":"315212fe5bdff6406978d030794feff2e233c9e9bde0bd1255b1e6c33cf24d3e"} Apr 17 08:03:13.510272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:13.510238 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:13.510775 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:13.510243 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:13.510775 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:13.510375 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52zv7" podUID="d7bf9c6a-1777-45f6-9f27-2ee3d09959d1" Apr 17 08:03:13.510775 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:13.510549 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5zj7l" podUID="a358126d-f138-41e0-b8a2-598652e544f5" Apr 17 08:03:15.290929 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.290733 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-54.ec2.internal" event="NodeReady" Apr 17 08:03:15.291465 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.291074 2583 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 08:03:15.332630 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.332596 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-65v7n"] Apr 17 08:03:15.348300 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.348257 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m7zgj"] Apr 17 08:03:15.348468 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.348425 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.350855 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.350828 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wdhxs\"" Apr 17 08:03:15.350989 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.350935 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 08:03:15.351309 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.351196 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 08:03:15.364211 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.364191 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-65v7n"] Apr 17 08:03:15.364211 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.364214 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m7zgj"] Apr 17 08:03:15.364397 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.364323 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:15.366793 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.366772 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 08:03:15.366906 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.366830 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x9bwm\"" Apr 17 08:03:15.366971 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.366830 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 08:03:15.366971 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.366961 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 08:03:15.431698 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.431661 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.431698 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.431708 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-tmp-dir\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.431966 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.431792 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-config-volume\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.431966 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.431815 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv6tw\" (UniqueName: \"kubernetes.io/projected/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-kube-api-access-lv6tw\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.510247 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.510213 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:15.510454 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.510409 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:15.512548 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.512520 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 08:03:15.512548 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.512542 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x2n8l\"" Apr 17 08:03:15.512742 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.512722 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 08:03:15.512871 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.512787 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nfrwk\"" Apr 17 08:03:15.512976 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.512956 2583 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 08:03:15.532199 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.532170 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-config-volume\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.532358 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.532211 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lv6tw\" (UniqueName: \"kubernetes.io/projected/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-kube-api-access-lv6tw\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.532358 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.532244 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.532358 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.532271 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-tmp-dir\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.532358 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.532297 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert\") pod \"ingress-canary-m7zgj\" (UID: 
\"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:15.532358 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.532328 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ntg6\" (UniqueName: \"kubernetes.io/projected/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-kube-api-access-6ntg6\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:15.532594 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:15.532415 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:15.532594 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:15.532495 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls podName:d198e1b7-2f54-4936-a0be-c6a8e9e20ea7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:16.032472197 +0000 UTC m=+34.119254879 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls") pod "dns-default-65v7n" (UID: "d198e1b7-2f54-4936-a0be-c6a8e9e20ea7") : secret "dns-default-metrics-tls" not found Apr 17 08:03:15.532692 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.532662 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-tmp-dir\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.532825 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.532802 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-config-volume\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.542125 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.542024 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv6tw\" (UniqueName: \"kubernetes.io/projected/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-kube-api-access-lv6tw\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:15.633212 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.633180 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:15.633212 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.633225 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ntg6\" (UniqueName: 
\"kubernetes.io/projected/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-kube-api-access-6ntg6\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:15.633482 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:15.633417 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:15.633556 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:15.633499 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert podName:b4ebbaa6-1054-4eaf-87ba-d84dc8af620f nodeName:}" failed. No retries permitted until 2026-04-17 08:03:16.133477338 +0000 UTC m=+34.220260026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert") pod "ingress-canary-m7zgj" (UID: "b4ebbaa6-1054-4eaf-87ba-d84dc8af620f") : secret "canary-serving-cert" not found Apr 17 08:03:15.641799 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:15.641764 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ntg6\" (UniqueName: \"kubernetes.io/projected/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-kube-api-access-6ntg6\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:16.036302 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:16.036263 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:16.036574 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:16.036419 2583 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:16.036574 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:16.036500 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls podName:d198e1b7-2f54-4936-a0be-c6a8e9e20ea7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:17.036479653 +0000 UTC m=+35.123262316 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls") pod "dns-default-65v7n" (UID: "d198e1b7-2f54-4936-a0be-c6a8e9e20ea7") : secret "dns-default-metrics-tls" not found Apr 17 08:03:16.137209 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:16.137168 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:16.137458 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:16.137343 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 08:03:16.137458 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:16.137359 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:16.137458 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:16.137422 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs podName:a358126d-f138-41e0-b8a2-598652e544f5 nodeName:}" 
failed. No retries permitted until 2026-04-17 08:03:48.137401535 +0000 UTC m=+66.224184205 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs") pod "network-metrics-daemon-5zj7l" (UID: "a358126d-f138-41e0-b8a2-598652e544f5") : secret "metrics-daemon-secret" not found Apr 17 08:03:16.137458 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:16.137436 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:16.137669 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:16.137484 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert podName:b4ebbaa6-1054-4eaf-87ba-d84dc8af620f nodeName:}" failed. No retries permitted until 2026-04-17 08:03:17.137468089 +0000 UTC m=+35.224250754 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert") pod "ingress-canary-m7zgj" (UID: "b4ebbaa6-1054-4eaf-87ba-d84dc8af620f") : secret "canary-serving-cert" not found Apr 17 08:03:16.238296 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:16.238250 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt5l\" (UniqueName: \"kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l\") pod \"network-check-target-52zv7\" (UID: \"d7bf9c6a-1777-45f6-9f27-2ee3d09959d1\") " pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:16.249027 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:16.249002 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bt5l\" (UniqueName: \"kubernetes.io/projected/d7bf9c6a-1777-45f6-9f27-2ee3d09959d1-kube-api-access-8bt5l\") pod \"network-check-target-52zv7\" (UID: 
\"d7bf9c6a-1777-45f6-9f27-2ee3d09959d1\") " pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:16.421542 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:16.421501 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:17.044500 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:17.044459 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:17.044676 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:17.044638 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:17.044727 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:17.044720 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls podName:d198e1b7-2f54-4936-a0be-c6a8e9e20ea7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:19.044696294 +0000 UTC m=+37.131478970 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls") pod "dns-default-65v7n" (UID: "d198e1b7-2f54-4936-a0be-c6a8e9e20ea7") : secret "dns-default-metrics-tls" not found Apr 17 08:03:17.145277 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:17.145236 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:17.145461 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:17.145411 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:17.145526 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:17.145491 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert podName:b4ebbaa6-1054-4eaf-87ba-d84dc8af620f nodeName:}" failed. No retries permitted until 2026-04-17 08:03:19.145469782 +0000 UTC m=+37.232252470 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert") pod "ingress-canary-m7zgj" (UID: "b4ebbaa6-1054-4eaf-87ba-d84dc8af620f") : secret "canary-serving-cert" not found Apr 17 08:03:17.494389 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:17.494355 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-52zv7"] Apr 17 08:03:17.498735 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:03:17.498707 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7bf9c6a_1777_45f6_9f27_2ee3d09959d1.slice/crio-ccdb64aec9023245e1e2e779af2b4348545701fb38803e161681842ff25cb87f WatchSource:0}: Error finding container ccdb64aec9023245e1e2e779af2b4348545701fb38803e161681842ff25cb87f: Status 404 returned error can't find the container with id ccdb64aec9023245e1e2e779af2b4348545701fb38803e161681842ff25cb87f Apr 17 08:03:17.700555 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:17.700525 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-52zv7" event={"ID":"d7bf9c6a-1777-45f6-9f27-2ee3d09959d1","Type":"ContainerStarted","Data":"ccdb64aec9023245e1e2e779af2b4348545701fb38803e161681842ff25cb87f"} Apr 17 08:03:18.705018 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:18.704844 2583 generic.go:358] "Generic (PLEG): container finished" podID="3a741811-1651-45b8-97c3-04e9be19b874" containerID="1ca83159dbde86cf48115bcde32a7f35a3b3ef67074a34c079fa6a96149407f9" exitCode=0 Apr 17 08:03:18.705018 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:18.704921 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s742h" event={"ID":"3a741811-1651-45b8-97c3-04e9be19b874","Type":"ContainerDied","Data":"1ca83159dbde86cf48115bcde32a7f35a3b3ef67074a34c079fa6a96149407f9"} Apr 17 08:03:19.058848 
ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:19.058755 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:19.059002 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:19.058915 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:19.059002 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:19.058991 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls podName:d198e1b7-2f54-4936-a0be-c6a8e9e20ea7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:23.058972253 +0000 UTC m=+41.145754916 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls") pod "dns-default-65v7n" (UID: "d198e1b7-2f54-4936-a0be-c6a8e9e20ea7") : secret "dns-default-metrics-tls" not found Apr 17 08:03:19.160015 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:19.159981 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:19.160191 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:19.160151 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:19.160258 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:19.160229 2583 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert podName:b4ebbaa6-1054-4eaf-87ba-d84dc8af620f nodeName:}" failed. No retries permitted until 2026-04-17 08:03:23.160205599 +0000 UTC m=+41.246988264 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert") pod "ingress-canary-m7zgj" (UID: "b4ebbaa6-1054-4eaf-87ba-d84dc8af620f") : secret "canary-serving-cert" not found Apr 17 08:03:19.709121 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:19.709088 2583 generic.go:358] "Generic (PLEG): container finished" podID="3a741811-1651-45b8-97c3-04e9be19b874" containerID="f288bcf775d60ef3a06e5ea0ed1d6c2288f87005ce1590504b8325a639970dc8" exitCode=0 Apr 17 08:03:19.709772 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:19.709139 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s742h" event={"ID":"3a741811-1651-45b8-97c3-04e9be19b874","Type":"ContainerDied","Data":"f288bcf775d60ef3a06e5ea0ed1d6c2288f87005ce1590504b8325a639970dc8"} Apr 17 08:03:20.715168 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:20.715110 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s742h" event={"ID":"3a741811-1651-45b8-97c3-04e9be19b874","Type":"ContainerStarted","Data":"7f54efc37555194cc48d02b83dd474571bba129c04a1c568fd9a2181645b0c6e"} Apr 17 08:03:20.734918 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:20.734870 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-s742h" podStartSLOduration=6.123216953 podStartE2EDuration="38.734852113s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:02:45.06478016 +0000 UTC m=+3.151562834" lastFinishedPulling="2026-04-17 08:03:17.676415315 +0000 UTC m=+35.763197994" observedRunningTime="2026-04-17 08:03:20.733691077 
+0000 UTC m=+38.820473764" watchObservedRunningTime="2026-04-17 08:03:20.734852113 +0000 UTC m=+38.821634798" Apr 17 08:03:21.718179 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:21.718143 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-52zv7" event={"ID":"d7bf9c6a-1777-45f6-9f27-2ee3d09959d1","Type":"ContainerStarted","Data":"6cf8b6df14d9deaa9ffd311b16700bd2d7a9faadd70dee4c111631e82d5e7887"} Apr 17 08:03:21.718558 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:21.718293 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:21.731678 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:21.731631 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-52zv7" podStartSLOduration=36.206844316 podStartE2EDuration="39.731614836s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:03:17.500548939 +0000 UTC m=+35.587331606" lastFinishedPulling="2026-04-17 08:03:21.025319459 +0000 UTC m=+39.112102126" observedRunningTime="2026-04-17 08:03:21.730982999 +0000 UTC m=+39.817765696" watchObservedRunningTime="2026-04-17 08:03:21.731614836 +0000 UTC m=+39.818397523" Apr 17 08:03:23.088263 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:23.088227 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:23.088642 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:23.088375 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:23.088642 ip-10-0-138-54 kubenswrapper[2583]: E0417 
08:03:23.088443 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls podName:d198e1b7-2f54-4936-a0be-c6a8e9e20ea7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:31.088426373 +0000 UTC m=+49.175209037 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls") pod "dns-default-65v7n" (UID: "d198e1b7-2f54-4936-a0be-c6a8e9e20ea7") : secret "dns-default-metrics-tls" not found Apr 17 08:03:23.189084 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:23.189015 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:23.189265 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:23.189163 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:23.189265 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:23.189229 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert podName:b4ebbaa6-1054-4eaf-87ba-d84dc8af620f nodeName:}" failed. No retries permitted until 2026-04-17 08:03:31.189214451 +0000 UTC m=+49.275997114 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert") pod "ingress-canary-m7zgj" (UID: "b4ebbaa6-1054-4eaf-87ba-d84dc8af620f") : secret "canary-serving-cert" not found Apr 17 08:03:31.145301 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:31.145264 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:31.145750 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:31.145387 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:31.145750 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:31.145439 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls podName:d198e1b7-2f54-4936-a0be-c6a8e9e20ea7 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:47.145426296 +0000 UTC m=+65.232208959 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls") pod "dns-default-65v7n" (UID: "d198e1b7-2f54-4936-a0be-c6a8e9e20ea7") : secret "dns-default-metrics-tls" not found Apr 17 08:03:31.245644 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:31.245601 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:31.245794 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:31.245746 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:31.245831 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:31.245815 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert podName:b4ebbaa6-1054-4eaf-87ba-d84dc8af620f nodeName:}" failed. No retries permitted until 2026-04-17 08:03:47.245798392 +0000 UTC m=+65.332581071 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert") pod "ingress-canary-m7zgj" (UID: "b4ebbaa6-1054-4eaf-87ba-d84dc8af620f") : secret "canary-serving-cert" not found Apr 17 08:03:40.701468 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:40.701439 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qrq7z" Apr 17 08:03:47.158418 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:47.158362 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:03:47.158881 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:47.158516 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:47.158881 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:47.158585 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls podName:d198e1b7-2f54-4936-a0be-c6a8e9e20ea7 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:19.158568456 +0000 UTC m=+97.245351124 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls") pod "dns-default-65v7n" (UID: "d198e1b7-2f54-4936-a0be-c6a8e9e20ea7") : secret "dns-default-metrics-tls" not found Apr 17 08:03:47.259630 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:47.259594 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:03:47.259812 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:47.259763 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:47.259879 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:47.259839 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert podName:b4ebbaa6-1054-4eaf-87ba-d84dc8af620f nodeName:}" failed. No retries permitted until 2026-04-17 08:04:19.259818193 +0000 UTC m=+97.346600866 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert") pod "ingress-canary-m7zgj" (UID: "b4ebbaa6-1054-4eaf-87ba-d84dc8af620f") : secret "canary-serving-cert" not found Apr 17 08:03:48.164542 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:48.164494 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:03:48.164933 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:48.164642 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 08:03:48.164933 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:03:48.164699 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs podName:a358126d-f138-41e0-b8a2-598652e544f5 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:52.164684246 +0000 UTC m=+130.251466913 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs") pod "network-metrics-daemon-5zj7l" (UID: "a358126d-f138-41e0-b8a2-598652e544f5") : secret "metrics-daemon-secret" not found Apr 17 08:03:52.722565 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:52.722537 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-52zv7" Apr 17 08:03:53.127939 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.127847 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j"] Apr 17 08:03:53.133994 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.133963 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j" Apr 17 08:03:53.135924 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.135900 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 08:03:53.136414 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.136394 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-5td8h\"" Apr 17 08:03:53.136414 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.136408 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 08:03:53.136500 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.136392 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 08:03:53.136828 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.136814 
2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 08:03:53.142055 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.142019 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j"] Apr 17 08:03:53.198431 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.198387 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22b5l\" (UniqueName: \"kubernetes.io/projected/660d5a46-3175-42d1-822d-42b88105938b-kube-api-access-22b5l\") pod \"managed-serviceaccount-addon-agent-65fc6b848c-8rl4j\" (UID: \"660d5a46-3175-42d1-822d-42b88105938b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j" Apr 17 08:03:53.198605 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.198465 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/660d5a46-3175-42d1-822d-42b88105938b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-65fc6b848c-8rl4j\" (UID: \"660d5a46-3175-42d1-822d-42b88105938b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j" Apr 17 08:03:53.298888 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.298852 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22b5l\" (UniqueName: \"kubernetes.io/projected/660d5a46-3175-42d1-822d-42b88105938b-kube-api-access-22b5l\") pod \"managed-serviceaccount-addon-agent-65fc6b848c-8rl4j\" (UID: \"660d5a46-3175-42d1-822d-42b88105938b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j" Apr 17 08:03:53.299069 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.298905 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/660d5a46-3175-42d1-822d-42b88105938b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-65fc6b848c-8rl4j\" (UID: \"660d5a46-3175-42d1-822d-42b88105938b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j" Apr 17 08:03:53.302569 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.302548 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/660d5a46-3175-42d1-822d-42b88105938b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-65fc6b848c-8rl4j\" (UID: \"660d5a46-3175-42d1-822d-42b88105938b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j" Apr 17 08:03:53.307346 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.307321 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22b5l\" (UniqueName: \"kubernetes.io/projected/660d5a46-3175-42d1-822d-42b88105938b-kube-api-access-22b5l\") pod \"managed-serviceaccount-addon-agent-65fc6b848c-8rl4j\" (UID: \"660d5a46-3175-42d1-822d-42b88105938b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j" Apr 17 08:03:53.456790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.456763 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j" Apr 17 08:03:53.588907 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.588866 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j"] Apr 17 08:03:53.594073 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:03:53.594027 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod660d5a46_3175_42d1_822d_42b88105938b.slice/crio-146c771775cba1e3c8fdbd0c01596197ba5f9ded45e05cc35b2dc3b756bd63e1 WatchSource:0}: Error finding container 146c771775cba1e3c8fdbd0c01596197ba5f9ded45e05cc35b2dc3b756bd63e1: Status 404 returned error can't find the container with id 146c771775cba1e3c8fdbd0c01596197ba5f9ded45e05cc35b2dc3b756bd63e1 Apr 17 08:03:53.777793 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:53.777706 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j" event={"ID":"660d5a46-3175-42d1-822d-42b88105938b","Type":"ContainerStarted","Data":"146c771775cba1e3c8fdbd0c01596197ba5f9ded45e05cc35b2dc3b756bd63e1"} Apr 17 08:03:56.785697 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:56.785657 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j" event={"ID":"660d5a46-3175-42d1-822d-42b88105938b","Type":"ContainerStarted","Data":"3cbf6d58eeb3aa2790705fa1c08ba2b6d90c02bf1d87011d2a403a634ff20d79"} Apr 17 08:03:56.800880 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:03:56.800832 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65fc6b848c-8rl4j" podStartSLOduration=1.302206226 podStartE2EDuration="3.800816335s" 
podCreationTimestamp="2026-04-17 08:03:53 +0000 UTC" firstStartedPulling="2026-04-17 08:03:53.596440354 +0000 UTC m=+71.683223017" lastFinishedPulling="2026-04-17 08:03:56.095050457 +0000 UTC m=+74.181833126" observedRunningTime="2026-04-17 08:03:56.799893515 +0000 UTC m=+74.886676200" watchObservedRunningTime="2026-04-17 08:03:56.800816335 +0000 UTC m=+74.887599020"
Apr 17 08:04:19.182165 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:19.182116 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n"
Apr 17 08:04:19.182492 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:19.182258 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 08:04:19.182492 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:19.182313 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls podName:d198e1b7-2f54-4936-a0be-c6a8e9e20ea7 nodeName:}" failed. No retries permitted until 2026-04-17 08:05:23.182298687 +0000 UTC m=+161.269081350 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls") pod "dns-default-65v7n" (UID: "d198e1b7-2f54-4936-a0be-c6a8e9e20ea7") : secret "dns-default-metrics-tls" not found
Apr 17 08:04:19.283538 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:19.283440 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj"
Apr 17 08:04:19.283693 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:19.283565 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 08:04:19.283693 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:19.283627 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert podName:b4ebbaa6-1054-4eaf-87ba-d84dc8af620f nodeName:}" failed. No retries permitted until 2026-04-17 08:05:23.28361254 +0000 UTC m=+161.370395203 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert") pod "ingress-canary-m7zgj" (UID: "b4ebbaa6-1054-4eaf-87ba-d84dc8af620f") : secret "canary-serving-cert" not found
Apr 17 08:04:20.205252 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.205213 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7c6d4f57f6-pj78j"]
Apr 17 08:04:20.209644 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.209621 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.211556 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.211532 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 08:04:20.211670 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.211534 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 08:04:20.211718 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.211701 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 08:04:20.211754 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.211699 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 08:04:20.211754 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.211747 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-lxgpx\""
Apr 17 08:04:20.212102 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.212082 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 08:04:20.212102 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.212094 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 08:04:20.218929 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.218908 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7c6d4f57f6-pj78j"]
Apr 17 08:04:20.290674 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.290636 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.290674 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.290679 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-default-certificate\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.290905 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.290698 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.290905 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.290741 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhhfl\" (UniqueName: \"kubernetes.io/projected/06a60109-bb64-4cd2-9f4e-2987a8942aad-kube-api-access-vhhfl\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.290905 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.290789 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-stats-auth\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.391373 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.391320 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.391373 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.391372 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-default-certificate\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.391373 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.391393 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.391695 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.391421 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhhfl\" (UniqueName: \"kubernetes.io/projected/06a60109-bb64-4cd2-9f4e-2987a8942aad-kube-api-access-vhhfl\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.391695 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:20.391504 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 08:04:20.391695 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:20.391520 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:20.891501533 +0000 UTC m=+98.978284234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : configmap references non-existent config key: service-ca.crt
Apr 17 08:04:20.391695 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:20.391575 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:20.891558167 +0000 UTC m=+98.978340854 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : secret "router-metrics-certs-default" not found
Apr 17 08:04:20.391695 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.391636 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-stats-auth\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.393868 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.393842 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-default-certificate\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.393963 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.393877 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-stats-auth\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.399122 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.399103 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhhfl\" (UniqueName: \"kubernetes.io/projected/06a60109-bb64-4cd2-9f4e-2987a8942aad-kube-api-access-vhhfl\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.895687 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.895637 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.895687 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:20.895690 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:20.895910 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:20.895789 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 08:04:20.895910 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:20.895814 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:21.895794264 +0000 UTC m=+99.982576936 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : configmap references non-existent config key: service-ca.crt
Apr 17 08:04:20.895910 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:20.895837 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:21.895826427 +0000 UTC m=+99.982609093 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : secret "router-metrics-certs-default" not found
Apr 17 08:04:21.902918 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:21.902883 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:21.902918 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:21.902921 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:21.903345 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:21.903026 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 08:04:21.903345 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:21.903106 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:23.903085208 +0000 UTC m=+101.989867876 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : secret "router-metrics-certs-default" not found
Apr 17 08:04:21.903345 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:21.903121 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:23.903115 +0000 UTC m=+101.989897664 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : configmap references non-existent config key: service-ca.crt
Apr 17 08:04:23.916709 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:23.916658 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:23.916709 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:23.916708 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:23.917225 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:23.916818 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:27.916799552 +0000 UTC m=+106.003582216 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : configmap references non-existent config key: service-ca.crt
Apr 17 08:04:23.917225 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:23.916822 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 08:04:23.917225 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:23.916856 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:27.916849645 +0000 UTC m=+106.003632308 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : secret "router-metrics-certs-default" not found
Apr 17 08:04:26.665085 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:26.665031 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q5zfz_ad1b5fde-cf28-4a54-bb33-cb43f425421e/dns-node-resolver/0.log"
Apr 17 08:04:27.465490 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:27.465465 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pk77j_5f3f7f47-4941-40cd-88d3-259605376e0e/node-ca/0.log"
Apr 17 08:04:27.945326 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:27.945293 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:27.945326 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:27.945330 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j"
Apr 17 08:04:27.945717 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:27.945413 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 08:04:27.945717 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:27.945461 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:35.945442889 +0000 UTC m=+114.032225557 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : configmap references non-existent config key: service-ca.crt
Apr 17 08:04:27.945717 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:27.945484 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:35.945476919 +0000 UTC m=+114.032259581 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : secret "router-metrics-certs-default" not found
Apr 17 08:04:30.308965 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.308921 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"]
Apr 17 08:04:30.314540 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.314518 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mnhd4"]
Apr 17 08:04:30.314677 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.314664 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"
Apr 17 08:04:30.317672 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.317649 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 08:04:30.318082 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.317694 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 08:04:30.318082 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.317714 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-trtz6\""
Apr 17 08:04:30.318082 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.317670 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 08:04:30.318082 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.317971 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 08:04:30.318356 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.318340 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.320028 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.320008 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 17 08:04:30.320228 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.320211 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 08:04:30.320312 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.320250 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-vc7pb\""
Apr 17 08:04:30.320371 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.320343 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 17 08:04:30.320459 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.320443 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 08:04:30.328836 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.328812 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 17 08:04:30.331522 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.331496 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"]
Apr 17 08:04:30.332374 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.332352 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mnhd4"]
Apr 17 08:04:30.362538 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.362506 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlqqc\" (UniqueName: \"kubernetes.io/projected/2271c85d-959b-40a9-aaca-4f7851b44b73-kube-api-access-zlqqc\") pod \"kube-storage-version-migrator-operator-6769c5d45-fx7kf\" (UID: \"2271c85d-959b-40a9-aaca-4f7851b44b73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"
Apr 17 08:04:30.362689 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.362541 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7881d774-cc8e-4aa7-852b-0ef081cb8318-serving-cert\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.362689 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.362567 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxc5c\" (UniqueName: \"kubernetes.io/projected/7881d774-cc8e-4aa7-852b-0ef081cb8318-kube-api-access-kxc5c\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.362689 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.362623 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2271c85d-959b-40a9-aaca-4f7851b44b73-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fx7kf\" (UID: \"2271c85d-959b-40a9-aaca-4f7851b44b73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"
Apr 17 08:04:30.362689 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.362687 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7881d774-cc8e-4aa7-852b-0ef081cb8318-trusted-ca\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.362842 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.362708 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2271c85d-959b-40a9-aaca-4f7851b44b73-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fx7kf\" (UID: \"2271c85d-959b-40a9-aaca-4f7851b44b73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"
Apr 17 08:04:30.362842 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.362734 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7881d774-cc8e-4aa7-852b-0ef081cb8318-config\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.463057 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.463003 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlqqc\" (UniqueName: \"kubernetes.io/projected/2271c85d-959b-40a9-aaca-4f7851b44b73-kube-api-access-zlqqc\") pod \"kube-storage-version-migrator-operator-6769c5d45-fx7kf\" (UID: \"2271c85d-959b-40a9-aaca-4f7851b44b73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"
Apr 17 08:04:30.463057 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.463053 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7881d774-cc8e-4aa7-852b-0ef081cb8318-serving-cert\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.463310 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.463072 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxc5c\" (UniqueName: \"kubernetes.io/projected/7881d774-cc8e-4aa7-852b-0ef081cb8318-kube-api-access-kxc5c\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.463310 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.463099 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2271c85d-959b-40a9-aaca-4f7851b44b73-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fx7kf\" (UID: \"2271c85d-959b-40a9-aaca-4f7851b44b73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"
Apr 17 08:04:30.463310 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.463185 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7881d774-cc8e-4aa7-852b-0ef081cb8318-trusted-ca\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.463310 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.463221 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2271c85d-959b-40a9-aaca-4f7851b44b73-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fx7kf\" (UID: \"2271c85d-959b-40a9-aaca-4f7851b44b73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"
Apr 17 08:04:30.463310 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.463250 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7881d774-cc8e-4aa7-852b-0ef081cb8318-config\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.463786 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.463758 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2271c85d-959b-40a9-aaca-4f7851b44b73-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fx7kf\" (UID: \"2271c85d-959b-40a9-aaca-4f7851b44b73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"
Apr 17 08:04:30.463944 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.463928 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7881d774-cc8e-4aa7-852b-0ef081cb8318-config\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.464026 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.463979 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7881d774-cc8e-4aa7-852b-0ef081cb8318-trusted-ca\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.465393 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.465365 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2271c85d-959b-40a9-aaca-4f7851b44b73-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fx7kf\" (UID: \"2271c85d-959b-40a9-aaca-4f7851b44b73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"
Apr 17 08:04:30.465581 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.465562 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7881d774-cc8e-4aa7-852b-0ef081cb8318-serving-cert\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.473268 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.473244 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlqqc\" (UniqueName: \"kubernetes.io/projected/2271c85d-959b-40a9-aaca-4f7851b44b73-kube-api-access-zlqqc\") pod \"kube-storage-version-migrator-operator-6769c5d45-fx7kf\" (UID: \"2271c85d-959b-40a9-aaca-4f7851b44b73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"
Apr 17 08:04:30.473268 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.473262 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxc5c\" (UniqueName: \"kubernetes.io/projected/7881d774-cc8e-4aa7-852b-0ef081cb8318-kube-api-access-kxc5c\") pod \"console-operator-9d4b6777b-mnhd4\" (UID: \"7881d774-cc8e-4aa7-852b-0ef081cb8318\") " pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.628086 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.627989 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"
Apr 17 08:04:30.633735 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.633708 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:30.746283 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.746252 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf"]
Apr 17 08:04:30.749737 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:04:30.749710 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2271c85d_959b_40a9_aaca_4f7851b44b73.slice/crio-7f5766453682298e35de9f925a5633812050189a72b2b1fc787e741c6895877d WatchSource:0}: Error finding container 7f5766453682298e35de9f925a5633812050189a72b2b1fc787e741c6895877d: Status 404 returned error can't find the container with id 7f5766453682298e35de9f925a5633812050189a72b2b1fc787e741c6895877d
Apr 17 08:04:30.762031 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.762008 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mnhd4"]
Apr 17 08:04:30.773238 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:04:30.773216 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7881d774_cc8e_4aa7_852b_0ef081cb8318.slice/crio-57f37ba615a7b29f3bc3bc578e719844844533ff3c15d07c4029d44dcacea234 WatchSource:0}: Error finding container 57f37ba615a7b29f3bc3bc578e719844844533ff3c15d07c4029d44dcacea234: Status 404 returned error can't find the container with id 57f37ba615a7b29f3bc3bc578e719844844533ff3c15d07c4029d44dcacea234
Apr 17 08:04:30.850330 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.850287 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" event={"ID":"7881d774-cc8e-4aa7-852b-0ef081cb8318","Type":"ContainerStarted","Data":"57f37ba615a7b29f3bc3bc578e719844844533ff3c15d07c4029d44dcacea234"}
Apr 17 08:04:30.851132 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:30.851105 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf" event={"ID":"2271c85d-959b-40a9-aaca-4f7851b44b73","Type":"ContainerStarted","Data":"7f5766453682298e35de9f925a5633812050189a72b2b1fc787e741c6895877d"}
Apr 17 08:04:32.856732 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:32.856700 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" event={"ID":"7881d774-cc8e-4aa7-852b-0ef081cb8318","Type":"ContainerStarted","Data":"0bb29cd916cbd3f1628fbdefea342b42f0235c3a17e792a26a2b23da63c70931"}
Apr 17 08:04:32.857153 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:32.856926 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4"
Apr 17 08:04:32.858174 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:32.858148 2583 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-mnhd4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.134.0.10:8443/readyz\": dial tcp 10.134.0.10:8443: connect: connection refused" start-of-body=
Apr 17 08:04:32.858294 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:32.858199 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" podUID="7881d774-cc8e-4aa7-852b-0ef081cb8318" containerName="console-operator" probeResult="failure" output="Get \"https://10.134.0.10:8443/readyz\": dial tcp 10.134.0.10:8443: connect: connection refused"
Apr 17 08:04:32.870840 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:32.870795 2583 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" podStartSLOduration=0.910503412 podStartE2EDuration="2.870781546s" podCreationTimestamp="2026-04-17 08:04:30 +0000 UTC" firstStartedPulling="2026-04-17 08:04:30.774855049 +0000 UTC m=+108.861637715" lastFinishedPulling="2026-04-17 08:04:32.73513317 +0000 UTC m=+110.821915849" observedRunningTime="2026-04-17 08:04:32.870002714 +0000 UTC m=+110.956785399" watchObservedRunningTime="2026-04-17 08:04:32.870781546 +0000 UTC m=+110.957564230" Apr 17 08:04:33.860016 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:33.859986 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/0.log" Apr 17 08:04:33.860547 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:33.860024 2583 generic.go:358] "Generic (PLEG): container finished" podID="7881d774-cc8e-4aa7-852b-0ef081cb8318" containerID="0bb29cd916cbd3f1628fbdefea342b42f0235c3a17e792a26a2b23da63c70931" exitCode=255 Apr 17 08:04:33.860547 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:33.860072 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" event={"ID":"7881d774-cc8e-4aa7-852b-0ef081cb8318","Type":"ContainerDied","Data":"0bb29cd916cbd3f1628fbdefea342b42f0235c3a17e792a26a2b23da63c70931"} Apr 17 08:04:33.860547 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:33.860404 2583 scope.go:117] "RemoveContainer" containerID="0bb29cd916cbd3f1628fbdefea342b42f0235c3a17e792a26a2b23da63c70931" Apr 17 08:04:33.861445 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:33.861422 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf" 
event={"ID":"2271c85d-959b-40a9-aaca-4f7851b44b73","Type":"ContainerStarted","Data":"3ebae85485bb665654de44527998d15b02c413eded06c807a0e9cfca6fc3e1b8"} Apr 17 08:04:33.886396 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:33.886354 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf" podStartSLOduration=1.283000326 podStartE2EDuration="3.886339456s" podCreationTimestamp="2026-04-17 08:04:30 +0000 UTC" firstStartedPulling="2026-04-17 08:04:30.751741409 +0000 UTC m=+108.838524075" lastFinishedPulling="2026-04-17 08:04:33.355080525 +0000 UTC m=+111.441863205" observedRunningTime="2026-04-17 08:04:33.885380144 +0000 UTC m=+111.972162825" watchObservedRunningTime="2026-04-17 08:04:33.886339456 +0000 UTC m=+111.973122151" Apr 17 08:04:34.617394 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.617357 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-78fcbc987f-fgm4w"] Apr 17 08:04:34.620272 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.620255 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.621989 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.621968 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 08:04:34.622136 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.622115 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 08:04:34.622217 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.622202 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5m5pl\"" Apr 17 08:04:34.622397 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.622381 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 08:04:34.626672 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.626596 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 08:04:34.632034 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.632012 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-78fcbc987f-fgm4w"] Apr 17 08:04:34.694330 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.694294 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-trusted-ca\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.694574 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.694556 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-installation-pull-secrets\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.694695 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.694681 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-545sm\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-kube-api-access-545sm\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.694860 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.694845 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.694964 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.694951 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86dddb50-317b-40e7-897f-3215f2c8b269-ca-trust-extracted\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.695117 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.695102 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-registry-certificates\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: 
\"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.695246 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.695229 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-image-registry-private-configuration\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.695338 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.695325 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-bound-sa-token\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.796295 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.796254 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-image-registry-private-configuration\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.796295 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.796297 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-bound-sa-token\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.796526 ip-10-0-138-54 kubenswrapper[2583]: 
I0417 08:04:34.796353 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-trusted-ca\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.796526 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.796376 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-installation-pull-secrets\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.796526 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.796401 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-545sm\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-kube-api-access-545sm\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.796526 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.796474 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.796526 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.796502 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86dddb50-317b-40e7-897f-3215f2c8b269-ca-trust-extracted\") pod \"image-registry-78fcbc987f-fgm4w\" 
(UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.796797 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.796545 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-registry-certificates\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.796871 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:34.796849 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 08:04:34.796934 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:34.796875 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78fcbc987f-fgm4w: secret "image-registry-tls" not found Apr 17 08:04:34.796934 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.796894 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86dddb50-317b-40e7-897f-3215f2c8b269-ca-trust-extracted\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.797066 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:34.796971 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls podName:86dddb50-317b-40e7-897f-3215f2c8b269 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:35.29695185 +0000 UTC m=+113.383734722 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls") pod "image-registry-78fcbc987f-fgm4w" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269") : secret "image-registry-tls" not found Apr 17 08:04:34.797190 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.797163 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-registry-certificates\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.798057 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.798021 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-trusted-ca\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.799300 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.799274 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-image-registry-private-configuration\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.799412 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.799361 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-installation-pull-secrets\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " 
pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.808443 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.808420 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-bound-sa-token\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.808544 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.808448 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-545sm\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-kube-api-access-545sm\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:34.864991 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.864964 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/1.log" Apr 17 08:04:34.865450 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.865365 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/0.log" Apr 17 08:04:34.865450 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.865407 2583 generic.go:358] "Generic (PLEG): container finished" podID="7881d774-cc8e-4aa7-852b-0ef081cb8318" containerID="6bd44d5eda46691fc31438fc064e27c09830f1fea65be1b573cd481ba810454e" exitCode=255 Apr 17 08:04:34.865553 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.865494 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" 
event={"ID":"7881d774-cc8e-4aa7-852b-0ef081cb8318","Type":"ContainerDied","Data":"6bd44d5eda46691fc31438fc064e27c09830f1fea65be1b573cd481ba810454e"} Apr 17 08:04:34.865553 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.865533 2583 scope.go:117] "RemoveContainer" containerID="0bb29cd916cbd3f1628fbdefea342b42f0235c3a17e792a26a2b23da63c70931" Apr 17 08:04:34.865774 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:34.865758 2583 scope.go:117] "RemoveContainer" containerID="6bd44d5eda46691fc31438fc064e27c09830f1fea65be1b573cd481ba810454e" Apr 17 08:04:34.865989 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:34.865970 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mnhd4_openshift-console-operator(7881d774-cc8e-4aa7-852b-0ef081cb8318)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" podUID="7881d774-cc8e-4aa7-852b-0ef081cb8318" Apr 17 08:04:35.301980 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:35.301945 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:35.302177 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:35.302108 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 08:04:35.302177 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:35.302127 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78fcbc987f-fgm4w: secret "image-registry-tls" not found Apr 17 08:04:35.302269 ip-10-0-138-54 kubenswrapper[2583]: E0417 
08:04:35.302181 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls podName:86dddb50-317b-40e7-897f-3215f2c8b269 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:36.30216703 +0000 UTC m=+114.388949692 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls") pod "image-registry-78fcbc987f-fgm4w" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269") : secret "image-registry-tls" not found Apr 17 08:04:35.868258 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:35.868231 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/1.log" Apr 17 08:04:35.868686 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:35.868545 2583 scope.go:117] "RemoveContainer" containerID="6bd44d5eda46691fc31438fc064e27c09830f1fea65be1b573cd481ba810454e" Apr 17 08:04:35.868743 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:35.868721 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mnhd4_openshift-console-operator(7881d774-cc8e-4aa7-852b-0ef081cb8318)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" podUID="7881d774-cc8e-4aa7-852b-0ef081cb8318" Apr 17 08:04:36.006937 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:36.006888 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" Apr 17 
08:04:36.007162 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:36.006946 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" Apr 17 08:04:36.007162 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:36.007016 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:52.006995957 +0000 UTC m=+130.093778643 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : configmap references non-existent config key: service-ca.crt Apr 17 08:04:36.007162 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:36.007068 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 08:04:36.007162 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:36.007114 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs podName:06a60109-bb64-4cd2-9f4e-2987a8942aad nodeName:}" failed. No retries permitted until 2026-04-17 08:04:52.007100387 +0000 UTC m=+130.093883075 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs") pod "router-default-7c6d4f57f6-pj78j" (UID: "06a60109-bb64-4cd2-9f4e-2987a8942aad") : secret "router-metrics-certs-default" not found Apr 17 08:04:36.309227 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:36.309176 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:36.309442 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:36.309346 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 08:04:36.309442 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:36.309368 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78fcbc987f-fgm4w: secret "image-registry-tls" not found Apr 17 08:04:36.309545 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:36.309450 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls podName:86dddb50-317b-40e7-897f-3215f2c8b269 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:38.309429568 +0000 UTC m=+116.396212248 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls") pod "image-registry-78fcbc987f-fgm4w" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269") : secret "image-registry-tls" not found Apr 17 08:04:37.049665 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.049631 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xbqhk"] Apr 17 08:04:37.053777 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.053755 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xbqhk" Apr 17 08:04:37.055622 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.055598 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-nbhff\"" Apr 17 08:04:37.061491 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.061468 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xbqhk"] Apr 17 08:04:37.116870 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.116835 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9bfr\" (UniqueName: \"kubernetes.io/projected/670156cd-215c-4009-94ea-07e2bf7f784c-kube-api-access-q9bfr\") pod \"network-check-source-8894fc9bd-xbqhk\" (UID: \"670156cd-215c-4009-94ea-07e2bf7f784c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xbqhk" Apr 17 08:04:37.217271 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.217235 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9bfr\" (UniqueName: \"kubernetes.io/projected/670156cd-215c-4009-94ea-07e2bf7f784c-kube-api-access-q9bfr\") pod \"network-check-source-8894fc9bd-xbqhk\" (UID: 
\"670156cd-215c-4009-94ea-07e2bf7f784c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xbqhk" Apr 17 08:04:37.224654 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.224625 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9bfr\" (UniqueName: \"kubernetes.io/projected/670156cd-215c-4009-94ea-07e2bf7f784c-kube-api-access-q9bfr\") pod \"network-check-source-8894fc9bd-xbqhk\" (UID: \"670156cd-215c-4009-94ea-07e2bf7f784c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xbqhk" Apr 17 08:04:37.284147 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.284110 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nxg4g"] Apr 17 08:04:37.286884 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.286869 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nxg4g" Apr 17 08:04:37.289143 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.289117 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-ns4gb\"" Apr 17 08:04:37.289271 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.289178 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 08:04:37.289271 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.289203 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 08:04:37.289374 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.289272 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 08:04:37.289522 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.289510 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 08:04:37.295811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.295774 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nxg4g"] Apr 17 08:04:37.363360 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.363272 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xbqhk" Apr 17 08:04:37.419127 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.419097 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54s66\" (UniqueName: \"kubernetes.io/projected/42567f9e-9c19-445e-9f5e-fea94d754bb7-kube-api-access-54s66\") pod \"service-ca-865cb79987-nxg4g\" (UID: \"42567f9e-9c19-445e-9f5e-fea94d754bb7\") " pod="openshift-service-ca/service-ca-865cb79987-nxg4g" Apr 17 08:04:37.419262 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.419241 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42567f9e-9c19-445e-9f5e-fea94d754bb7-signing-cabundle\") pod \"service-ca-865cb79987-nxg4g\" (UID: \"42567f9e-9c19-445e-9f5e-fea94d754bb7\") " pod="openshift-service-ca/service-ca-865cb79987-nxg4g" Apr 17 08:04:37.419316 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.419281 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42567f9e-9c19-445e-9f5e-fea94d754bb7-signing-key\") pod \"service-ca-865cb79987-nxg4g\" (UID: \"42567f9e-9c19-445e-9f5e-fea94d754bb7\") " pod="openshift-service-ca/service-ca-865cb79987-nxg4g" Apr 17 08:04:37.475192 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.475159 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xbqhk"] Apr 17 08:04:37.478381 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:04:37.478350 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod670156cd_215c_4009_94ea_07e2bf7f784c.slice/crio-d27c4d90cb3dd6df7721371744db5bac38504a32274a2c80af6ba8953892c9e0 WatchSource:0}: Error finding container d27c4d90cb3dd6df7721371744db5bac38504a32274a2c80af6ba8953892c9e0: Status 404 returned error can't find the container with id d27c4d90cb3dd6df7721371744db5bac38504a32274a2c80af6ba8953892c9e0 Apr 17 08:04:37.519670 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.519642 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42567f9e-9c19-445e-9f5e-fea94d754bb7-signing-cabundle\") pod \"service-ca-865cb79987-nxg4g\" (UID: \"42567f9e-9c19-445e-9f5e-fea94d754bb7\") " pod="openshift-service-ca/service-ca-865cb79987-nxg4g" Apr 17 08:04:37.519820 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.519688 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42567f9e-9c19-445e-9f5e-fea94d754bb7-signing-key\") pod \"service-ca-865cb79987-nxg4g\" (UID: \"42567f9e-9c19-445e-9f5e-fea94d754bb7\") " pod="openshift-service-ca/service-ca-865cb79987-nxg4g" Apr 17 08:04:37.519820 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.519722 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54s66\" (UniqueName: \"kubernetes.io/projected/42567f9e-9c19-445e-9f5e-fea94d754bb7-kube-api-access-54s66\") pod \"service-ca-865cb79987-nxg4g\" (UID: \"42567f9e-9c19-445e-9f5e-fea94d754bb7\") " pod="openshift-service-ca/service-ca-865cb79987-nxg4g" Apr 17 08:04:37.520267 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.520246 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42567f9e-9c19-445e-9f5e-fea94d754bb7-signing-cabundle\") pod \"service-ca-865cb79987-nxg4g\" (UID: \"42567f9e-9c19-445e-9f5e-fea94d754bb7\") " pod="openshift-service-ca/service-ca-865cb79987-nxg4g" Apr 17 08:04:37.521865 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.521846 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42567f9e-9c19-445e-9f5e-fea94d754bb7-signing-key\") pod \"service-ca-865cb79987-nxg4g\" (UID: \"42567f9e-9c19-445e-9f5e-fea94d754bb7\") " pod="openshift-service-ca/service-ca-865cb79987-nxg4g" Apr 17 08:04:37.526374 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.526349 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54s66\" (UniqueName: \"kubernetes.io/projected/42567f9e-9c19-445e-9f5e-fea94d754bb7-kube-api-access-54s66\") pod \"service-ca-865cb79987-nxg4g\" (UID: \"42567f9e-9c19-445e-9f5e-fea94d754bb7\") " pod="openshift-service-ca/service-ca-865cb79987-nxg4g" Apr 17 08:04:37.596533 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.596499 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nxg4g" Apr 17 08:04:37.714928 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.714891 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nxg4g"] Apr 17 08:04:37.718860 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:04:37.718834 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42567f9e_9c19_445e_9f5e_fea94d754bb7.slice/crio-a7e95831a5c00ef4e13c648fa864bb4fade6d503577c42501a460bb90618ec0f WatchSource:0}: Error finding container a7e95831a5c00ef4e13c648fa864bb4fade6d503577c42501a460bb90618ec0f: Status 404 returned error can't find the container with id a7e95831a5c00ef4e13c648fa864bb4fade6d503577c42501a460bb90618ec0f Apr 17 08:04:37.875878 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.875839 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nxg4g" event={"ID":"42567f9e-9c19-445e-9f5e-fea94d754bb7","Type":"ContainerStarted","Data":"a7e95831a5c00ef4e13c648fa864bb4fade6d503577c42501a460bb90618ec0f"} Apr 17 08:04:37.877163 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.877141 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xbqhk" event={"ID":"670156cd-215c-4009-94ea-07e2bf7f784c","Type":"ContainerStarted","Data":"6f33c5b0544edadc58f78e321f65eca074deb5c4ed3863a56c71d172e27c24c3"} Apr 17 08:04:37.877292 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.877169 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xbqhk" event={"ID":"670156cd-215c-4009-94ea-07e2bf7f784c","Type":"ContainerStarted","Data":"d27c4d90cb3dd6df7721371744db5bac38504a32274a2c80af6ba8953892c9e0"} Apr 17 08:04:37.893335 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:37.893285 2583 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xbqhk" podStartSLOduration=0.893268557 podStartE2EDuration="893.268557ms" podCreationTimestamp="2026-04-17 08:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:04:37.891728436 +0000 UTC m=+115.978511121" watchObservedRunningTime="2026-04-17 08:04:37.893268557 +0000 UTC m=+115.980051243" Apr 17 08:04:38.325649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:38.325611 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:38.326141 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:38.325769 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 08:04:38.326141 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:38.325789 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78fcbc987f-fgm4w: secret "image-registry-tls" not found Apr 17 08:04:38.326141 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:38.325861 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls podName:86dddb50-317b-40e7-897f-3215f2c8b269 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:42.325841688 +0000 UTC m=+120.412624368 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls") pod "image-registry-78fcbc987f-fgm4w" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269") : secret "image-registry-tls" not found Apr 17 08:04:39.882992 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:39.882956 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nxg4g" event={"ID":"42567f9e-9c19-445e-9f5e-fea94d754bb7","Type":"ContainerStarted","Data":"bb74e27b0e82bfde8250e339ec6c7789a3a5009a24958b68bb8f48939e5a3aa7"} Apr 17 08:04:39.896408 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:39.896322 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-nxg4g" podStartSLOduration=0.895365897 podStartE2EDuration="2.896304618s" podCreationTimestamp="2026-04-17 08:04:37 +0000 UTC" firstStartedPulling="2026-04-17 08:04:37.72070373 +0000 UTC m=+115.807486396" lastFinishedPulling="2026-04-17 08:04:39.721642454 +0000 UTC m=+117.808425117" observedRunningTime="2026-04-17 08:04:39.896263767 +0000 UTC m=+117.983046452" watchObservedRunningTime="2026-04-17 08:04:39.896304618 +0000 UTC m=+117.983087305" Apr 17 08:04:40.634507 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:40.634471 2583 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" Apr 17 08:04:40.634860 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:40.634847 2583 scope.go:117] "RemoveContainer" containerID="6bd44d5eda46691fc31438fc064e27c09830f1fea65be1b573cd481ba810454e" Apr 17 08:04:40.635103 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:40.635061 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-mnhd4_openshift-console-operator(7881d774-cc8e-4aa7-852b-0ef081cb8318)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" podUID="7881d774-cc8e-4aa7-852b-0ef081cb8318" Apr 17 08:04:42.361095 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:42.361060 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:42.361481 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:42.361129 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 08:04:42.361481 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:42.361147 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78fcbc987f-fgm4w: secret "image-registry-tls" not found Apr 17 08:04:42.361481 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:42.361193 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls podName:86dddb50-317b-40e7-897f-3215f2c8b269 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:50.36117913 +0000 UTC m=+128.447961797 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls") pod "image-registry-78fcbc987f-fgm4w" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269") : secret "image-registry-tls" not found Apr 17 08:04:42.857105 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:42.857019 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" Apr 17 08:04:42.857487 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:42.857473 2583 scope.go:117] "RemoveContainer" containerID="6bd44d5eda46691fc31438fc064e27c09830f1fea65be1b573cd481ba810454e" Apr 17 08:04:42.857651 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:42.857635 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mnhd4_openshift-console-operator(7881d774-cc8e-4aa7-852b-0ef081cb8318)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" podUID="7881d774-cc8e-4aa7-852b-0ef081cb8318" Apr 17 08:04:50.427873 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:50.427835 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:50.430296 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:50.430274 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls\") pod \"image-registry-78fcbc987f-fgm4w\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " 
pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:50.530964 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:50.530929 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5m5pl\"" Apr 17 08:04:50.539506 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:50.539483 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:50.655652 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:50.655619 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-78fcbc987f-fgm4w"] Apr 17 08:04:50.658509 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:04:50.658479 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86dddb50_317b_40e7_897f_3215f2c8b269.slice/crio-546aec1c5348dcdbf4824ee0a9c2cf9f7ab33f51f6ce735dd67ee5292e9b28ab WatchSource:0}: Error finding container 546aec1c5348dcdbf4824ee0a9c2cf9f7ab33f51f6ce735dd67ee5292e9b28ab: Status 404 returned error can't find the container with id 546aec1c5348dcdbf4824ee0a9c2cf9f7ab33f51f6ce735dd67ee5292e9b28ab Apr 17 08:04:50.921287 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:50.921249 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" event={"ID":"86dddb50-317b-40e7-897f-3215f2c8b269","Type":"ContainerStarted","Data":"3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e"} Apr 17 08:04:50.921287 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:50.921289 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" event={"ID":"86dddb50-317b-40e7-897f-3215f2c8b269","Type":"ContainerStarted","Data":"546aec1c5348dcdbf4824ee0a9c2cf9f7ab33f51f6ce735dd67ee5292e9b28ab"} Apr 17 08:04:50.921503 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:04:50.921482 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:04:50.937607 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:50.937563 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" podStartSLOduration=16.937548472 podStartE2EDuration="16.937548472s" podCreationTimestamp="2026-04-17 08:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:04:50.937115884 +0000 UTC m=+129.023898573" watchObservedRunningTime="2026-04-17 08:04:50.937548472 +0000 UTC m=+129.024331156" Apr 17 08:04:52.045642 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.045603 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" Apr 17 08:04:52.046136 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.045797 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" Apr 17 08:04:52.046417 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.046397 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a60109-bb64-4cd2-9f4e-2987a8942aad-service-ca-bundle\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " 
pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" Apr 17 08:04:52.048118 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.048096 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06a60109-bb64-4cd2-9f4e-2987a8942aad-metrics-certs\") pod \"router-default-7c6d4f57f6-pj78j\" (UID: \"06a60109-bb64-4cd2-9f4e-2987a8942aad\") " pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" Apr 17 08:04:52.248444 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.248388 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:04:52.250699 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.250664 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a358126d-f138-41e0-b8a2-598652e544f5-metrics-certs\") pod \"network-metrics-daemon-5zj7l\" (UID: \"a358126d-f138-41e0-b8a2-598652e544f5\") " pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:04:52.321012 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.320932 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-lxgpx\"" Apr 17 08:04:52.329623 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.329602 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" Apr 17 08:04:52.429264 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.429238 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x2n8l\"" Apr 17 08:04:52.437644 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.437621 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5zj7l" Apr 17 08:04:52.445996 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.445973 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7c6d4f57f6-pj78j"] Apr 17 08:04:52.449276 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:04:52.449248 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a60109_bb64_4cd2_9f4e_2987a8942aad.slice/crio-d08953b329b785d7b0fc325d362b60d9e3374ac9fb38e4832f253860ae69e3ab WatchSource:0}: Error finding container d08953b329b785d7b0fc325d362b60d9e3374ac9fb38e4832f253860ae69e3ab: Status 404 returned error can't find the container with id d08953b329b785d7b0fc325d362b60d9e3374ac9fb38e4832f253860ae69e3ab Apr 17 08:04:52.561102 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.561075 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5zj7l"] Apr 17 08:04:52.563991 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:04:52.563967 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda358126d_f138_41e0_b8a2_598652e544f5.slice/crio-2d9b6c0a87e61d136453bbd6398db9b56e9b5d87114382d71b2b39b606e06bc5 WatchSource:0}: Error finding container 2d9b6c0a87e61d136453bbd6398db9b56e9b5d87114382d71b2b39b606e06bc5: Status 404 returned error can't find the container with id 
2d9b6c0a87e61d136453bbd6398db9b56e9b5d87114382d71b2b39b606e06bc5 Apr 17 08:04:52.926948 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.926909 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5zj7l" event={"ID":"a358126d-f138-41e0-b8a2-598652e544f5","Type":"ContainerStarted","Data":"2d9b6c0a87e61d136453bbd6398db9b56e9b5d87114382d71b2b39b606e06bc5"} Apr 17 08:04:52.928001 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.927978 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" event={"ID":"06a60109-bb64-4cd2-9f4e-2987a8942aad","Type":"ContainerStarted","Data":"eae722518415399aae285c1807b2ce515f7428ab8c15df79b24abb5776179faa"} Apr 17 08:04:52.928138 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.928010 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" event={"ID":"06a60109-bb64-4cd2-9f4e-2987a8942aad","Type":"ContainerStarted","Data":"d08953b329b785d7b0fc325d362b60d9e3374ac9fb38e4832f253860ae69e3ab"} Apr 17 08:04:52.946599 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:52.946543 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" podStartSLOduration=32.946527727 podStartE2EDuration="32.946527727s" podCreationTimestamp="2026-04-17 08:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:04:52.945774152 +0000 UTC m=+131.032556837" watchObservedRunningTime="2026-04-17 08:04:52.946527727 +0000 UTC m=+131.033310413" Apr 17 08:04:53.330885 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:53.330801 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" Apr 17 08:04:53.333923 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:53.333690 2583 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" Apr 17 08:04:53.932061 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:53.932015 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5zj7l" event={"ID":"a358126d-f138-41e0-b8a2-598652e544f5","Type":"ContainerStarted","Data":"965632aa610175af345aa000736fdc7642e653a798f6fdb51e45ff3c16cf4ea5"} Apr 17 08:04:53.932206 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:53.932072 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5zj7l" event={"ID":"a358126d-f138-41e0-b8a2-598652e544f5","Type":"ContainerStarted","Data":"f08e40ae0a5f8d0ba07573662400cc7cd52be98d34555935c49bb47054592c3a"} Apr 17 08:04:53.932424 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:53.932402 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" Apr 17 08:04:53.933620 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:53.933604 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7c6d4f57f6-pj78j" Apr 17 08:04:53.947174 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:53.947116 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5zj7l" podStartSLOduration=130.922987005 podStartE2EDuration="2m11.947103622s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:04:52.565831342 +0000 UTC m=+130.652614010" lastFinishedPulling="2026-04-17 08:04:53.589947948 +0000 UTC m=+131.676730627" observedRunningTime="2026-04-17 08:04:53.946142379 +0000 UTC m=+132.032925088" watchObservedRunningTime="2026-04-17 08:04:53.947103622 +0000 UTC m=+132.033886306" Apr 17 08:04:55.510144 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:55.510112 2583 scope.go:117] 
"RemoveContainer" containerID="6bd44d5eda46691fc31438fc064e27c09830f1fea65be1b573cd481ba810454e" Apr 17 08:04:55.938649 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:55.938622 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/2.log" Apr 17 08:04:55.938950 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:55.938936 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/1.log" Apr 17 08:04:55.939000 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:55.938971 2583 generic.go:358] "Generic (PLEG): container finished" podID="7881d774-cc8e-4aa7-852b-0ef081cb8318" containerID="3da4dc88d25fea6a542c2d901c6a399289befaeef72ef8fac3b040ace1242a03" exitCode=255 Apr 17 08:04:55.939085 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:55.939068 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" event={"ID":"7881d774-cc8e-4aa7-852b-0ef081cb8318","Type":"ContainerDied","Data":"3da4dc88d25fea6a542c2d901c6a399289befaeef72ef8fac3b040ace1242a03"} Apr 17 08:04:55.939126 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:55.939110 2583 scope.go:117] "RemoveContainer" containerID="6bd44d5eda46691fc31438fc064e27c09830f1fea65be1b573cd481ba810454e" Apr 17 08:04:55.939539 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:55.939522 2583 scope.go:117] "RemoveContainer" containerID="3da4dc88d25fea6a542c2d901c6a399289befaeef72ef8fac3b040ace1242a03" Apr 17 08:04:55.939723 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:04:55.939706 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-mnhd4_openshift-console-operator(7881d774-cc8e-4aa7-852b-0ef081cb8318)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" podUID="7881d774-cc8e-4aa7-852b-0ef081cb8318" Apr 17 08:04:56.942488 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:56.942464 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/2.log" Apr 17 08:04:58.725707 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.725678 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-78fcbc987f-fgm4w"] Apr 17 08:04:58.735764 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.735736 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-thndg"] Apr 17 08:04:58.738620 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.738597 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:58.740708 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.740685 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 08:04:58.740847 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.740743 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 08:04:58.740988 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.740971 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 08:04:58.741265 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.741247 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 08:04:58.741675 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.741660 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5w4p6\"" Apr 17 08:04:58.752364 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.752341 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-thndg"] Apr 17 08:04:58.905215 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.905131 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/865075e4-18ea-4f03-a7e8-c85efbee42bc-data-volume\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:58.905215 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.905182 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/865075e4-18ea-4f03-a7e8-c85efbee42bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:58.905449 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.905239 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwn6m\" (UniqueName: \"kubernetes.io/projected/865075e4-18ea-4f03-a7e8-c85efbee42bc-kube-api-access-zwn6m\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:58.905449 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.905271 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/865075e4-18ea-4f03-a7e8-c85efbee42bc-crio-socket\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:58.905449 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:58.905349 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/865075e4-18ea-4f03-a7e8-c85efbee42bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.006143 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.006106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/865075e4-18ea-4f03-a7e8-c85efbee42bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-thndg\" (UID: 
\"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.006344 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.006171 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/865075e4-18ea-4f03-a7e8-c85efbee42bc-data-volume\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.006344 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.006206 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/865075e4-18ea-4f03-a7e8-c85efbee42bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.006460 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.006339 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwn6m\" (UniqueName: \"kubernetes.io/projected/865075e4-18ea-4f03-a7e8-c85efbee42bc-kube-api-access-zwn6m\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.006460 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.006396 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/865075e4-18ea-4f03-a7e8-c85efbee42bc-crio-socket\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.006537 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.006496 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/865075e4-18ea-4f03-a7e8-c85efbee42bc-crio-socket\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.006630 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.006610 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/865075e4-18ea-4f03-a7e8-c85efbee42bc-data-volume\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.006716 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.006700 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/865075e4-18ea-4f03-a7e8-c85efbee42bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.008570 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.008553 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/865075e4-18ea-4f03-a7e8-c85efbee42bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.015597 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.015570 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwn6m\" (UniqueName: \"kubernetes.io/projected/865075e4-18ea-4f03-a7e8-c85efbee42bc-kube-api-access-zwn6m\") pod \"insights-runtime-extractor-thndg\" (UID: \"865075e4-18ea-4f03-a7e8-c85efbee42bc\") " pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.048580 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:04:59.048547 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-thndg" Apr 17 08:04:59.184174 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.184098 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-thndg"] Apr 17 08:04:59.187229 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:04:59.187205 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865075e4_18ea_4f03_a7e8_c85efbee42bc.slice/crio-6f75e8958bb410c89dcc6d26053eaf9bbc8cff39eb9f548c833c3dcf5d089836 WatchSource:0}: Error finding container 6f75e8958bb410c89dcc6d26053eaf9bbc8cff39eb9f548c833c3dcf5d089836: Status 404 returned error can't find the container with id 6f75e8958bb410c89dcc6d26053eaf9bbc8cff39eb9f548c833c3dcf5d089836 Apr 17 08:04:59.953028 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.952994 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-thndg" event={"ID":"865075e4-18ea-4f03-a7e8-c85efbee42bc","Type":"ContainerStarted","Data":"17bd7a7ff4ab269476c6fe95c5ee01aa01d7e62e1bc8c2f6051dd1fdf85328fb"} Apr 17 08:04:59.953028 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.953030 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-thndg" event={"ID":"865075e4-18ea-4f03-a7e8-c85efbee42bc","Type":"ContainerStarted","Data":"0b3098aab26618cee3ac8501e00dc6662a53c29dd52666a499d749abee45648f"} Apr 17 08:04:59.953437 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:04:59.953059 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-thndg" event={"ID":"865075e4-18ea-4f03-a7e8-c85efbee42bc","Type":"ContainerStarted","Data":"6f75e8958bb410c89dcc6d26053eaf9bbc8cff39eb9f548c833c3dcf5d089836"} Apr 17 08:05:00.634844 
ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:00.634810 2583 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" Apr 17 08:05:00.635195 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:00.635181 2583 scope.go:117] "RemoveContainer" containerID="3da4dc88d25fea6a542c2d901c6a399289befaeef72ef8fac3b040ace1242a03" Apr 17 08:05:00.635348 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:05:00.635333 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mnhd4_openshift-console-operator(7881d774-cc8e-4aa7-852b-0ef081cb8318)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" podUID="7881d774-cc8e-4aa7-852b-0ef081cb8318" Apr 17 08:05:01.925516 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:01.925479 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-8hrrt"] Apr 17 08:05:01.928676 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:01.928660 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:01.930636 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:01.930612 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 08:05:01.930762 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:01.930727 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 08:05:01.930762 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:01.930733 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 08:05:01.931311 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:01.931288 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 08:05:01.931442 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:01.931426 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 08:05:01.931501 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:01.931485 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-mh2hj\"" Apr 17 08:05:01.936793 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:01.936769 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-8hrrt"] Apr 17 08:05:01.960603 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:01.960564 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-thndg" event={"ID":"865075e4-18ea-4f03-a7e8-c85efbee42bc","Type":"ContainerStarted","Data":"c796969820864a949e6d20aa0262a8a76ae5b57d713c3808b5deec2c2095daed"} Apr 17 08:05:01.977285 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:05:01.977062 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-thndg" podStartSLOduration=1.783001418 podStartE2EDuration="3.977023699s" podCreationTimestamp="2026-04-17 08:04:58 +0000 UTC" firstStartedPulling="2026-04-17 08:04:59.241921744 +0000 UTC m=+137.328704406" lastFinishedPulling="2026-04-17 08:05:01.435944024 +0000 UTC m=+139.522726687" observedRunningTime="2026-04-17 08:05:01.976189928 +0000 UTC m=+140.062972612" watchObservedRunningTime="2026-04-17 08:05:01.977023699 +0000 UTC m=+140.063806382" Apr 17 08:05:02.029067 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.029003 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7e00398-1ae3-4494-af8f-0ca517c6cec3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.029248 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.029090 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7e00398-1ae3-4494-af8f-0ca517c6cec3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.029248 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.029134 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7e00398-1ae3-4494-af8f-0ca517c6cec3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.029248 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.029168 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f7wv\" (UniqueName: \"kubernetes.io/projected/e7e00398-1ae3-4494-af8f-0ca517c6cec3-kube-api-access-8f7wv\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.129612 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.129571 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7e00398-1ae3-4494-af8f-0ca517c6cec3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.129760 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.129636 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7e00398-1ae3-4494-af8f-0ca517c6cec3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.129760 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.129667 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7e00398-1ae3-4494-af8f-0ca517c6cec3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.129760 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.129701 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8f7wv\" (UniqueName: \"kubernetes.io/projected/e7e00398-1ae3-4494-af8f-0ca517c6cec3-kube-api-access-8f7wv\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.129760 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:05:02.129711 2583 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 08:05:02.129904 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:05:02.129803 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7e00398-1ae3-4494-af8f-0ca517c6cec3-prometheus-operator-tls podName:e7e00398-1ae3-4494-af8f-0ca517c6cec3 nodeName:}" failed. No retries permitted until 2026-04-17 08:05:02.629782703 +0000 UTC m=+140.716565380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/e7e00398-1ae3-4494-af8f-0ca517c6cec3-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-8hrrt" (UID: "e7e00398-1ae3-4494-af8f-0ca517c6cec3") : secret "prometheus-operator-tls" not found Apr 17 08:05:02.130435 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.130411 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7e00398-1ae3-4494-af8f-0ca517c6cec3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.132102 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.132075 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/e7e00398-1ae3-4494-af8f-0ca517c6cec3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.139632 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.139606 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f7wv\" (UniqueName: \"kubernetes.io/projected/e7e00398-1ae3-4494-af8f-0ca517c6cec3-kube-api-access-8f7wv\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.634005 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.633955 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7e00398-1ae3-4494-af8f-0ca517c6cec3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.636392 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.636371 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7e00398-1ae3-4494-af8f-0ca517c6cec3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-8hrrt\" (UID: \"e7e00398-1ae3-4494-af8f-0ca517c6cec3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.838014 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.837962 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" Apr 17 08:05:02.857525 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.857496 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" Apr 17 08:05:02.857914 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.857901 2583 scope.go:117] "RemoveContainer" containerID="3da4dc88d25fea6a542c2d901c6a399289befaeef72ef8fac3b040ace1242a03" Apr 17 08:05:02.858116 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:05:02.858099 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mnhd4_openshift-console-operator(7881d774-cc8e-4aa7-852b-0ef081cb8318)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" podUID="7881d774-cc8e-4aa7-852b-0ef081cb8318" Apr 17 08:05:02.955846 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.955810 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-8hrrt"] Apr 17 08:05:02.959085 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:05:02.959052 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7e00398_1ae3_4494_af8f_0ca517c6cec3.slice/crio-0a484a45dfeb2e7ab51870d042d00fb66335eaadbeea06abfb0337b5d056098b WatchSource:0}: Error finding container 0a484a45dfeb2e7ab51870d042d00fb66335eaadbeea06abfb0337b5d056098b: Status 404 returned error can't find the container with id 0a484a45dfeb2e7ab51870d042d00fb66335eaadbeea06abfb0337b5d056098b Apr 17 08:05:02.964006 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:02.963974 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" 
event={"ID":"e7e00398-1ae3-4494-af8f-0ca517c6cec3","Type":"ContainerStarted","Data":"0a484a45dfeb2e7ab51870d042d00fb66335eaadbeea06abfb0337b5d056098b"} Apr 17 08:05:04.970741 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:04.970704 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" event={"ID":"e7e00398-1ae3-4494-af8f-0ca517c6cec3","Type":"ContainerStarted","Data":"245b387006bf8f2ec9459ef0f435538a780533742050131090f15028993cd0d2"} Apr 17 08:05:04.970741 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:04.970739 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" event={"ID":"e7e00398-1ae3-4494-af8f-0ca517c6cec3","Type":"ContainerStarted","Data":"03edffa30fb092881f7bc1385ea6913e7f06a31bc1f843edeaa29058bc573f8c"} Apr 17 08:05:04.986338 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:04.986285 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-8hrrt" podStartSLOduration=2.650439102 podStartE2EDuration="3.986269812s" podCreationTimestamp="2026-04-17 08:05:01 +0000 UTC" firstStartedPulling="2026-04-17 08:05:02.960961454 +0000 UTC m=+141.047744120" lastFinishedPulling="2026-04-17 08:05:04.296792167 +0000 UTC m=+142.383574830" observedRunningTime="2026-04-17 08:05:04.98465051 +0000 UTC m=+143.071433197" watchObservedRunningTime="2026-04-17 08:05:04.986269812 +0000 UTC m=+143.073052496" Apr 17 08:05:07.270026 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.269987 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dt96l"] Apr 17 08:05:07.303748 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.303710 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.305800 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.305750 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 08:05:07.306026 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.306003 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 08:05:07.306118 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.306074 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hwh7w\"" Apr 17 08:05:07.306173 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.306110 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 08:05:07.371884 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.371806 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.372180 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.372161 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-sys\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.372349 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.372335 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"root\" (UniqueName: \"kubernetes.io/host-path/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-root\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.372499 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.372485 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-textfile\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.372638 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.372625 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-accelerators-collector-config\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.372795 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.372774 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfj5d\" (UniqueName: \"kubernetes.io/projected/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-kube-api-access-zfj5d\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.372947 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.372922 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-tls\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 
08:05:07.373098 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.372955 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-metrics-client-ca\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.373098 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.372995 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-wtmp\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.473755 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.473716 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-textfile\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.473755 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.473761 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-accelerators-collector-config\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474017 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.473780 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfj5d\" (UniqueName: 
\"kubernetes.io/projected/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-kube-api-access-zfj5d\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474017 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.473798 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-tls\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474017 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.473824 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-metrics-client-ca\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474017 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.473857 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-wtmp\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474017 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.473920 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474017 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.473945 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-sys\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474017 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.473995 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-root\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474381 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.474094 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-root\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474381 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.474175 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-textfile\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474381 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.474275 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-wtmp\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474544 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.474481 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-accelerators-collector-config\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474544 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.474531 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-sys\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.474762 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.474738 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-metrics-client-ca\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.476596 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.476574 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.476686 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.476603 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-node-exporter-tls\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.494227 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.494193 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfj5d\" (UniqueName: \"kubernetes.io/projected/2d9d6a08-d2cb-498a-8d4b-777a008e3ef8-kube-api-access-zfj5d\") pod \"node-exporter-dt96l\" (UID: \"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8\") " pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.615590 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.615498 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dt96l" Apr 17 08:05:07.626292 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:05:07.626252 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d9d6a08_d2cb_498a_8d4b_777a008e3ef8.slice/crio-c6bff79d5fe150f73e15682cc0a13381f6ca142dc75fe2a4af0bb1fe77cc539e WatchSource:0}: Error finding container c6bff79d5fe150f73e15682cc0a13381f6ca142dc75fe2a4af0bb1fe77cc539e: Status 404 returned error can't find the container with id c6bff79d5fe150f73e15682cc0a13381f6ca142dc75fe2a4af0bb1fe77cc539e Apr 17 08:05:07.978359 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:07.978323 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dt96l" event={"ID":"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8","Type":"ContainerStarted","Data":"c6bff79d5fe150f73e15682cc0a13381f6ca142dc75fe2a4af0bb1fe77cc539e"} Apr 17 08:05:08.731637 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:08.731597 2583 patch_prober.go:28] interesting pod/image-registry-78fcbc987f-fgm4w container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 08:05:08.732000 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:08.731650 2583 prober.go:120] "Probe failed" probeType="Readiness" 
pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" podUID="86dddb50-317b-40e7-897f-3215f2c8b269" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:05:08.982395 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:08.982354 2583 generic.go:358] "Generic (PLEG): container finished" podID="2d9d6a08-d2cb-498a-8d4b-777a008e3ef8" containerID="036b32d9aa217b296896750b4d07410c67274f76a3173cfa7d335b197919d3b0" exitCode=0 Apr 17 08:05:08.982566 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:08.982434 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dt96l" event={"ID":"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8","Type":"ContainerDied","Data":"036b32d9aa217b296896750b4d07410c67274f76a3173cfa7d335b197919d3b0"} Apr 17 08:05:09.260835 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.260759 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-74455479b4-qx2xr"] Apr 17 08:05:09.264399 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.264378 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.266430 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.266390 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 08:05:09.266549 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.266396 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 08:05:09.266549 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.266530 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 08:05:09.266662 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.266538 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-t929g\"" Apr 17 08:05:09.266712 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.266674 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 08:05:09.266811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.266794 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 08:05:09.266875 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.266851 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-uenfdrgaoosg\"" Apr 17 08:05:09.274983 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.274959 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-74455479b4-qx2xr"] Apr 17 08:05:09.287881 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.287854 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.288021 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.287891 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.288021 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.287913 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8wrg\" (UniqueName: \"kubernetes.io/projected/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-kube-api-access-x8wrg\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.288111 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.288051 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-tls\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.288111 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.288090 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.288186 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.288109 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-grpc-tls\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.288186 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.288128 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-metrics-client-ca\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.288186 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.288158 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.388836 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.388792 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.388836 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.388837 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.389111 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.388857 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8wrg\" (UniqueName: \"kubernetes.io/projected/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-kube-api-access-x8wrg\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.389111 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.388918 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-tls\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.389111 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.388937 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 
08:05:09.389111 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.388953 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-grpc-tls\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.389111 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.388971 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-metrics-client-ca\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.389111 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.389000 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.389833 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.389779 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-metrics-client-ca\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.391457 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.391429 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.391733 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.391703 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-tls\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.391849 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.391747 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.391914 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.391846 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.391914 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.391893 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: 
\"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.392231 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.392213 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-secret-grpc-tls\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.398898 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.398878 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8wrg\" (UniqueName: \"kubernetes.io/projected/9b420cbb-74d5-4ac1-9919-a46aedbcd9c7-kube-api-access-x8wrg\") pod \"thanos-querier-74455479b4-qx2xr\" (UID: \"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7\") " pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.573685 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.573588 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:09.697307 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.697261 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-74455479b4-qx2xr"] Apr 17 08:05:09.700934 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:05:09.700907 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b420cbb_74d5_4ac1_9919_a46aedbcd9c7.slice/crio-dd6d4f70a46210866a57435052c38f2baac12293c2441985504cc3ab69f25a13 WatchSource:0}: Error finding container dd6d4f70a46210866a57435052c38f2baac12293c2441985504cc3ab69f25a13: Status 404 returned error can't find the container with id dd6d4f70a46210866a57435052c38f2baac12293c2441985504cc3ab69f25a13 Apr 17 08:05:09.985609 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.985573 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" event={"ID":"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7","Type":"ContainerStarted","Data":"dd6d4f70a46210866a57435052c38f2baac12293c2441985504cc3ab69f25a13"} Apr 17 08:05:09.987251 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.987227 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dt96l" event={"ID":"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8","Type":"ContainerStarted","Data":"8a3c5307afe1d8d812d186fa5a3252a7a136232e011654a9c2e4f7cc650469d1"} Apr 17 08:05:09.987251 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:09.987255 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dt96l" event={"ID":"2d9d6a08-d2cb-498a-8d4b-777a008e3ef8","Type":"ContainerStarted","Data":"e890577fd2bdf12845b1e3fa58ba12c4244dca0026278632d403d7fe2b426b5b"} Apr 17 08:05:10.005175 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:10.005134 2583 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/node-exporter-dt96l" podStartSLOduration=2.13170206 podStartE2EDuration="3.005119035s" podCreationTimestamp="2026-04-17 08:05:07 +0000 UTC" firstStartedPulling="2026-04-17 08:05:07.628076341 +0000 UTC m=+145.714859010" lastFinishedPulling="2026-04-17 08:05:08.501493318 +0000 UTC m=+146.588275985" observedRunningTime="2026-04-17 08:05:10.003435356 +0000 UTC m=+148.090218041" watchObservedRunningTime="2026-04-17 08:05:10.005119035 +0000 UTC m=+148.091901722" Apr 17 08:05:11.997106 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:11.996978 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" event={"ID":"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7","Type":"ContainerStarted","Data":"205441911b3879788e7373920abcbcd0d34badd02031328fb0df27d787dbb56e"} Apr 17 08:05:11.997106 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:11.997062 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" event={"ID":"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7","Type":"ContainerStarted","Data":"8d4ca8c9924bbcd86f888a0d7ca8bf404b050ef560a9d56cbad954560a98a961"} Apr 17 08:05:11.997106 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:11.997077 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" event={"ID":"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7","Type":"ContainerStarted","Data":"5306bc2be29ae3452d500e6aec953eaf413656cedd2942f0741e561f248e9c6f"} Apr 17 08:05:12.055176 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:12.055137 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk"] Apr 17 08:05:12.058803 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:12.058755 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk" Apr 17 08:05:12.060707 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:12.060684 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 08:05:12.060854 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:12.060691 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-562lz\"" Apr 17 08:05:12.063926 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:12.063900 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk"] Apr 17 08:05:12.115102 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:12.115030 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/531f37d5-f8e4-4699-9a6c-ba7867ad1c9b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-v85bk\" (UID: \"531f37d5-f8e4-4699-9a6c-ba7867ad1c9b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk" Apr 17 08:05:12.216400 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:12.216355 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/531f37d5-f8e4-4699-9a6c-ba7867ad1c9b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-v85bk\" (UID: \"531f37d5-f8e4-4699-9a6c-ba7867ad1c9b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk" Apr 17 08:05:12.216558 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:05:12.216526 2583 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 08:05:12.216630 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:05:12.216589 2583 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/531f37d5-f8e4-4699-9a6c-ba7867ad1c9b-monitoring-plugin-cert podName:531f37d5-f8e4-4699-9a6c-ba7867ad1c9b nodeName:}" failed. No retries permitted until 2026-04-17 08:05:12.716574163 +0000 UTC m=+150.803356831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/531f37d5-f8e4-4699-9a6c-ba7867ad1c9b-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-v85bk" (UID: "531f37d5-f8e4-4699-9a6c-ba7867ad1c9b") : secret "monitoring-plugin-cert" not found Apr 17 08:05:12.720551 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:12.720515 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/531f37d5-f8e4-4699-9a6c-ba7867ad1c9b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-v85bk\" (UID: \"531f37d5-f8e4-4699-9a6c-ba7867ad1c9b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk" Apr 17 08:05:12.723464 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:12.723433 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/531f37d5-f8e4-4699-9a6c-ba7867ad1c9b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-v85bk\" (UID: \"531f37d5-f8e4-4699-9a6c-ba7867ad1c9b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk" Apr 17 08:05:12.970404 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:12.970375 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk" Apr 17 08:05:13.003362 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:13.003323 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" event={"ID":"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7","Type":"ContainerStarted","Data":"2c035f73bd2150fb1939a677c5f4649232302c9578cdf1bb9cbc9100f53c1a78"} Apr 17 08:05:13.003362 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:13.003368 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" event={"ID":"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7","Type":"ContainerStarted","Data":"837f4c13b5ec6ef587c8961e966c9f29b088891318276b0528d758be34ccc8ca"} Apr 17 08:05:13.003855 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:13.003381 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" event={"ID":"9b420cbb-74d5-4ac1-9919-a46aedbcd9c7","Type":"ContainerStarted","Data":"77b4da91617b44c26d365fe943efa4adbb1874183ad2ed27dc7cdd2f5d7d1a9e"} Apr 17 08:05:13.003855 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:13.003561 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:13.024656 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:13.024570 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" podStartSLOduration=0.922372667 podStartE2EDuration="4.024547576s" podCreationTimestamp="2026-04-17 08:05:09 +0000 UTC" firstStartedPulling="2026-04-17 08:05:09.702847364 +0000 UTC m=+147.789630027" lastFinishedPulling="2026-04-17 08:05:12.805022269 +0000 UTC m=+150.891804936" observedRunningTime="2026-04-17 08:05:13.021980811 +0000 UTC m=+151.108763522" watchObservedRunningTime="2026-04-17 08:05:13.024547576 +0000 UTC m=+151.111330262" 
Apr 17 08:05:13.096773 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:13.096737 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk"] Apr 17 08:05:13.100398 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:05:13.100374 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod531f37d5_f8e4_4699_9a6c_ba7867ad1c9b.slice/crio-1ff441e16366b224237d6a9afc98a9f013759e54d914c2ef80a899bb765f5ad2 WatchSource:0}: Error finding container 1ff441e16366b224237d6a9afc98a9f013759e54d914c2ef80a899bb765f5ad2: Status 404 returned error can't find the container with id 1ff441e16366b224237d6a9afc98a9f013759e54d914c2ef80a899bb765f5ad2 Apr 17 08:05:14.008263 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:14.008228 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk" event={"ID":"531f37d5-f8e4-4699-9a6c-ba7867ad1c9b","Type":"ContainerStarted","Data":"1ff441e16366b224237d6a9afc98a9f013759e54d914c2ef80a899bb765f5ad2"} Apr 17 08:05:15.012317 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:15.012278 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk" event={"ID":"531f37d5-f8e4-4699-9a6c-ba7867ad1c9b","Type":"ContainerStarted","Data":"1d1bd3b4adcbc30508ab900df324ab0a3c4f9a24433606aeedc0b5d4be878d98"} Apr 17 08:05:15.012675 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:15.012486 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk" Apr 17 08:05:15.017310 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:15.017287 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk" Apr 17 08:05:15.025397 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:15.025359 2583 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-v85bk" podStartSLOduration=1.714710485 podStartE2EDuration="3.025346855s" podCreationTimestamp="2026-04-17 08:05:12 +0000 UTC" firstStartedPulling="2026-04-17 08:05:13.102245171 +0000 UTC m=+151.189027837" lastFinishedPulling="2026-04-17 08:05:14.412881541 +0000 UTC m=+152.499664207" observedRunningTime="2026-04-17 08:05:15.025209308 +0000 UTC m=+153.111992006" watchObservedRunningTime="2026-04-17 08:05:15.025346855 +0000 UTC m=+153.112129540" Apr 17 08:05:17.510184 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:17.510156 2583 scope.go:117] "RemoveContainer" containerID="3da4dc88d25fea6a542c2d901c6a399289befaeef72ef8fac3b040ace1242a03" Apr 17 08:05:18.021488 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.021461 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/2.log" Apr 17 08:05:18.021665 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.021527 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" event={"ID":"7881d774-cc8e-4aa7-852b-0ef081cb8318","Type":"ContainerStarted","Data":"f8e0d27e29a27397a250f89cc76b72f37a354619e46f5b838e12dab88e896ac6"} Apr 17 08:05:18.021816 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.021786 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" Apr 17 08:05:18.360049 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:05:18.359923 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-65v7n" podUID="d198e1b7-2f54-4936-a0be-c6a8e9e20ea7" Apr 17 08:05:18.374264 ip-10-0-138-54 
kubenswrapper[2583]: E0417 08:05:18.374211 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-m7zgj" podUID="b4ebbaa6-1054-4eaf-87ba-d84dc8af620f" Apr 17 08:05:18.646619 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.646590 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-mnhd4" Apr 17 08:05:18.730051 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.730016 2583 patch_prober.go:28] interesting pod/image-registry-78fcbc987f-fgm4w container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 08:05:18.730193 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.730080 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" podUID="86dddb50-317b-40e7-897f-3215f2c8b269" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 08:05:18.838417 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.838376 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-8pfw7"] Apr 17 08:05:18.842150 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.842121 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8pfw7" Apr 17 08:05:18.843915 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.843885 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 08:05:18.844180 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.844165 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-pnf9l\"" Apr 17 08:05:18.844267 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.844210 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 08:05:18.850471 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.850448 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8pfw7"] Apr 17 08:05:18.975665 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:18.975625 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjb2t\" (UniqueName: \"kubernetes.io/projected/c704e155-2e47-4095-b42d-9a554fe2e495-kube-api-access-jjb2t\") pod \"downloads-6bcc868b7-8pfw7\" (UID: \"c704e155-2e47-4095-b42d-9a554fe2e495\") " pod="openshift-console/downloads-6bcc868b7-8pfw7" Apr 17 08:05:19.014062 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:19.014015 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-74455479b4-qx2xr" Apr 17 08:05:19.024351 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:19.024320 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-65v7n" Apr 17 08:05:19.076494 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:19.076424 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjb2t\" (UniqueName: \"kubernetes.io/projected/c704e155-2e47-4095-b42d-9a554fe2e495-kube-api-access-jjb2t\") pod \"downloads-6bcc868b7-8pfw7\" (UID: \"c704e155-2e47-4095-b42d-9a554fe2e495\") " pod="openshift-console/downloads-6bcc868b7-8pfw7" Apr 17 08:05:19.083703 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:19.083677 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjb2t\" (UniqueName: \"kubernetes.io/projected/c704e155-2e47-4095-b42d-9a554fe2e495-kube-api-access-jjb2t\") pod \"downloads-6bcc868b7-8pfw7\" (UID: \"c704e155-2e47-4095-b42d-9a554fe2e495\") " pod="openshift-console/downloads-6bcc868b7-8pfw7" Apr 17 08:05:19.151690 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:19.151656 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8pfw7" Apr 17 08:05:19.272236 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:19.272200 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8pfw7"] Apr 17 08:05:19.277505 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:05:19.277477 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc704e155_2e47_4095_b42d_9a554fe2e495.slice/crio-615733fd67432cbfb05fdf14db891ef0087031c6ba02f6f81677309e5aef32a5 WatchSource:0}: Error finding container 615733fd67432cbfb05fdf14db891ef0087031c6ba02f6f81677309e5aef32a5: Status 404 returned error can't find the container with id 615733fd67432cbfb05fdf14db891ef0087031c6ba02f6f81677309e5aef32a5 Apr 17 08:05:20.032277 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:20.032239 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-8pfw7" event={"ID":"c704e155-2e47-4095-b42d-9a554fe2e495","Type":"ContainerStarted","Data":"615733fd67432cbfb05fdf14db891ef0087031c6ba02f6f81677309e5aef32a5"} Apr 17 08:05:23.213009 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:23.212974 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:05:23.215914 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:23.215886 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d198e1b7-2f54-4936-a0be-c6a8e9e20ea7-metrics-tls\") pod \"dns-default-65v7n\" (UID: \"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7\") " pod="openshift-dns/dns-default-65v7n" Apr 17 08:05:23.228197 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:23.228174 2583 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wdhxs\"" Apr 17 08:05:23.236716 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:23.236692 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-65v7n" Apr 17 08:05:23.315173 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:23.315139 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:05:23.317738 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:23.317714 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4ebbaa6-1054-4eaf-87ba-d84dc8af620f-cert\") pod \"ingress-canary-m7zgj\" (UID: \"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f\") " pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:05:23.374055 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:23.374009 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-65v7n"] Apr 17 08:05:23.376910 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:05:23.376876 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd198e1b7_2f54_4936_a0be_c6a8e9e20ea7.slice/crio-052af40032dbaea80555eaf58cdcf9670b4e8993c0118d8aaa04ba53ce7fbee3 WatchSource:0}: Error finding container 052af40032dbaea80555eaf58cdcf9670b4e8993c0118d8aaa04ba53ce7fbee3: Status 404 returned error can't find the container with id 052af40032dbaea80555eaf58cdcf9670b4e8993c0118d8aaa04ba53ce7fbee3 Apr 17 08:05:23.747191 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:23.746953 2583 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" podUID="86dddb50-317b-40e7-897f-3215f2c8b269" containerName="registry" containerID="cri-o://3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e" gracePeriod=30 Apr 17 08:05:24.008329 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.008300 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:05:24.046991 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.046954 2583 generic.go:358] "Generic (PLEG): container finished" podID="86dddb50-317b-40e7-897f-3215f2c8b269" containerID="3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e" exitCode=0 Apr 17 08:05:24.047180 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.047060 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" Apr 17 08:05:24.047180 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.047060 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" event={"ID":"86dddb50-317b-40e7-897f-3215f2c8b269","Type":"ContainerDied","Data":"3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e"} Apr 17 08:05:24.047180 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.047104 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-78fcbc987f-fgm4w" event={"ID":"86dddb50-317b-40e7-897f-3215f2c8b269","Type":"ContainerDied","Data":"546aec1c5348dcdbf4824ee0a9c2cf9f7ab33f51f6ce735dd67ee5292e9b28ab"} Apr 17 08:05:24.047180 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.047134 2583 scope.go:117] "RemoveContainer" containerID="3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e" Apr 17 08:05:24.048785 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.048747 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-65v7n" event={"ID":"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7","Type":"ContainerStarted","Data":"052af40032dbaea80555eaf58cdcf9670b4e8993c0118d8aaa04ba53ce7fbee3"} Apr 17 08:05:24.057761 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.057741 2583 scope.go:117] "RemoveContainer" containerID="3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e" Apr 17 08:05:24.058251 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:05:24.058219 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e\": container with ID starting with 3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e not found: ID does not exist" containerID="3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e" Apr 17 08:05:24.058361 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.058263 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e"} err="failed to get container status \"3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e\": rpc error: code = NotFound desc = could not find container \"3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e\": container with ID starting with 3c350ae7d45e40fdd5448213f04b1c71240401a7467a2f1b040e42c277555b4e not found: ID does not exist" Apr 17 08:05:24.122633 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.122597 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86dddb50-317b-40e7-897f-3215f2c8b269-ca-trust-extracted\") pod \"86dddb50-317b-40e7-897f-3215f2c8b269\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " Apr 17 08:05:24.122811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.122655 2583 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls\") pod \"86dddb50-317b-40e7-897f-3215f2c8b269\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " Apr 17 08:05:24.122811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.122703 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-bound-sa-token\") pod \"86dddb50-317b-40e7-897f-3215f2c8b269\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " Apr 17 08:05:24.122811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.122737 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-trusted-ca\") pod \"86dddb50-317b-40e7-897f-3215f2c8b269\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " Apr 17 08:05:24.122811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.122770 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-image-registry-private-configuration\") pod \"86dddb50-317b-40e7-897f-3215f2c8b269\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " Apr 17 08:05:24.122811 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.122800 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-installation-pull-secrets\") pod \"86dddb50-317b-40e7-897f-3215f2c8b269\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " Apr 17 08:05:24.123100 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.122843 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-545sm\" (UniqueName: 
\"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-kube-api-access-545sm\") pod \"86dddb50-317b-40e7-897f-3215f2c8b269\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " Apr 17 08:05:24.123100 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.122871 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-registry-certificates\") pod \"86dddb50-317b-40e7-897f-3215f2c8b269\" (UID: \"86dddb50-317b-40e7-897f-3215f2c8b269\") " Apr 17 08:05:24.123383 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.123351 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "86dddb50-317b-40e7-897f-3215f2c8b269" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:05:24.123737 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.123686 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "86dddb50-317b-40e7-897f-3215f2c8b269" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:05:24.126615 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.126565 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-kube-api-access-545sm" (OuterVolumeSpecName: "kube-api-access-545sm") pod "86dddb50-317b-40e7-897f-3215f2c8b269" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269"). InnerVolumeSpecName "kube-api-access-545sm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:05:24.127166 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.127120 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "86dddb50-317b-40e7-897f-3215f2c8b269" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:05:24.127412 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.127372 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "86dddb50-317b-40e7-897f-3215f2c8b269" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:05:24.128189 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.128140 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "86dddb50-317b-40e7-897f-3215f2c8b269" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:05:24.128737 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.128705 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "86dddb50-317b-40e7-897f-3215f2c8b269" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:05:24.134922 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.134898 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86dddb50-317b-40e7-897f-3215f2c8b269-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "86dddb50-317b-40e7-897f-3215f2c8b269" (UID: "86dddb50-317b-40e7-897f-3215f2c8b269"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:05:24.224008 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.223968 2583 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86dddb50-317b-40e7-897f-3215f2c8b269-ca-trust-extracted\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:05:24.224008 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.224005 2583 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-registry-tls\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:05:24.224443 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.224021 2583 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-bound-sa-token\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:05:24.224443 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.224056 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-trusted-ca\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:05:24.224443 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.224072 2583 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-image-registry-private-configuration\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:05:24.224443 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.224087 2583 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86dddb50-317b-40e7-897f-3215f2c8b269-installation-pull-secrets\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:05:24.224443 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.224102 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-545sm\" (UniqueName: \"kubernetes.io/projected/86dddb50-317b-40e7-897f-3215f2c8b269-kube-api-access-545sm\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:05:24.224443 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.224118 2583 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86dddb50-317b-40e7-897f-3215f2c8b269-registry-certificates\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:05:24.372181 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.372152 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-78fcbc987f-fgm4w"] Apr 17 08:05:24.380666 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.380640 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-78fcbc987f-fgm4w"] Apr 17 08:05:24.515271 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:24.515230 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86dddb50-317b-40e7-897f-3215f2c8b269" path="/var/lib/kubelet/pods/86dddb50-317b-40e7-897f-3215f2c8b269/volumes" Apr 17 08:05:25.053716 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:25.053674 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-65v7n" 
event={"ID":"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7","Type":"ContainerStarted","Data":"c862ee5d8391c6ad6bffbf211f10f31734854a24722cd1658805368f2586124e"} Apr 17 08:05:25.053716 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:25.053718 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-65v7n" event={"ID":"d198e1b7-2f54-4936-a0be-c6a8e9e20ea7","Type":"ContainerStarted","Data":"80ce866f771a920c53c95cd097c14358558970de8e127a8e5472954b4e93c21d"} Apr 17 08:05:25.053909 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:25.053795 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-65v7n" Apr 17 08:05:25.068799 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:25.068749 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-65v7n" podStartSLOduration=128.80498011 podStartE2EDuration="2m10.068733727s" podCreationTimestamp="2026-04-17 08:03:15 +0000 UTC" firstStartedPulling="2026-04-17 08:05:23.378888966 +0000 UTC m=+161.465671632" lastFinishedPulling="2026-04-17 08:05:24.642642581 +0000 UTC m=+162.729425249" observedRunningTime="2026-04-17 08:05:25.067262486 +0000 UTC m=+163.154045182" watchObservedRunningTime="2026-04-17 08:05:25.068733727 +0000 UTC m=+163.155516417" Apr 17 08:05:27.944708 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:27.944673 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dc775c747-rfw2g"] Apr 17 08:05:27.945252 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:27.944982 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86dddb50-317b-40e7-897f-3215f2c8b269" containerName="registry" Apr 17 08:05:27.945252 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:27.944999 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dddb50-317b-40e7-897f-3215f2c8b269" containerName="registry" Apr 17 08:05:27.945252 ip-10-0-138-54 kubenswrapper[2583]: I0417 
08:05:27.945076 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="86dddb50-317b-40e7-897f-3215f2c8b269" containerName="registry" Apr 17 08:05:27.948392 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:27.948372 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:27.950391 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:27.950371 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 08:05:27.950502 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:27.950394 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 08:05:27.950502 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:27.950454 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 08:05:27.950949 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:27.950932 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bzbl4\"" Apr 17 08:05:27.951055 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:27.950990 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 08:05:27.951186 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:27.951164 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 08:05:27.957538 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:27.957506 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dc775c747-rfw2g"] Apr 17 08:05:28.054966 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.054921 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-oauth-serving-cert\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.055169 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.054982 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-oauth-config\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.055169 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.055012 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-service-ca\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.055169 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.055156 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8hwp\" (UniqueName: \"kubernetes.io/projected/7c280baf-dcfe-42ea-8adb-6f50a15e2311-kube-api-access-n8hwp\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.055340 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.055196 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-serving-cert\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.055340 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:05:28.055262 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-config\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.156090 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.156052 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-oauth-config\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.156276 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.156108 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-service-ca\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.156276 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.156175 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8hwp\" (UniqueName: \"kubernetes.io/projected/7c280baf-dcfe-42ea-8adb-6f50a15e2311-kube-api-access-n8hwp\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.156276 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.156209 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-serving-cert\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " 
pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.156276 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.156240 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-config\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.156444 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.156299 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-oauth-serving-cert\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.157144 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.156892 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-service-ca\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.157144 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.156937 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-oauth-serving-cert\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.157144 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.157100 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-config\") pod \"console-6dc775c747-rfw2g\" (UID: 
\"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.158985 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.158960 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-oauth-config\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.159100 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.159013 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-serving-cert\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.168001 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.167973 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8hwp\" (UniqueName: \"kubernetes.io/projected/7c280baf-dcfe-42ea-8adb-6f50a15e2311-kube-api-access-n8hwp\") pod \"console-6dc775c747-rfw2g\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.259599 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.259507 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:28.413989 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:28.413803 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dc775c747-rfw2g"] Apr 17 08:05:28.416879 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:05:28.416845 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c280baf_dcfe_42ea_8adb_6f50a15e2311.slice/crio-385bb3e2656067a691d794c7b4576504849c20c74b877a76b560f396b50c5826 WatchSource:0}: Error finding container 385bb3e2656067a691d794c7b4576504849c20c74b877a76b560f396b50c5826: Status 404 returned error can't find the container with id 385bb3e2656067a691d794c7b4576504849c20c74b877a76b560f396b50c5826 Apr 17 08:05:29.067242 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:29.067171 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc775c747-rfw2g" event={"ID":"7c280baf-dcfe-42ea-8adb-6f50a15e2311","Type":"ContainerStarted","Data":"385bb3e2656067a691d794c7b4576504849c20c74b877a76b560f396b50c5826"} Apr 17 08:05:33.509992 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:33.509959 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:05:33.512110 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:33.512084 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x9bwm\"" Apr 17 08:05:33.521353 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:33.521318 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m7zgj" Apr 17 08:05:35.059641 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:35.059609 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-65v7n" Apr 17 08:05:37.132503 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.132469 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54b66fb959-hjgwc"] Apr 17 08:05:37.139032 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.139007 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.145509 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.145461 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54b66fb959-hjgwc"] Apr 17 08:05:37.145826 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.145802 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 08:05:37.236276 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.236233 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-222jl\" (UniqueName: \"kubernetes.io/projected/a5898bf4-855b-4f8e-8d90-6d79b1582535-kube-api-access-222jl\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.236453 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.236312 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-oauth-config\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.236453 ip-10-0-138-54 kubenswrapper[2583]: I0417 
08:05:37.236356 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-oauth-serving-cert\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.236453 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.236375 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-service-ca\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.236453 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.236413 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-serving-cert\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.236629 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.236496 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-trusted-ca-bundle\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.236629 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.236567 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-config\") pod \"console-54b66fb959-hjgwc\" (UID: 
\"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.337328 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.337280 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-oauth-serving-cert\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.337328 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.337331 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-service-ca\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.337587 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.337368 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-serving-cert\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.337587 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.337398 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-trusted-ca-bundle\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.337587 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.337420 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-config\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.337587 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.337455 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-222jl\" (UniqueName: \"kubernetes.io/projected/a5898bf4-855b-4f8e-8d90-6d79b1582535-kube-api-access-222jl\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.337587 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.337478 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-oauth-config\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.338243 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.338185 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-service-ca\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.338243 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.338210 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-oauth-serving-cert\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.338437 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.338249 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-config\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.338437 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.338381 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-trusted-ca-bundle\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.340307 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.340271 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-serving-cert\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.340307 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.340271 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-oauth-config\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.345465 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.345441 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-222jl\" (UniqueName: \"kubernetes.io/projected/a5898bf4-855b-4f8e-8d90-6d79b1582535-kube-api-access-222jl\") pod \"console-54b66fb959-hjgwc\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:37.451250 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:37.451202 2583 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:38.562059 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:38.561906 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m7zgj"] Apr 17 08:05:38.566130 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:05:38.566090 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4ebbaa6_1054_4eaf_87ba_d84dc8af620f.slice/crio-3fd1586fd4d2a6a6d2af2a41b93f43b1c7891b283669731974c59403f8a27498 WatchSource:0}: Error finding container 3fd1586fd4d2a6a6d2af2a41b93f43b1c7891b283669731974c59403f8a27498: Status 404 returned error can't find the container with id 3fd1586fd4d2a6a6d2af2a41b93f43b1c7891b283669731974c59403f8a27498 Apr 17 08:05:38.574404 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:38.574369 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54b66fb959-hjgwc"] Apr 17 08:05:38.578537 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:05:38.578498 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5898bf4_855b_4f8e_8d90_6d79b1582535.slice/crio-f311209d621945eff6001f7a1342ae8801101a77de0c70256be5a364fff4ca03 WatchSource:0}: Error finding container f311209d621945eff6001f7a1342ae8801101a77de0c70256be5a364fff4ca03: Status 404 returned error can't find the container with id f311209d621945eff6001f7a1342ae8801101a77de0c70256be5a364fff4ca03 Apr 17 08:05:39.102680 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:39.102576 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc775c747-rfw2g" event={"ID":"7c280baf-dcfe-42ea-8adb-6f50a15e2311","Type":"ContainerStarted","Data":"a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e"} Apr 17 08:05:39.104104 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:39.104065 
2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m7zgj" event={"ID":"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f","Type":"ContainerStarted","Data":"3fd1586fd4d2a6a6d2af2a41b93f43b1c7891b283669731974c59403f8a27498"} Apr 17 08:05:39.106023 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:39.105895 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54b66fb959-hjgwc" event={"ID":"a5898bf4-855b-4f8e-8d90-6d79b1582535","Type":"ContainerStarted","Data":"4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641"} Apr 17 08:05:39.106023 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:39.105925 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54b66fb959-hjgwc" event={"ID":"a5898bf4-855b-4f8e-8d90-6d79b1582535","Type":"ContainerStarted","Data":"f311209d621945eff6001f7a1342ae8801101a77de0c70256be5a364fff4ca03"} Apr 17 08:05:39.107790 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:39.107764 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-8pfw7" event={"ID":"c704e155-2e47-4095-b42d-9a554fe2e495","Type":"ContainerStarted","Data":"1ba555957726e2efd1a4e8b2eab063cfc88a2efa6b59871a2ee941fed806e246"} Apr 17 08:05:39.108464 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:39.108415 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-8pfw7" Apr 17 08:05:39.119490 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:39.119455 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-8pfw7" Apr 17 08:05:39.123173 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:39.123115 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dc775c747-rfw2g" podStartSLOduration=2.130448088 podStartE2EDuration="12.123097019s" podCreationTimestamp="2026-04-17 08:05:27 
+0000 UTC" firstStartedPulling="2026-04-17 08:05:28.419149026 +0000 UTC m=+166.505931701" lastFinishedPulling="2026-04-17 08:05:38.411797958 +0000 UTC m=+176.498580632" observedRunningTime="2026-04-17 08:05:39.120457822 +0000 UTC m=+177.207240508" watchObservedRunningTime="2026-04-17 08:05:39.123097019 +0000 UTC m=+177.209879705" Apr 17 08:05:39.135607 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:39.135544 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-8pfw7" podStartSLOduration=1.955745442 podStartE2EDuration="21.135525354s" podCreationTimestamp="2026-04-17 08:05:18 +0000 UTC" firstStartedPulling="2026-04-17 08:05:19.27941057 +0000 UTC m=+157.366193233" lastFinishedPulling="2026-04-17 08:05:38.459190471 +0000 UTC m=+176.545973145" observedRunningTime="2026-04-17 08:05:39.13355241 +0000 UTC m=+177.220335094" watchObservedRunningTime="2026-04-17 08:05:39.135525354 +0000 UTC m=+177.222308040" Apr 17 08:05:39.150829 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:39.150108 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54b66fb959-hjgwc" podStartSLOduration=2.150085652 podStartE2EDuration="2.150085652s" podCreationTimestamp="2026-04-17 08:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:05:39.148604567 +0000 UTC m=+177.235387268" watchObservedRunningTime="2026-04-17 08:05:39.150085652 +0000 UTC m=+177.236868338" Apr 17 08:05:41.117975 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:41.117882 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m7zgj" event={"ID":"b4ebbaa6-1054-4eaf-87ba-d84dc8af620f","Type":"ContainerStarted","Data":"936ed022532313e7e82334ea9bff075176ec330d63d5e538f4d23f4475777715"} Apr 17 08:05:41.133034 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:41.132976 2583 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m7zgj" podStartSLOduration=143.868691618 podStartE2EDuration="2m26.132955898s" podCreationTimestamp="2026-04-17 08:03:15 +0000 UTC" firstStartedPulling="2026-04-17 08:05:38.568653586 +0000 UTC m=+176.655436252" lastFinishedPulling="2026-04-17 08:05:40.83291787 +0000 UTC m=+178.919700532" observedRunningTime="2026-04-17 08:05:41.131216711 +0000 UTC m=+179.217999397" watchObservedRunningTime="2026-04-17 08:05:41.132955898 +0000 UTC m=+179.219738583" Apr 17 08:05:41.236548 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:41.236508 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-65v7n_d198e1b7-2f54-4936-a0be-c6a8e9e20ea7/dns/0.log" Apr 17 08:05:41.436596 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:41.436561 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-65v7n_d198e1b7-2f54-4936-a0be-c6a8e9e20ea7/kube-rbac-proxy/0.log" Apr 17 08:05:42.236648 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:42.236621 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q5zfz_ad1b5fde-cf28-4a54-bb33-cb43f425421e/dns-node-resolver/0.log" Apr 17 08:05:43.236370 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:43.236340 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pk77j_5f3f7f47-4941-40cd-88d3-259605376e0e/node-ca/0.log" Apr 17 08:05:43.636741 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:43.636709 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7c6d4f57f6-pj78j_06a60109-bb64-4cd2-9f4e-2987a8942aad/router/0.log" Apr 17 08:05:44.036507 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:44.036429 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-m7zgj_b4ebbaa6-1054-4eaf-87ba-d84dc8af620f/serve-healthcheck-canary/0.log" Apr 17 08:05:47.452179 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:47.452126 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:47.452179 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:47.452190 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:47.457678 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:47.457644 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:48.145218 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:48.145180 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:05:48.186289 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:48.186251 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dc775c747-rfw2g"] Apr 17 08:05:48.260574 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:48.260539 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:05:59.177525 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:59.177487 2583 generic.go:358] "Generic (PLEG): container finished" podID="2271c85d-959b-40a9-aaca-4f7851b44b73" containerID="3ebae85485bb665654de44527998d15b02c413eded06c807a0e9cfca6fc3e1b8" exitCode=0 Apr 17 08:05:59.177937 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:59.177563 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf" 
event={"ID":"2271c85d-959b-40a9-aaca-4f7851b44b73","Type":"ContainerDied","Data":"3ebae85485bb665654de44527998d15b02c413eded06c807a0e9cfca6fc3e1b8"} Apr 17 08:05:59.177937 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:05:59.177881 2583 scope.go:117] "RemoveContainer" containerID="3ebae85485bb665654de44527998d15b02c413eded06c807a0e9cfca6fc3e1b8" Apr 17 08:06:00.181971 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:00.181934 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fx7kf" event={"ID":"2271c85d-959b-40a9-aaca-4f7851b44b73","Type":"ContainerStarted","Data":"64e035c3246cbf9f96c63c76d986bff19c4497d026d961d108e61a46003ef168"} Apr 17 08:06:13.207429 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.207382 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6dc775c747-rfw2g" podUID="7c280baf-dcfe-42ea-8adb-6f50a15e2311" containerName="console" containerID="cri-o://a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e" gracePeriod=15 Apr 17 08:06:13.482115 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.482092 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dc775c747-rfw2g_7c280baf-dcfe-42ea-8adb-6f50a15e2311/console/0.log" Apr 17 08:06:13.482268 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.482164 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:06:13.539136 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.539096 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-serving-cert\") pod \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " Apr 17 08:06:13.539136 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.539143 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-oauth-serving-cert\") pod \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " Apr 17 08:06:13.539358 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.539184 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8hwp\" (UniqueName: \"kubernetes.io/projected/7c280baf-dcfe-42ea-8adb-6f50a15e2311-kube-api-access-n8hwp\") pod \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " Apr 17 08:06:13.539358 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.539202 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-service-ca\") pod \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " Apr 17 08:06:13.539358 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.539228 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-config\") pod \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " Apr 17 08:06:13.539358 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:06:13.539248 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-oauth-config\") pod \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\" (UID: \"7c280baf-dcfe-42ea-8adb-6f50a15e2311\") " Apr 17 08:06:13.539655 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.539624 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-config" (OuterVolumeSpecName: "console-config") pod "7c280baf-dcfe-42ea-8adb-6f50a15e2311" (UID: "7c280baf-dcfe-42ea-8adb-6f50a15e2311"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:06:13.539655 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.539643 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-service-ca" (OuterVolumeSpecName: "service-ca") pod "7c280baf-dcfe-42ea-8adb-6f50a15e2311" (UID: "7c280baf-dcfe-42ea-8adb-6f50a15e2311"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:06:13.539820 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.539629 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7c280baf-dcfe-42ea-8adb-6f50a15e2311" (UID: "7c280baf-dcfe-42ea-8adb-6f50a15e2311"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:06:13.541510 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.541480 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7c280baf-dcfe-42ea-8adb-6f50a15e2311" (UID: "7c280baf-dcfe-42ea-8adb-6f50a15e2311"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:13.541626 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.541603 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7c280baf-dcfe-42ea-8adb-6f50a15e2311" (UID: "7c280baf-dcfe-42ea-8adb-6f50a15e2311"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:13.541677 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.541621 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c280baf-dcfe-42ea-8adb-6f50a15e2311-kube-api-access-n8hwp" (OuterVolumeSpecName: "kube-api-access-n8hwp") pod "7c280baf-dcfe-42ea-8adb-6f50a15e2311" (UID: "7c280baf-dcfe-42ea-8adb-6f50a15e2311"). InnerVolumeSpecName "kube-api-access-n8hwp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:06:13.640364 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.640328 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-serving-cert\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:06:13.640364 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.640364 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-oauth-serving-cert\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:06:13.640571 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.640377 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n8hwp\" (UniqueName: \"kubernetes.io/projected/7c280baf-dcfe-42ea-8adb-6f50a15e2311-kube-api-access-n8hwp\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:06:13.640571 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.640393 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-service-ca\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:06:13.640571 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.640409 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-config\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:06:13.640571 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:13.640422 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c280baf-dcfe-42ea-8adb-6f50a15e2311-console-oauth-config\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:06:14.224370 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:06:14.224343 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dc775c747-rfw2g_7c280baf-dcfe-42ea-8adb-6f50a15e2311/console/0.log" Apr 17 08:06:14.224759 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:14.224382 2583 generic.go:358] "Generic (PLEG): container finished" podID="7c280baf-dcfe-42ea-8adb-6f50a15e2311" containerID="a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e" exitCode=2 Apr 17 08:06:14.224759 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:14.224419 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc775c747-rfw2g" event={"ID":"7c280baf-dcfe-42ea-8adb-6f50a15e2311","Type":"ContainerDied","Data":"a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e"} Apr 17 08:06:14.224759 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:14.224443 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dc775c747-rfw2g" Apr 17 08:06:14.224759 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:14.224462 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc775c747-rfw2g" event={"ID":"7c280baf-dcfe-42ea-8adb-6f50a15e2311","Type":"ContainerDied","Data":"385bb3e2656067a691d794c7b4576504849c20c74b877a76b560f396b50c5826"} Apr 17 08:06:14.224759 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:14.224480 2583 scope.go:117] "RemoveContainer" containerID="a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e" Apr 17 08:06:14.232892 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:14.232875 2583 scope.go:117] "RemoveContainer" containerID="a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e" Apr 17 08:06:14.233148 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:06:14.233129 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e\": container with ID starting with a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e not found: ID does not exist" containerID="a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e" Apr 17 08:06:14.233213 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:14.233157 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e"} err="failed to get container status \"a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e\": rpc error: code = NotFound desc = could not find container \"a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e\": container with ID starting with a9c0033b7a80c797863244beb01d2f7e06a5ca42ce70b4fff842a302b7f1787e not found: ID does not exist" Apr 17 08:06:14.243072 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:14.243050 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dc775c747-rfw2g"] Apr 17 08:06:14.246120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:14.246098 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6dc775c747-rfw2g"] Apr 17 08:06:14.513690 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:06:14.513617 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c280baf-dcfe-42ea-8adb-6f50a15e2311" path="/var/lib/kubelet/pods/7c280baf-dcfe-42ea-8adb-6f50a15e2311/volumes" Apr 17 08:07:05.652466 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.652381 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bj7ml"] Apr 17 08:07:05.652922 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.652784 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c280baf-dcfe-42ea-8adb-6f50a15e2311" containerName="console" Apr 17 08:07:05.652922 ip-10-0-138-54 kubenswrapper[2583]: 
I0417 08:07:05.652804 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c280baf-dcfe-42ea-8adb-6f50a15e2311" containerName="console" Apr 17 08:07:05.652922 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.652886 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c280baf-dcfe-42ea-8adb-6f50a15e2311" containerName="console" Apr 17 08:07:05.655720 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.655703 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bj7ml" Apr 17 08:07:05.657590 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.657566 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 08:07:05.662513 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.662487 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bj7ml"] Apr 17 08:07:05.747073 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.747018 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/903fbe6d-9629-4e1b-b721-4fa9b0af02bd-dbus\") pod \"global-pull-secret-syncer-bj7ml\" (UID: \"903fbe6d-9629-4e1b-b721-4fa9b0af02bd\") " pod="kube-system/global-pull-secret-syncer-bj7ml" Apr 17 08:07:05.747246 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.747157 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/903fbe6d-9629-4e1b-b721-4fa9b0af02bd-original-pull-secret\") pod \"global-pull-secret-syncer-bj7ml\" (UID: \"903fbe6d-9629-4e1b-b721-4fa9b0af02bd\") " pod="kube-system/global-pull-secret-syncer-bj7ml" Apr 17 08:07:05.747246 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.747225 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/903fbe6d-9629-4e1b-b721-4fa9b0af02bd-kubelet-config\") pod \"global-pull-secret-syncer-bj7ml\" (UID: \"903fbe6d-9629-4e1b-b721-4fa9b0af02bd\") " pod="kube-system/global-pull-secret-syncer-bj7ml" Apr 17 08:07:05.752947 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.752917 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-857d8bfb58-ttv4p"] Apr 17 08:07:05.757378 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.757354 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.762028 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.762001 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-857d8bfb58-ttv4p"] Apr 17 08:07:05.848026 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.847983 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/903fbe6d-9629-4e1b-b721-4fa9b0af02bd-kubelet-config\") pod \"global-pull-secret-syncer-bj7ml\" (UID: \"903fbe6d-9629-4e1b-b721-4fa9b0af02bd\") " pod="kube-system/global-pull-secret-syncer-bj7ml" Apr 17 08:07:05.848026 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.848029 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-console-serving-cert\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.848287 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.848089 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/903fbe6d-9629-4e1b-b721-4fa9b0af02bd-dbus\") pod \"global-pull-secret-syncer-bj7ml\" (UID: 
\"903fbe6d-9629-4e1b-b721-4fa9b0af02bd\") " pod="kube-system/global-pull-secret-syncer-bj7ml" Apr 17 08:07:05.848287 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.848117 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t84g\" (UniqueName: \"kubernetes.io/projected/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-kube-api-access-2t84g\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.848287 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.848128 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/903fbe6d-9629-4e1b-b721-4fa9b0af02bd-kubelet-config\") pod \"global-pull-secret-syncer-bj7ml\" (UID: \"903fbe6d-9629-4e1b-b721-4fa9b0af02bd\") " pod="kube-system/global-pull-secret-syncer-bj7ml" Apr 17 08:07:05.848287 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.848147 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-service-ca\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.848287 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.848225 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-console-oauth-config\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.848531 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.848286 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/903fbe6d-9629-4e1b-b721-4fa9b0af02bd-dbus\") pod \"global-pull-secret-syncer-bj7ml\" (UID: \"903fbe6d-9629-4e1b-b721-4fa9b0af02bd\") " pod="kube-system/global-pull-secret-syncer-bj7ml" Apr 17 08:07:05.848531 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.848308 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/903fbe6d-9629-4e1b-b721-4fa9b0af02bd-original-pull-secret\") pod \"global-pull-secret-syncer-bj7ml\" (UID: \"903fbe6d-9629-4e1b-b721-4fa9b0af02bd\") " pod="kube-system/global-pull-secret-syncer-bj7ml" Apr 17 08:07:05.848531 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.848337 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-console-config\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.848531 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.848354 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-oauth-serving-cert\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.848531 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.848382 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-trusted-ca-bundle\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.850592 ip-10-0-138-54 kubenswrapper[2583]: I0417 
08:07:05.850571 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/903fbe6d-9629-4e1b-b721-4fa9b0af02bd-original-pull-secret\") pod \"global-pull-secret-syncer-bj7ml\" (UID: \"903fbe6d-9629-4e1b-b721-4fa9b0af02bd\") " pod="kube-system/global-pull-secret-syncer-bj7ml" Apr 17 08:07:05.948882 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.948791 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-console-oauth-config\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.948882 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.948852 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-console-config\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.948882 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.948870 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-oauth-serving-cert\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.949218 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.948890 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-trusted-ca-bundle\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " 
pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.949218 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.948921 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-console-serving-cert\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.949218 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.948942 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2t84g\" (UniqueName: \"kubernetes.io/projected/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-kube-api-access-2t84g\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.949218 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.948962 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-service-ca\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.949677 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.949652 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-console-config\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.949796 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.949774 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-oauth-serving-cert\") pod \"console-857d8bfb58-ttv4p\" (UID: 
\"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.949834 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.949817 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-service-ca\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.949937 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.949919 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-trusted-ca-bundle\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.951356 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.951334 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-console-serving-cert\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.951434 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.951364 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-console-oauth-config\") pod \"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.956139 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.956123 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t84g\" (UniqueName: \"kubernetes.io/projected/6cd77ce2-86bc-4d87-962a-71d0789bc5ae-kube-api-access-2t84g\") pod 
\"console-857d8bfb58-ttv4p\" (UID: \"6cd77ce2-86bc-4d87-962a-71d0789bc5ae\") " pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:05.964980 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:05.964958 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bj7ml" Apr 17 08:07:06.067655 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:06.067622 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:06.084211 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:06.084180 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bj7ml"] Apr 17 08:07:06.088909 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:07:06.088870 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod903fbe6d_9629_4e1b_b721_4fa9b0af02bd.slice/crio-4b1edde2b72762dcd9cfb64155f09fabe23e8a66e7f6c9fd3d263c18e54f6ffa WatchSource:0}: Error finding container 4b1edde2b72762dcd9cfb64155f09fabe23e8a66e7f6c9fd3d263c18e54f6ffa: Status 404 returned error can't find the container with id 4b1edde2b72762dcd9cfb64155f09fabe23e8a66e7f6c9fd3d263c18e54f6ffa Apr 17 08:07:06.189665 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:06.189632 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-857d8bfb58-ttv4p"] Apr 17 08:07:06.192504 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:07:06.192481 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cd77ce2_86bc_4d87_962a_71d0789bc5ae.slice/crio-417406f49a1bd61587e2def30ed9889500756ee7cc80f1d3da206d962e4decfb WatchSource:0}: Error finding container 417406f49a1bd61587e2def30ed9889500756ee7cc80f1d3da206d962e4decfb: Status 404 returned error can't find the container with id 
417406f49a1bd61587e2def30ed9889500756ee7cc80f1d3da206d962e4decfb Apr 17 08:07:06.371296 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:06.371230 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bj7ml" event={"ID":"903fbe6d-9629-4e1b-b721-4fa9b0af02bd","Type":"ContainerStarted","Data":"4b1edde2b72762dcd9cfb64155f09fabe23e8a66e7f6c9fd3d263c18e54f6ffa"} Apr 17 08:07:06.372592 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:06.372562 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857d8bfb58-ttv4p" event={"ID":"6cd77ce2-86bc-4d87-962a-71d0789bc5ae","Type":"ContainerStarted","Data":"270000ff73c1a770fb440f1a92288f2ee9feb845e88e4002acaccca9c8f72ed9"} Apr 17 08:07:06.372709 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:06.372599 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857d8bfb58-ttv4p" event={"ID":"6cd77ce2-86bc-4d87-962a-71d0789bc5ae","Type":"ContainerStarted","Data":"417406f49a1bd61587e2def30ed9889500756ee7cc80f1d3da206d962e4decfb"} Apr 17 08:07:06.390269 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:06.390210 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-857d8bfb58-ttv4p" podStartSLOduration=1.390189156 podStartE2EDuration="1.390189156s" podCreationTimestamp="2026-04-17 08:07:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:07:06.388306985 +0000 UTC m=+264.475089669" watchObservedRunningTime="2026-04-17 08:07:06.390189156 +0000 UTC m=+264.476971841" Apr 17 08:07:10.388070 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:10.388026 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bj7ml" 
event={"ID":"903fbe6d-9629-4e1b-b721-4fa9b0af02bd","Type":"ContainerStarted","Data":"fdecf67dfa704023069d8967fa514200c5a390824c092286bc7794f7f7a6f8d0"} Apr 17 08:07:10.401785 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:10.401739 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bj7ml" podStartSLOduration=1.545378355 podStartE2EDuration="5.401723649s" podCreationTimestamp="2026-04-17 08:07:05 +0000 UTC" firstStartedPulling="2026-04-17 08:07:06.090714151 +0000 UTC m=+264.177496825" lastFinishedPulling="2026-04-17 08:07:09.947059444 +0000 UTC m=+268.033842119" observedRunningTime="2026-04-17 08:07:10.400960808 +0000 UTC m=+268.487743505" watchObservedRunningTime="2026-04-17 08:07:10.401723649 +0000 UTC m=+268.488506333" Apr 17 08:07:16.068127 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:16.068091 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:16.068127 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:16.068125 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:16.072820 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:16.072791 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:16.409049 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:16.409006 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-857d8bfb58-ttv4p" Apr 17 08:07:16.452034 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:16.451948 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54b66fb959-hjgwc"] Apr 17 08:07:33.156336 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.156300 2583 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz"] Apr 17 08:07:33.159513 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.159492 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" Apr 17 08:07:33.161582 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.161558 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 08:07:33.162028 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.162013 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-jxx47\"" Apr 17 08:07:33.162028 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.162020 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 08:07:33.167375 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.167348 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz"] Apr 17 08:07:33.289784 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.289748 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" Apr 17 08:07:33.289956 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.289789 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zmm8\" (UniqueName: 
\"kubernetes.io/projected/d1045459-8457-44e7-beff-c79ebd37da50-kube-api-access-5zmm8\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" Apr 17 08:07:33.289956 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.289862 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" Apr 17 08:07:33.390313 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.390267 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" Apr 17 08:07:33.390313 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.390320 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zmm8\" (UniqueName: \"kubernetes.io/projected/d1045459-8457-44e7-beff-c79ebd37da50-kube-api-access-5zmm8\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" Apr 17 08:07:33.390505 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.390345 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" Apr 17 08:07:33.390652 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.390631 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" Apr 17 08:07:33.390710 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.390655 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" Apr 17 08:07:33.397823 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.397795 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zmm8\" (UniqueName: \"kubernetes.io/projected/d1045459-8457-44e7-beff-c79ebd37da50-kube-api-access-5zmm8\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" Apr 17 08:07:33.470071 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.469968 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" Apr 17 08:07:33.587126 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:33.586954 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz"] Apr 17 08:07:33.589908 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:07:33.589878 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1045459_8457_44e7_beff_c79ebd37da50.slice/crio-6838624bd12402ff926ed7fc92b9ca525021dea9fa7da10c6b03faba60736859 WatchSource:0}: Error finding container 6838624bd12402ff926ed7fc92b9ca525021dea9fa7da10c6b03faba60736859: Status 404 returned error can't find the container with id 6838624bd12402ff926ed7fc92b9ca525021dea9fa7da10c6b03faba60736859 Apr 17 08:07:34.462230 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:34.462192 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" event={"ID":"d1045459-8457-44e7-beff-c79ebd37da50","Type":"ContainerStarted","Data":"6838624bd12402ff926ed7fc92b9ca525021dea9fa7da10c6b03faba60736859"} Apr 17 08:07:41.478454 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.478415 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-54b66fb959-hjgwc" podUID="a5898bf4-855b-4f8e-8d90-6d79b1582535" containerName="console" containerID="cri-o://4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641" gracePeriod=15 Apr 17 08:07:41.484861 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.484827 2583 generic.go:358] "Generic (PLEG): container finished" podID="d1045459-8457-44e7-beff-c79ebd37da50" containerID="9c6cc5341ca922864156007906c1b211ef7034a4cfb1512f6e81ca1a72807002" exitCode=0 Apr 17 08:07:41.484985 ip-10-0-138-54 kubenswrapper[2583]: I0417 
08:07:41.484902 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" event={"ID":"d1045459-8457-44e7-beff-c79ebd37da50","Type":"ContainerDied","Data":"9c6cc5341ca922864156007906c1b211ef7034a4cfb1512f6e81ca1a72807002"} Apr 17 08:07:41.716711 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.716689 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54b66fb959-hjgwc_a5898bf4-855b-4f8e-8d90-6d79b1582535/console/0.log" Apr 17 08:07:41.716847 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.716752 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54b66fb959-hjgwc" Apr 17 08:07:41.864789 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.864696 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-oauth-serving-cert\") pod \"a5898bf4-855b-4f8e-8d90-6d79b1582535\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " Apr 17 08:07:41.864789 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.864738 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-serving-cert\") pod \"a5898bf4-855b-4f8e-8d90-6d79b1582535\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " Apr 17 08:07:41.864789 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.864766 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-service-ca\") pod \"a5898bf4-855b-4f8e-8d90-6d79b1582535\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " Apr 17 08:07:41.864789 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.864786 2583 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-trusted-ca-bundle\") pod \"a5898bf4-855b-4f8e-8d90-6d79b1582535\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " Apr 17 08:07:41.865102 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.864812 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-222jl\" (UniqueName: \"kubernetes.io/projected/a5898bf4-855b-4f8e-8d90-6d79b1582535-kube-api-access-222jl\") pod \"a5898bf4-855b-4f8e-8d90-6d79b1582535\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " Apr 17 08:07:41.865102 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.864829 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-config\") pod \"a5898bf4-855b-4f8e-8d90-6d79b1582535\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " Apr 17 08:07:41.865102 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.864887 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-oauth-config\") pod \"a5898bf4-855b-4f8e-8d90-6d79b1582535\" (UID: \"a5898bf4-855b-4f8e-8d90-6d79b1582535\") " Apr 17 08:07:41.865401 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.865367 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-service-ca" (OuterVolumeSpecName: "service-ca") pod "a5898bf4-855b-4f8e-8d90-6d79b1582535" (UID: "a5898bf4-855b-4f8e-8d90-6d79b1582535"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:07:41.865590 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.865543 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a5898bf4-855b-4f8e-8d90-6d79b1582535" (UID: "a5898bf4-855b-4f8e-8d90-6d79b1582535"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:07:41.865689 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.865631 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-config" (OuterVolumeSpecName: "console-config") pod "a5898bf4-855b-4f8e-8d90-6d79b1582535" (UID: "a5898bf4-855b-4f8e-8d90-6d79b1582535"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:07:41.865689 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.865660 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a5898bf4-855b-4f8e-8d90-6d79b1582535" (UID: "a5898bf4-855b-4f8e-8d90-6d79b1582535"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 08:07:41.866948 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.865894 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-oauth-serving-cert\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\""
Apr 17 08:07:41.866948 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.865926 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-service-ca\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\""
Apr 17 08:07:41.866948 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.865942 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-trusted-ca-bundle\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\""
Apr 17 08:07:41.866948 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.865958 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-config\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\""
Apr 17 08:07:41.871582 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.871304 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a5898bf4-855b-4f8e-8d90-6d79b1582535" (UID: "a5898bf4-855b-4f8e-8d90-6d79b1582535"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:07:41.871862 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.871837 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a5898bf4-855b-4f8e-8d90-6d79b1582535" (UID: "a5898bf4-855b-4f8e-8d90-6d79b1582535"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:07:41.871930 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.871840 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5898bf4-855b-4f8e-8d90-6d79b1582535-kube-api-access-222jl" (OuterVolumeSpecName: "kube-api-access-222jl") pod "a5898bf4-855b-4f8e-8d90-6d79b1582535" (UID: "a5898bf4-855b-4f8e-8d90-6d79b1582535"). InnerVolumeSpecName "kube-api-access-222jl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:07:41.966383 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.966341 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-oauth-config\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\""
Apr 17 08:07:41.966383 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.966377 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5898bf4-855b-4f8e-8d90-6d79b1582535-console-serving-cert\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\""
Apr 17 08:07:41.966383 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:41.966388 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-222jl\" (UniqueName: \"kubernetes.io/projected/a5898bf4-855b-4f8e-8d90-6d79b1582535-kube-api-access-222jl\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\""
Apr 17 08:07:42.385874 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.385847 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54b66fb959-hjgwc_a5898bf4-855b-4f8e-8d90-6d79b1582535/console/0.log"
Apr 17 08:07:42.387503 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.387477 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54b66fb959-hjgwc_a5898bf4-855b-4f8e-8d90-6d79b1582535/console/0.log"
Apr 17 08:07:42.393795 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.393744 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/2.log"
Apr 17 08:07:42.395061 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.395024 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/2.log"
Apr 17 08:07:42.397592 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.397568 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/ovn-acl-logging/0.log"
Apr 17 08:07:42.398422 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.398401 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/ovn-acl-logging/0.log"
Apr 17 08:07:42.405626 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.405606 2583 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 08:07:42.488874 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.488706 2583 generic.go:358] "Generic (PLEG): container finished" podID="a5898bf4-855b-4f8e-8d90-6d79b1582535" containerID="4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641" exitCode=2
Apr 17 08:07:42.488874 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.488744 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54b66fb959-hjgwc" event={"ID":"a5898bf4-855b-4f8e-8d90-6d79b1582535","Type":"ContainerDied","Data":"4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641"}
Apr 17 08:07:42.488874 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.488792 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54b66fb959-hjgwc" event={"ID":"a5898bf4-855b-4f8e-8d90-6d79b1582535","Type":"ContainerDied","Data":"f311209d621945eff6001f7a1342ae8801101a77de0c70256be5a364fff4ca03"}
Apr 17 08:07:42.488874 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.488797 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54b66fb959-hjgwc"
Apr 17 08:07:42.488874 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.488814 2583 scope.go:117] "RemoveContainer" containerID="4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641"
Apr 17 08:07:42.501010 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.500965 2583 scope.go:117] "RemoveContainer" containerID="4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641"
Apr 17 08:07:42.501247 ip-10-0-138-54 kubenswrapper[2583]: E0417 08:07:42.501210 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641\": container with ID starting with 4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641 not found: ID does not exist" containerID="4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641"
Apr 17 08:07:42.501344 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.501253 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641"} err="failed to get container status \"4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641\": rpc error: code = NotFound desc = could not find container \"4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641\": container with ID starting with 4b84576a2c9fd56832143c4ebd3c14c0d7b0c56e08c61e5451d84bba43828641 not found: ID does not exist"
Apr 17 08:07:42.515326 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.515301 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54b66fb959-hjgwc"]
Apr 17 08:07:42.518979 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:42.518952 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54b66fb959-hjgwc"]
Apr 17 08:07:43.494054 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:43.494002 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" event={"ID":"d1045459-8457-44e7-beff-c79ebd37da50","Type":"ContainerStarted","Data":"0b3de79a803533ac31adefbe21f32287516d594b2870dfafa7d70444b57c5053"}
Apr 17 08:07:43.494935 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:43.494919 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 08:07:44.499274 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:44.499239 2583 generic.go:358] "Generic (PLEG): container finished" podID="d1045459-8457-44e7-beff-c79ebd37da50" containerID="0b3de79a803533ac31adefbe21f32287516d594b2870dfafa7d70444b57c5053" exitCode=0
Apr 17 08:07:44.499646 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:44.499319 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" event={"ID":"d1045459-8457-44e7-beff-c79ebd37da50","Type":"ContainerDied","Data":"0b3de79a803533ac31adefbe21f32287516d594b2870dfafa7d70444b57c5053"}
Apr 17 08:07:44.514004 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:44.513975 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5898bf4-855b-4f8e-8d90-6d79b1582535" path="/var/lib/kubelet/pods/a5898bf4-855b-4f8e-8d90-6d79b1582535/volumes"
Apr 17 08:07:51.523932 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:51.523876 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" event={"ID":"d1045459-8457-44e7-beff-c79ebd37da50","Type":"ContainerStarted","Data":"51838d368e355e0f0e80265267c46c607d871e86ce46977366a4a7ce0f7a50de"}
Apr 17 08:07:51.538229 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:51.538170 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" podStartSLOduration=0.679853485 podStartE2EDuration="18.538151397s" podCreationTimestamp="2026-04-17 08:07:33 +0000 UTC" firstStartedPulling="2026-04-17 08:07:33.592136107 +0000 UTC m=+291.678918774" lastFinishedPulling="2026-04-17 08:07:51.450434023 +0000 UTC m=+309.537216686" observedRunningTime="2026-04-17 08:07:51.537061664 +0000 UTC m=+309.623844347" watchObservedRunningTime="2026-04-17 08:07:51.538151397 +0000 UTC m=+309.624934084"
Apr 17 08:07:52.527591 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:52.527547 2583 generic.go:358] "Generic (PLEG): container finished" podID="d1045459-8457-44e7-beff-c79ebd37da50" containerID="51838d368e355e0f0e80265267c46c607d871e86ce46977366a4a7ce0f7a50de" exitCode=0
Apr 17 08:07:52.527974 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:52.527608 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" event={"ID":"d1045459-8457-44e7-beff-c79ebd37da50","Type":"ContainerDied","Data":"51838d368e355e0f0e80265267c46c607d871e86ce46977366a4a7ce0f7a50de"}
Apr 17 08:07:53.652274 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:53.652250 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz"
Apr 17 08:07:53.755294 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:53.755260 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zmm8\" (UniqueName: \"kubernetes.io/projected/d1045459-8457-44e7-beff-c79ebd37da50-kube-api-access-5zmm8\") pod \"d1045459-8457-44e7-beff-c79ebd37da50\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") "
Apr 17 08:07:53.755459 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:53.755313 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-util\") pod \"d1045459-8457-44e7-beff-c79ebd37da50\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") "
Apr 17 08:07:53.755459 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:53.755351 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-bundle\") pod \"d1045459-8457-44e7-beff-c79ebd37da50\" (UID: \"d1045459-8457-44e7-beff-c79ebd37da50\") "
Apr 17 08:07:53.755984 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:53.755955 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-bundle" (OuterVolumeSpecName: "bundle") pod "d1045459-8457-44e7-beff-c79ebd37da50" (UID: "d1045459-8457-44e7-beff-c79ebd37da50"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:07:53.757473 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:53.757448 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1045459-8457-44e7-beff-c79ebd37da50-kube-api-access-5zmm8" (OuterVolumeSpecName: "kube-api-access-5zmm8") pod "d1045459-8457-44e7-beff-c79ebd37da50" (UID: "d1045459-8457-44e7-beff-c79ebd37da50"). InnerVolumeSpecName "kube-api-access-5zmm8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:07:53.759340 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:53.759312 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-util" (OuterVolumeSpecName: "util") pod "d1045459-8457-44e7-beff-c79ebd37da50" (UID: "d1045459-8457-44e7-beff-c79ebd37da50"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:07:53.855981 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:53.855891 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-bundle\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\""
Apr 17 08:07:53.855981 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:53.855925 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5zmm8\" (UniqueName: \"kubernetes.io/projected/d1045459-8457-44e7-beff-c79ebd37da50-kube-api-access-5zmm8\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\""
Apr 17 08:07:53.855981 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:53.855934 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1045459-8457-44e7-beff-c79ebd37da50-util\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\""
Apr 17 08:07:54.533958 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:54.533924 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz" event={"ID":"d1045459-8457-44e7-beff-c79ebd37da50","Type":"ContainerDied","Data":"6838624bd12402ff926ed7fc92b9ca525021dea9fa7da10c6b03faba60736859"}
Apr 17 08:07:54.533958 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:54.533957 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6838624bd12402ff926ed7fc92b9ca525021dea9fa7da10c6b03faba60736859"
Apr 17 08:07:54.534151 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:07:54.533962 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5w29bz"
Apr 17 08:08:00.486883 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.486851 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh"]
Apr 17 08:08:00.487288 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.487159 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1045459-8457-44e7-beff-c79ebd37da50" containerName="pull"
Apr 17 08:08:00.487288 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.487171 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1045459-8457-44e7-beff-c79ebd37da50" containerName="pull"
Apr 17 08:08:00.487288 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.487187 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1045459-8457-44e7-beff-c79ebd37da50" containerName="extract"
Apr 17 08:08:00.487288 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.487192 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1045459-8457-44e7-beff-c79ebd37da50" containerName="extract"
Apr 17 08:08:00.487288 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.487199 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5898bf4-855b-4f8e-8d90-6d79b1582535" containerName="console"
Apr 17 08:08:00.487288 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.487207 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5898bf4-855b-4f8e-8d90-6d79b1582535" containerName="console"
Apr 17 08:08:00.487288 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.487225 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1045459-8457-44e7-beff-c79ebd37da50" containerName="util"
Apr 17 08:08:00.487288 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.487230 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1045459-8457-44e7-beff-c79ebd37da50" containerName="util"
Apr 17 08:08:00.487288 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.487275 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1045459-8457-44e7-beff-c79ebd37da50" containerName="extract"
Apr 17 08:08:00.487288 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.487285 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5898bf4-855b-4f8e-8d90-6d79b1582535" containerName="console"
Apr 17 08:08:00.489118 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.489101 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh"
Apr 17 08:08:00.490907 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.490877 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 17 08:08:00.491029 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.491018 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 17 08:08:00.491230 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.491211 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-cll2s\""
Apr 17 08:08:00.501806 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.501786 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh"]
Apr 17 08:08:00.607954 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.607912 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zchw\" (UniqueName: \"kubernetes.io/projected/8b6c0d73-40e3-420d-a535-36fb512a1dbe-kube-api-access-5zchw\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-2g5vh\" (UID: \"8b6c0d73-40e3-420d-a535-36fb512a1dbe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh"
Apr 17 08:08:00.608185 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.608022 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b6c0d73-40e3-420d-a535-36fb512a1dbe-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-2g5vh\" (UID: \"8b6c0d73-40e3-420d-a535-36fb512a1dbe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh"
Apr 17 08:08:00.708821 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.708781 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zchw\" (UniqueName: \"kubernetes.io/projected/8b6c0d73-40e3-420d-a535-36fb512a1dbe-kube-api-access-5zchw\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-2g5vh\" (UID: \"8b6c0d73-40e3-420d-a535-36fb512a1dbe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh"
Apr 17 08:08:00.708821 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.708825 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b6c0d73-40e3-420d-a535-36fb512a1dbe-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-2g5vh\" (UID: \"8b6c0d73-40e3-420d-a535-36fb512a1dbe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh"
Apr 17 08:08:00.709248 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.709230 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b6c0d73-40e3-420d-a535-36fb512a1dbe-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-2g5vh\" (UID: \"8b6c0d73-40e3-420d-a535-36fb512a1dbe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh"
Apr 17 08:08:00.716172 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.716149 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zchw\" (UniqueName: \"kubernetes.io/projected/8b6c0d73-40e3-420d-a535-36fb512a1dbe-kube-api-access-5zchw\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-2g5vh\" (UID: \"8b6c0d73-40e3-420d-a535-36fb512a1dbe\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh"
Apr 17 08:08:00.798268 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.798190 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh"
Apr 17 08:08:00.924758 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:00.924673 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh"]
Apr 17 08:08:00.927786 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:08:00.927758 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b6c0d73_40e3_420d_a535_36fb512a1dbe.slice/crio-9fd8053cc9475590057d2fc86abb137004b60b0b63655bbd55e7c0b218919c20 WatchSource:0}: Error finding container 9fd8053cc9475590057d2fc86abb137004b60b0b63655bbd55e7c0b218919c20: Status 404 returned error can't find the container with id 9fd8053cc9475590057d2fc86abb137004b60b0b63655bbd55e7c0b218919c20
Apr 17 08:08:01.559392 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:01.559341 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh" event={"ID":"8b6c0d73-40e3-420d-a535-36fb512a1dbe","Type":"ContainerStarted","Data":"9fd8053cc9475590057d2fc86abb137004b60b0b63655bbd55e7c0b218919c20"}
Apr 17 08:08:03.568553 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:03.568513 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh" event={"ID":"8b6c0d73-40e3-420d-a535-36fb512a1dbe","Type":"ContainerStarted","Data":"7eaaf1c8d2e742d862aa5cb197c0d60eae33fd03451f79f7edc04caa6c9390ee"}
Apr 17 08:08:03.585518 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:03.585471 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-2g5vh" podStartSLOduration=1.907652774 podStartE2EDuration="3.5854579s" podCreationTimestamp="2026-04-17 08:08:00 +0000 UTC" firstStartedPulling="2026-04-17 08:08:00.930025255 +0000 UTC m=+319.016807935" lastFinishedPulling="2026-04-17 08:08:02.607830383 +0000 UTC m=+320.694613061" observedRunningTime="2026-04-17 08:08:03.583959441 +0000 UTC m=+321.670742125" watchObservedRunningTime="2026-04-17 08:08:03.5854579 +0000 UTC m=+321.672240583"
Apr 17 08:08:08.787120 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.787085 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xr76p"]
Apr 17 08:08:08.789606 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.789591 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p"
Apr 17 08:08:08.791669 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.791646 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 08:08:08.791798 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.791705 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 08:08:08.792168 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.792149 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-jwz9t\""
Apr 17 08:08:08.799073 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.799032 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xr76p"]
Apr 17 08:08:08.876390 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.876356 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbrxh\" (UniqueName: \"kubernetes.io/projected/2f7dd4d5-7a30-45f4-908e-2e2fcd22b592-kube-api-access-jbrxh\") pod \"cert-manager-webhook-597b96b99b-xr76p\" (UID: \"2f7dd4d5-7a30-45f4-908e-2e2fcd22b592\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p"
Apr 17 08:08:08.876569 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.876416 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f7dd4d5-7a30-45f4-908e-2e2fcd22b592-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xr76p\" (UID: \"2f7dd4d5-7a30-45f4-908e-2e2fcd22b592\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p"
Apr 17 08:08:08.977689 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.977647 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbrxh\" (UniqueName: \"kubernetes.io/projected/2f7dd4d5-7a30-45f4-908e-2e2fcd22b592-kube-api-access-jbrxh\") pod \"cert-manager-webhook-597b96b99b-xr76p\" (UID: \"2f7dd4d5-7a30-45f4-908e-2e2fcd22b592\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p"
Apr 17 08:08:08.977842 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.977723 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f7dd4d5-7a30-45f4-908e-2e2fcd22b592-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xr76p\" (UID: \"2f7dd4d5-7a30-45f4-908e-2e2fcd22b592\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p"
Apr 17 08:08:08.988197 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.988172 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f7dd4d5-7a30-45f4-908e-2e2fcd22b592-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xr76p\" (UID: \"2f7dd4d5-7a30-45f4-908e-2e2fcd22b592\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p"
Apr 17 08:08:08.988328 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:08.988251 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbrxh\" (UniqueName: \"kubernetes.io/projected/2f7dd4d5-7a30-45f4-908e-2e2fcd22b592-kube-api-access-jbrxh\") pod \"cert-manager-webhook-597b96b99b-xr76p\" (UID: \"2f7dd4d5-7a30-45f4-908e-2e2fcd22b592\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p"
Apr 17 08:08:09.099055 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:09.098960 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p"
Apr 17 08:08:09.217586 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:09.217559 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xr76p"]
Apr 17 08:08:09.220387 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:08:09.220360 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f7dd4d5_7a30_45f4_908e_2e2fcd22b592.slice/crio-974c16f6fc98bc95b89928c7392e27ca5baccc3eca63ba1ea5a1c24f8abef37b WatchSource:0}: Error finding container 974c16f6fc98bc95b89928c7392e27ca5baccc3eca63ba1ea5a1c24f8abef37b: Status 404 returned error can't find the container with id 974c16f6fc98bc95b89928c7392e27ca5baccc3eca63ba1ea5a1c24f8abef37b
Apr 17 08:08:09.590852 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:09.590818 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p" event={"ID":"2f7dd4d5-7a30-45f4-908e-2e2fcd22b592","Type":"ContainerStarted","Data":"974c16f6fc98bc95b89928c7392e27ca5baccc3eca63ba1ea5a1c24f8abef37b"}
Apr 17 08:08:12.603170 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:12.603131 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p" event={"ID":"2f7dd4d5-7a30-45f4-908e-2e2fcd22b592","Type":"ContainerStarted","Data":"8cfd23c0a7ae3c6e72c9a78b1e2458d1b26f3588cbee356e2bffcc70304e4d85"}
Apr 17 08:08:12.603170 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:12.603175 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p"
Apr 17 08:08:12.617447 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:12.617394 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p" podStartSLOduration=1.9197897579999998 podStartE2EDuration="4.617376305s" podCreationTimestamp="2026-04-17 08:08:08 +0000 UTC" firstStartedPulling="2026-04-17 08:08:09.22224141 +0000 UTC m=+327.309024073" lastFinishedPulling="2026-04-17 08:08:11.919827956 +0000 UTC m=+330.006610620" observedRunningTime="2026-04-17 08:08:12.615929818 +0000 UTC m=+330.702712503" watchObservedRunningTime="2026-04-17 08:08:12.617376305 +0000 UTC m=+330.704158992"
Apr 17 08:08:15.270566 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.270530 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-nccjv"]
Apr 17 08:08:15.314839 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.314801 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-nccjv"]
Apr 17 08:08:15.315080 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.314913 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nccjv"
Apr 17 08:08:15.317445 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.317424 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-66gr8\""
Apr 17 08:08:15.431167 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.431124 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsq89\" (UniqueName: \"kubernetes.io/projected/85060616-501b-4559-8f5a-1b74b7a3a530-kube-api-access-rsq89\") pod \"cert-manager-759f64656b-nccjv\" (UID: \"85060616-501b-4559-8f5a-1b74b7a3a530\") " pod="cert-manager/cert-manager-759f64656b-nccjv"
Apr 17 08:08:15.431335 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.431185 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85060616-501b-4559-8f5a-1b74b7a3a530-bound-sa-token\") pod \"cert-manager-759f64656b-nccjv\" (UID: \"85060616-501b-4559-8f5a-1b74b7a3a530\") " pod="cert-manager/cert-manager-759f64656b-nccjv"
Apr 17 08:08:15.532637 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.532546 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsq89\" (UniqueName: \"kubernetes.io/projected/85060616-501b-4559-8f5a-1b74b7a3a530-kube-api-access-rsq89\") pod \"cert-manager-759f64656b-nccjv\" (UID: \"85060616-501b-4559-8f5a-1b74b7a3a530\") " pod="cert-manager/cert-manager-759f64656b-nccjv"
Apr 17 08:08:15.532637 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.532597 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85060616-501b-4559-8f5a-1b74b7a3a530-bound-sa-token\") pod \"cert-manager-759f64656b-nccjv\" (UID: \"85060616-501b-4559-8f5a-1b74b7a3a530\") " pod="cert-manager/cert-manager-759f64656b-nccjv"
Apr 17 08:08:15.539812 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.539781 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsq89\" (UniqueName: \"kubernetes.io/projected/85060616-501b-4559-8f5a-1b74b7a3a530-kube-api-access-rsq89\") pod \"cert-manager-759f64656b-nccjv\" (UID: \"85060616-501b-4559-8f5a-1b74b7a3a530\") " pod="cert-manager/cert-manager-759f64656b-nccjv"
Apr 17 08:08:15.539956 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.539784 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85060616-501b-4559-8f5a-1b74b7a3a530-bound-sa-token\") pod \"cert-manager-759f64656b-nccjv\" (UID: \"85060616-501b-4559-8f5a-1b74b7a3a530\") " pod="cert-manager/cert-manager-759f64656b-nccjv"
Apr 17 08:08:15.624280 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.624251 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nccjv"
Apr 17 08:08:15.740190 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:15.740159 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-nccjv"]
Apr 17 08:08:15.743502 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:08:15.743474 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85060616_501b_4559_8f5a_1b74b7a3a530.slice/crio-b025e5f094fca7257bf3a94e0cd33ddbdf958ad7409fa5e3409985d9d7731ba8 WatchSource:0}: Error finding container b025e5f094fca7257bf3a94e0cd33ddbdf958ad7409fa5e3409985d9d7731ba8: Status 404 returned error can't find the container with id b025e5f094fca7257bf3a94e0cd33ddbdf958ad7409fa5e3409985d9d7731ba8
Apr 17 08:08:16.615959 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:16.615916 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nccjv" event={"ID":"85060616-501b-4559-8f5a-1b74b7a3a530","Type":"ContainerStarted","Data":"1fb5b92177d2f33284ae4f5fe1b7b775d860124b65f99f30f76606314311fb62"}
Apr 17 08:08:16.615959 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:16.615964 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nccjv" event={"ID":"85060616-501b-4559-8f5a-1b74b7a3a530","Type":"ContainerStarted","Data":"b025e5f094fca7257bf3a94e0cd33ddbdf958ad7409fa5e3409985d9d7731ba8"}
Apr 17 08:08:17.735193 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.735138 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-nccjv" podStartSLOduration=2.735119567 podStartE2EDuration="2.735119567s" podCreationTimestamp="2026-04-17 08:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:08:16.628712214 +0000 UTC m=+334.715494912" watchObservedRunningTime="2026-04-17 08:08:17.735119567 +0000 UTC m=+335.821902252"
Apr 17 08:08:17.736571 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.736548 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk"]
Apr 17 08:08:17.739477 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.739463 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk"
Apr 17 08:08:17.741377 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.741357 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 08:08:17.741446 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.741357 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 08:08:17.741744 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.741727 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-jxx47\""
Apr 17 08:08:17.746555 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.746534 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk"]
Apr 17 08:08:17.849898 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.849849 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzmhg\" (UniqueName: \"kubernetes.io/projected/ab77fcd1-5f75-43bb-800b-cd44365f0aff-kube-api-access-kzmhg\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk"
Apr 17 08:08:17.849898 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.849900 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk"
Apr 17 08:08:17.850159 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.849967 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk"
Apr 17 08:08:17.951159 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.951119 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzmhg\" (UniqueName: \"kubernetes.io/projected/ab77fcd1-5f75-43bb-800b-cd44365f0aff-kube-api-access-kzmhg\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk"
Apr 17 08:08:17.951159 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.951160 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk"
Apr 17 08:08:17.951413 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.951189 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk"
Apr 17 08:08:17.951586 ip-10-0-138-54 kubenswrapper[2583]:
I0417 08:08:17.951564 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk" Apr 17 08:08:17.951586 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.951582 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk" Apr 17 08:08:17.958175 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:17.958147 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzmhg\" (UniqueName: \"kubernetes.io/projected/ab77fcd1-5f75-43bb-800b-cd44365f0aff-kube-api-access-kzmhg\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk" Apr 17 08:08:18.049590 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:18.049505 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk" Apr 17 08:08:18.169190 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:18.169082 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk"] Apr 17 08:08:18.171610 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:08:18.171585 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab77fcd1_5f75_43bb_800b_cd44365f0aff.slice/crio-0b3bed0d8c78df9b6dd838cac2d00b9792182ca337d89a95cefb14dc83daad08 WatchSource:0}: Error finding container 0b3bed0d8c78df9b6dd838cac2d00b9792182ca337d89a95cefb14dc83daad08: Status 404 returned error can't find the container with id 0b3bed0d8c78df9b6dd838cac2d00b9792182ca337d89a95cefb14dc83daad08 Apr 17 08:08:18.608488 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:18.608457 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-xr76p" Apr 17 08:08:18.624572 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:18.624538 2583 generic.go:358] "Generic (PLEG): container finished" podID="ab77fcd1-5f75-43bb-800b-cd44365f0aff" containerID="d10930159a4c544da8b140604220162a1ab64a00fa7fd65944d39e9124fdfcad" exitCode=0 Apr 17 08:08:18.624738 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:18.624596 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk" event={"ID":"ab77fcd1-5f75-43bb-800b-cd44365f0aff","Type":"ContainerDied","Data":"d10930159a4c544da8b140604220162a1ab64a00fa7fd65944d39e9124fdfcad"} Apr 17 08:08:18.624738 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:18.624616 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk" 
event={"ID":"ab77fcd1-5f75-43bb-800b-cd44365f0aff","Type":"ContainerStarted","Data":"0b3bed0d8c78df9b6dd838cac2d00b9792182ca337d89a95cefb14dc83daad08"} Apr 17 08:08:21.639253 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:21.639220 2583 generic.go:358] "Generic (PLEG): container finished" podID="ab77fcd1-5f75-43bb-800b-cd44365f0aff" containerID="2bd914d11c64381442c1bf7068a27d1a571e2b56865ca5166fe63de5393949ad" exitCode=0 Apr 17 08:08:21.639640 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:21.639298 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk" event={"ID":"ab77fcd1-5f75-43bb-800b-cd44365f0aff","Type":"ContainerDied","Data":"2bd914d11c64381442c1bf7068a27d1a571e2b56865ca5166fe63de5393949ad"} Apr 17 08:08:22.644469 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:22.644438 2583 generic.go:358] "Generic (PLEG): container finished" podID="ab77fcd1-5f75-43bb-800b-cd44365f0aff" containerID="a33f54831a0dbd67539ea1717ad46f0c61b086d30c2ebd1c0f2fa861706151ea" exitCode=0 Apr 17 08:08:22.644847 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:22.644512 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk" event={"ID":"ab77fcd1-5f75-43bb-800b-cd44365f0aff","Type":"ContainerDied","Data":"a33f54831a0dbd67539ea1717ad46f0c61b086d30c2ebd1c0f2fa861706151ea"} Apr 17 08:08:23.770312 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:23.770287 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk" Apr 17 08:08:23.899780 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:23.899680 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-util\") pod \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " Apr 17 08:08:23.899780 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:23.899737 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzmhg\" (UniqueName: \"kubernetes.io/projected/ab77fcd1-5f75-43bb-800b-cd44365f0aff-kube-api-access-kzmhg\") pod \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " Apr 17 08:08:23.899780 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:23.899767 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-bundle\") pod \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\" (UID: \"ab77fcd1-5f75-43bb-800b-cd44365f0aff\") " Apr 17 08:08:23.900220 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:23.900194 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-bundle" (OuterVolumeSpecName: "bundle") pod "ab77fcd1-5f75-43bb-800b-cd44365f0aff" (UID: "ab77fcd1-5f75-43bb-800b-cd44365f0aff"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:08:23.901912 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:23.901884 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab77fcd1-5f75-43bb-800b-cd44365f0aff-kube-api-access-kzmhg" (OuterVolumeSpecName: "kube-api-access-kzmhg") pod "ab77fcd1-5f75-43bb-800b-cd44365f0aff" (UID: "ab77fcd1-5f75-43bb-800b-cd44365f0aff"). InnerVolumeSpecName "kube-api-access-kzmhg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:08:24.000429 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:24.000394 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kzmhg\" (UniqueName: \"kubernetes.io/projected/ab77fcd1-5f75-43bb-800b-cd44365f0aff-kube-api-access-kzmhg\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:08:24.000429 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:24.000427 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-bundle\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:08:24.257696 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:24.257609 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-util" (OuterVolumeSpecName: "util") pod "ab77fcd1-5f75-43bb-800b-cd44365f0aff" (UID: "ab77fcd1-5f75-43bb-800b-cd44365f0aff"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:08:24.302394 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:24.302361 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab77fcd1-5f75-43bb-800b-cd44365f0aff-util\") on node \"ip-10-0-138-54.ec2.internal\" DevicePath \"\"" Apr 17 08:08:24.652300 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:24.652249 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk" event={"ID":"ab77fcd1-5f75-43bb-800b-cd44365f0aff","Type":"ContainerDied","Data":"0b3bed0d8c78df9b6dd838cac2d00b9792182ca337d89a95cefb14dc83daad08"} Apr 17 08:08:24.652300 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:24.652283 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e58djk" Apr 17 08:08:24.652300 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:24.652294 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b3bed0d8c78df9b6dd838cac2d00b9792182ca337d89a95cefb14dc83daad08" Apr 17 08:08:43.953266 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.953231 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6"] Apr 17 08:08:43.953641 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.953552 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab77fcd1-5f75-43bb-800b-cd44365f0aff" containerName="pull" Apr 17 08:08:43.953641 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.953563 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab77fcd1-5f75-43bb-800b-cd44365f0aff" containerName="pull" Apr 17 08:08:43.953641 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.953584 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="ab77fcd1-5f75-43bb-800b-cd44365f0aff" containerName="util" Apr 17 08:08:43.953641 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.953589 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab77fcd1-5f75-43bb-800b-cd44365f0aff" containerName="util" Apr 17 08:08:43.953641 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.953598 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab77fcd1-5f75-43bb-800b-cd44365f0aff" containerName="extract" Apr 17 08:08:43.953641 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.953604 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab77fcd1-5f75-43bb-800b-cd44365f0aff" containerName="extract" Apr 17 08:08:43.953825 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.953677 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab77fcd1-5f75-43bb-800b-cd44365f0aff" containerName="extract" Apr 17 08:08:43.958468 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.958452 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:43.961056 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.961012 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 17 08:08:43.961178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.961125 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 17 08:08:43.961178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.961137 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 17 08:08:43.961178 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.961167 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 17 08:08:43.961340 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.961168 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-zdwwn\"" Apr 17 08:08:43.961586 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.961535 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 17 08:08:43.963529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:43.963508 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6"] Apr 17 08:08:44.065392 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.065350 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-metrics-certs\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: 
\"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.065392 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.065393 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-manager-config\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: \"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.065610 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.065470 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-cert\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: \"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.065610 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.065543 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75b9r\" (UniqueName: \"kubernetes.io/projected/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-kube-api-access-75b9r\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: \"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.166655 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.166617 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-cert\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: \"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.166854 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:08:44.166671 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75b9r\" (UniqueName: \"kubernetes.io/projected/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-kube-api-access-75b9r\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: \"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.166854 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.166716 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-metrics-certs\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: \"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.166854 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.166737 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-manager-config\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: \"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.167477 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.167456 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-manager-config\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: \"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.169269 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.169242 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-metrics-certs\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: \"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.169400 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.169379 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-cert\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: \"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.173981 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.173956 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75b9r\" (UniqueName: \"kubernetes.io/projected/4bcb351d-bf18-4f47-9cc0-5e266e2073ce-kube-api-access-75b9r\") pod \"jobset-controller-manager-74d99cf557-vs8p6\" (UID: \"4bcb351d-bf18-4f47-9cc0-5e266e2073ce\") " pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.268905 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.268816 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:44.389461 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.389430 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6"] Apr 17 08:08:44.392400 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:08:44.392371 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bcb351d_bf18_4f47_9cc0_5e266e2073ce.slice/crio-f2ee213944db168fa60c6b5b8b25e2407662f7afda73fbfa07fae6f5c3725294 WatchSource:0}: Error finding container f2ee213944db168fa60c6b5b8b25e2407662f7afda73fbfa07fae6f5c3725294: Status 404 returned error can't find the container with id f2ee213944db168fa60c6b5b8b25e2407662f7afda73fbfa07fae6f5c3725294 Apr 17 08:08:44.718054 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:44.718008 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" event={"ID":"4bcb351d-bf18-4f47-9cc0-5e266e2073ce","Type":"ContainerStarted","Data":"f2ee213944db168fa60c6b5b8b25e2407662f7afda73fbfa07fae6f5c3725294"} Apr 17 08:08:46.726355 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:46.726324 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" event={"ID":"4bcb351d-bf18-4f47-9cc0-5e266e2073ce","Type":"ContainerStarted","Data":"74a3d64e519dde8f15eac9c6dfe22f0244fad1305301967800258b60f06a0225"} Apr 17 08:08:46.726733 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:46.726439 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:08:46.741902 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:46.741537 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" podStartSLOduration=1.542267788 podStartE2EDuration="3.741519593s" podCreationTimestamp="2026-04-17 08:08:43 +0000 UTC" firstStartedPulling="2026-04-17 08:08:44.394834255 +0000 UTC m=+362.481616923" lastFinishedPulling="2026-04-17 08:08:46.594086065 +0000 UTC m=+364.680868728" observedRunningTime="2026-04-17 08:08:46.739305463 +0000 UTC m=+364.826088147" watchObservedRunningTime="2026-04-17 08:08:46.741519593 +0000 UTC m=+364.828302279" Apr 17 08:08:57.741091 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:08:57.741034 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-74d99cf557-vs8p6" Apr 17 08:10:16.605655 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:16.605584 2583 ???:1] "http: TLS handshake error from 10.0.128.245:41404: EOF" Apr 17 08:10:16.607732 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:16.607672 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bj7ml_903fbe6d-9629-4e1b-b721-4fa9b0af02bd/global-pull-secret-syncer/0.log" Apr 17 08:10:16.671973 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:16.671943 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-sptg6_71e69b48-555a-432e-b12a-16dae2f11d3c/konnectivity-agent/0.log" Apr 17 08:10:16.742741 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:16.742711 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-54.ec2.internal_db15b079396bd5a2463864d9d840da9c/haproxy/0.log" Apr 17 08:10:20.029854 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:20.029822 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-v85bk_531f37d5-f8e4-4699-9a6c-ba7867ad1c9b/monitoring-plugin/0.log" Apr 17 08:10:20.141485 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:20.141439 
2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dt96l_2d9d6a08-d2cb-498a-8d4b-777a008e3ef8/node-exporter/0.log" Apr 17 08:10:20.163258 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:20.163182 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dt96l_2d9d6a08-d2cb-498a-8d4b-777a008e3ef8/kube-rbac-proxy/0.log" Apr 17 08:10:20.188547 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:20.188521 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dt96l_2d9d6a08-d2cb-498a-8d4b-777a008e3ef8/init-textfile/0.log" Apr 17 08:10:20.548719 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:20.548634 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-8hrrt_e7e00398-1ae3-4494-af8f-0ca517c6cec3/prometheus-operator/0.log" Apr 17 08:10:20.568516 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:20.568490 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-8hrrt_e7e00398-1ae3-4494-af8f-0ca517c6cec3/kube-rbac-proxy/0.log" Apr 17 08:10:20.704398 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:20.704368 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74455479b4-qx2xr_9b420cbb-74d5-4ac1-9919-a46aedbcd9c7/thanos-query/0.log" Apr 17 08:10:20.724863 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:20.724837 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74455479b4-qx2xr_9b420cbb-74d5-4ac1-9919-a46aedbcd9c7/kube-rbac-proxy-web/0.log" Apr 17 08:10:20.745920 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:20.745892 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74455479b4-qx2xr_9b420cbb-74d5-4ac1-9919-a46aedbcd9c7/kube-rbac-proxy/0.log" Apr 17 08:10:20.770149 ip-10-0-138-54 
kubenswrapper[2583]: I0417 08:10:20.770117 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74455479b4-qx2xr_9b420cbb-74d5-4ac1-9919-a46aedbcd9c7/prom-label-proxy/0.log"
Apr 17 08:10:20.790894 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:20.790852 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74455479b4-qx2xr_9b420cbb-74d5-4ac1-9919-a46aedbcd9c7/kube-rbac-proxy-rules/0.log"
Apr 17 08:10:20.813851 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:20.813776 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-74455479b4-qx2xr_9b420cbb-74d5-4ac1-9919-a46aedbcd9c7/kube-rbac-proxy-metrics/0.log"
Apr 17 08:10:22.389105 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:22.389074 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/2.log"
Apr 17 08:10:22.393547 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:22.393525 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mnhd4_7881d774-cc8e-4aa7-852b-0ef081cb8318/console-operator/3.log"
Apr 17 08:10:22.737614 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:22.737539 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-857d8bfb58-ttv4p_6cd77ce2-86bc-4d87-962a-71d0789bc5ae/console/0.log"
Apr 17 08:10:22.765368 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:22.765340 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-8pfw7_c704e155-2e47-4095-b42d-9a554fe2e495/download-server/0.log"
Apr 17 08:10:23.448630 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.448600 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"]
Apr 17 08:10:23.450687 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.450671 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.452393 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.452369 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-btznw\"/\"openshift-service-ca.crt\""
Apr 17 08:10:23.452491 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.452422 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-btznw\"/\"default-dockercfg-z96wr\""
Apr 17 08:10:23.452889 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.452874 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-btznw\"/\"kube-root-ca.crt\""
Apr 17 08:10:23.461714 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.461665 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"]
Apr 17 08:10:23.533303 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.533252 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-sys\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.533303 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.533305 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-podres\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.533303 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.533324 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdm9k\" (UniqueName: \"kubernetes.io/projected/ba2e3480-98ea-40ea-ba93-491e54464e67-kube-api-access-rdm9k\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.533631 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.533394 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-proc\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.533631 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.533412 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-lib-modules\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.634029 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.633992 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-proc\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.634029 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.634031 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-lib-modules\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.634238 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.634110 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-sys\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.634238 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.634119 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-proc\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.634238 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.634132 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-podres\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.634238 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.634208 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-podres\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.634238 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.634208 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-sys\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.634391 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.634268 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdm9k\" (UniqueName: \"kubernetes.io/projected/ba2e3480-98ea-40ea-ba93-491e54464e67-kube-api-access-rdm9k\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.634391 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.634295 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba2e3480-98ea-40ea-ba93-491e54464e67-lib-modules\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.640972 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.640944 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdm9k\" (UniqueName: \"kubernetes.io/projected/ba2e3480-98ea-40ea-ba93-491e54464e67-kube-api-access-rdm9k\") pod \"perf-node-gather-daemonset-x2pdk\" (UID: \"ba2e3480-98ea-40ea-ba93-491e54464e67\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.761970 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.761887 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:23.810289 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.809874 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-65v7n_d198e1b7-2f54-4936-a0be-c6a8e9e20ea7/dns/0.log"
Apr 17 08:10:23.831168 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.831117 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-65v7n_d198e1b7-2f54-4936-a0be-c6a8e9e20ea7/kube-rbac-proxy/0.log"
Apr 17 08:10:23.886336 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.886312 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"]
Apr 17 08:10:23.889188 ip-10-0-138-54 kubenswrapper[2583]: W0417 08:10:23.889156 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podba2e3480_98ea_40ea_ba93_491e54464e67.slice/crio-7734017d17651c3a0fc9caf79ff220261765cec8e516caa69d2bf06a1afb39b2 WatchSource:0}: Error finding container 7734017d17651c3a0fc9caf79ff220261765cec8e516caa69d2bf06a1afb39b2: Status 404 returned error can't find the container with id 7734017d17651c3a0fc9caf79ff220261765cec8e516caa69d2bf06a1afb39b2
Apr 17 08:10:23.931196 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:23.931166 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q5zfz_ad1b5fde-cf28-4a54-bb33-cb43f425421e/dns-node-resolver/0.log"
Apr 17 08:10:24.057322 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:24.057232 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk" event={"ID":"ba2e3480-98ea-40ea-ba93-491e54464e67","Type":"ContainerStarted","Data":"9c1927a4cbd23598bc10edaeb1f77e5703a9691d475434074d21771b6640f381"}
Apr 17 08:10:24.057322 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:24.057268 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk" event={"ID":"ba2e3480-98ea-40ea-ba93-491e54464e67","Type":"ContainerStarted","Data":"7734017d17651c3a0fc9caf79ff220261765cec8e516caa69d2bf06a1afb39b2"}
Apr 17 08:10:24.057513 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:24.057369 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:24.071068 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:24.071001 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk" podStartSLOduration=1.070985015 podStartE2EDuration="1.070985015s" podCreationTimestamp="2026-04-17 08:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:10:24.069419797 +0000 UTC m=+462.156202483" watchObservedRunningTime="2026-04-17 08:10:24.070985015 +0000 UTC m=+462.157767704"
Apr 17 08:10:24.371084 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:24.370986 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pk77j_5f3f7f47-4941-40cd-88d3-259605376e0e/node-ca/0.log"
Apr 17 08:10:25.051265 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:25.051232 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7c6d4f57f6-pj78j_06a60109-bb64-4cd2-9f4e-2987a8942aad/router/0.log"
Apr 17 08:10:25.379007 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:25.378931 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-m7zgj_b4ebbaa6-1054-4eaf-87ba-d84dc8af620f/serve-healthcheck-canary/0.log"
Apr 17 08:10:25.870529 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:25.870505 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-thndg_865075e4-18ea-4f03-a7e8-c85efbee42bc/kube-rbac-proxy/0.log"
Apr 17 08:10:25.889991 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:25.889965 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-thndg_865075e4-18ea-4f03-a7e8-c85efbee42bc/exporter/0.log"
Apr 17 08:10:25.912213 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:25.912190 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-thndg_865075e4-18ea-4f03-a7e8-c85efbee42bc/extractor/0.log"
Apr 17 08:10:27.434001 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:27.433974 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-74d99cf557-vs8p6_4bcb351d-bf18-4f47-9cc0-5e266e2073ce/manager/0.log"
Apr 17 08:10:30.069388 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:30.069360 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-x2pdk"
Apr 17 08:10:30.706958 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:30.706897 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-fx7kf_2271c85d-959b-40a9-aaca-4f7851b44b73/kube-storage-version-migrator-operator/1.log"
Apr 17 08:10:30.707868 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:30.707849 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-fx7kf_2271c85d-959b-40a9-aaca-4f7851b44b73/kube-storage-version-migrator-operator/0.log"
Apr 17 08:10:31.747345 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:31.747314 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s742h_3a741811-1651-45b8-97c3-04e9be19b874/kube-multus-additional-cni-plugins/0.log"
Apr 17 08:10:31.770683 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:31.770659 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s742h_3a741811-1651-45b8-97c3-04e9be19b874/egress-router-binary-copy/0.log"
Apr 17 08:10:31.793336 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:31.793309 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s742h_3a741811-1651-45b8-97c3-04e9be19b874/cni-plugins/0.log"
Apr 17 08:10:31.816016 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:31.815988 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s742h_3a741811-1651-45b8-97c3-04e9be19b874/bond-cni-plugin/0.log"
Apr 17 08:10:31.841630 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:31.841605 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s742h_3a741811-1651-45b8-97c3-04e9be19b874/routeoverride-cni/0.log"
Apr 17 08:10:31.865670 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:31.865644 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s742h_3a741811-1651-45b8-97c3-04e9be19b874/whereabouts-cni-bincopy/0.log"
Apr 17 08:10:31.885759 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:31.885737 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s742h_3a741811-1651-45b8-97c3-04e9be19b874/whereabouts-cni/0.log"
Apr 17 08:10:32.077969 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:32.077884 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gn8ng_d961e954-15d3-43c1-800e-ea0f8d3c806d/kube-multus/0.log"
Apr 17 08:10:32.163781 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:32.163750 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5zj7l_a358126d-f138-41e0-b8a2-598652e544f5/network-metrics-daemon/0.log"
Apr 17 08:10:32.183990 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:32.183965 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5zj7l_a358126d-f138-41e0-b8a2-598652e544f5/kube-rbac-proxy/0.log"
Apr 17 08:10:33.351358 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:33.351270 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/ovn-controller/0.log"
Apr 17 08:10:33.369920 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:33.369876 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/ovn-acl-logging/0.log"
Apr 17 08:10:33.371917 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:33.371887 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/ovn-acl-logging/1.log"
Apr 17 08:10:33.389857 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:33.389834 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/kube-rbac-proxy-node/0.log"
Apr 17 08:10:33.410156 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:33.410129 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 08:10:33.429317 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:33.429284 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/northd/0.log"
Apr 17 08:10:33.449030 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:33.449004 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/nbdb/0.log"
Apr 17 08:10:33.469007 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:33.468981 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/sbdb/0.log"
Apr 17 08:10:33.564732 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:33.564696 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qrq7z_a9582e01-007d-4cf4-9287-1788560d38e1/ovnkube-controller/0.log"
Apr 17 08:10:34.875762 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:34.875692 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-xbqhk_670156cd-215c-4009-94ea-07e2bf7f784c/check-endpoints/0.log"
Apr 17 08:10:34.897807 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:34.897778 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-52zv7_d7bf9c6a-1777-45f6-9f27-2ee3d09959d1/network-check-target-container/0.log"
Apr 17 08:10:35.731639 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:35.731603 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-lr452_89794c28-aca4-44e3-899f-a486defd216d/iptables-alerter/0.log"
Apr 17 08:10:36.393688 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:36.393657 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nzxn9_e8a42681-cd32-4735-af7b-2c1ca0b2df67/tuned/0.log"
Apr 17 08:10:39.118987 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:39.118957 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-nxg4g_42567f9e-9c19-445e-9f5e-fea94d754bb7/service-ca-controller/0.log"
Apr 17 08:10:39.455325 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:39.455300 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-56fr6_e0b1c45e-6f25-419a-b20e-afc106313813/csi-driver/0.log"
Apr 17 08:10:39.475749 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:39.475729 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-56fr6_e0b1c45e-6f25-419a-b20e-afc106313813/csi-node-driver-registrar/0.log"
Apr 17 08:10:39.495935 ip-10-0-138-54 kubenswrapper[2583]: I0417 08:10:39.495914 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-56fr6_e0b1c45e-6f25-419a-b20e-afc106313813/csi-liveness-probe/0.log"