Apr 17 14:23:21.333600 ip-10-0-129-199 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 14:23:21.333614 ip-10-0-129-199 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 14:23:21.333623 ip-10-0-129-199 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 14:23:21.333937 ip-10-0-129-199 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 14:23:31.519591 ip-10-0-129-199 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 14:23:31.519608 ip-10-0-129-199 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7c91e410db814fe0b66b4d6c3eb712e2 --
Apr 17 14:26:03.692270 ip-10-0-129-199 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 14:26:04.133587 ip-10-0-129-199 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:26:04.133587 ip-10-0-129-199 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 14:26:04.133587 ip-10-0-129-199 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:26:04.133587 ip-10-0-129-199 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 14:26:04.133587 ip-10-0-129-199 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:26:04.135186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.135066 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 14:26:04.138194 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138178 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:26:04.138194 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138195 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138199 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138202 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138205 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138211 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138221 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138224 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138227 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138229 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138232 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138235 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138238 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138241 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138243 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138246 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138248 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138251 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138254 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138257 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138259 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:26:04.138263 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138262 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138265 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138268 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138271 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138273 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138277 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138279 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138282 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138285 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138288 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138291 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138293 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138296 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138298 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138301 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138303 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138306 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138308 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138311 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138313 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:26:04.138738 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138316 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138318 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138322 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138325 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138327 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138330 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138332 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138335 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138337 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138339 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138342 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138344 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138347 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138349 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138353 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138355 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138358 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138361 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138363 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138366 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:26:04.139558 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138368 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138371 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138374 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138377 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138380 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138382 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138385 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138387 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138390 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138394 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138396 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138399 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138403 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138406 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138409 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138411 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138414 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138417 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138420 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138422 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:26:04.140181 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138424 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138427 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138430 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138433 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.138436 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139320 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139332 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139337 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139342 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139347 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139352 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139356 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139360 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139365 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139370 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139374 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139384 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139388 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139393 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139397 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:26:04.140682 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139401 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139406 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139410 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139415 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139419 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139426 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139431 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139435 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139444 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139449 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139454 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139458 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139462 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139467 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139471 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139475 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139479 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139484 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139488 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:26:04.141174 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139495 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139505 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139509 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139514 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139519 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139523 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139527 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139532 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139537 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139541 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139545 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139549 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139553 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139557 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139565 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139570 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139574 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139578 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139582 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139586 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:26:04.141644 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139591 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139595 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139602 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139606 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139611 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139615 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139624 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139628 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139632 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139636 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139641 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139646 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139650 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139654 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139658 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139662 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139667 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139671 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139676 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139685 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:26:04.142151 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139689 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139694 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139698 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139702 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139709 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139713 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139718 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139722 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139726 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139730 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139734 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.139743 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.139898 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140033 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140051 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140056 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140061 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140064 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140069 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140074 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140078 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140081 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 14:26:04.142629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140084 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140088 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140091 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140094 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140097 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140100 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140103 2572 flags.go:64] FLAG: --cloud-config=""
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140106 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140109 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140114 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140117 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140121 2572 flags.go:64] FLAG: --config-dir=""
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140124 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140128 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140132 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140135 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140138 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140141 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140145 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140148 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140150 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140154 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140157 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140162 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140166 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 14:26:04.143161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140169 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140174 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140177 2572 flags.go:64] FLAG: --enable-server="true"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140180 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140185 2572 flags.go:64] FLAG: --event-burst="100"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140188 2572 flags.go:64] FLAG: --event-qps="50"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140192 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140195 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140198 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140202 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140205 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140209 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140212 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140215 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140218 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140221 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140224 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140227 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140230 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140233 2572 flags.go:64] FLAG: --feature-gates=""
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140237 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140241 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140244 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140247 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140250 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 14:26:04.143757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140254 2572 flags.go:64] FLAG: --help="false"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140257 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-129-199.ec2.internal"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140260 2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140266 2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140270 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140274 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140278 2572 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140281 2572 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140284 2572 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140287 2572 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140290 2572 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140293 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140296 2572 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140300 2572 flags.go:64] FLAG: --kube-reserved=""
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140303 2572 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140306 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140309 2572 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140312 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140315 2572 flags.go:64] FLAG: --lock-file=""
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140327 2572 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140330 2572 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140333 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140339 2572 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 14:26:04.144375 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140342 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140345 2572 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140348 2572 flags.go:64] FLAG: --logging-format="text"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140351 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140354 2572 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140357 2572 flags.go:64] FLAG: --manifest-url=""
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140360 2572 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140365 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140368 2572 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140372 2572 flags.go:64] FLAG: --max-pods="110"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140375 2572 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140378 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140382 2572 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140386 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140390 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140393 2572 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140396 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140406 2572 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140409 2572 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140412 2572 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140415 2572 flags.go:64] FLAG: --pod-cidr=""
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140419 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140425 2572 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140428 2572 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 14:26:04.144961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140431 2572 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140434 2572 flags.go:64] FLAG: --port="10250"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140437 2572 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140440 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-020a8d1152e1ab458"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140443 2572 flags.go:64] FLAG: --qos-reserved=""
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140446 2572 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140449 2572 flags.go:64] FLAG: --register-node="true"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140452 2572 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140455 2572 flags.go:64] FLAG: --register-with-taints=""
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140463 2572 flags.go:64] FLAG: --registry-burst="10"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140466 2572 flags.go:64] FLAG: --registry-qps="5"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140470 2572 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140473 2572 flags.go:64] FLAG: --reserved-memory=""
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140477 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140481 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140484 2572 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140487 2572 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140491 2572 flags.go:64] FLAG: --runonce="false"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140494 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140498 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140501 2572 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140506 2572 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140509 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140512 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140515 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140518 2572 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 14:26:04.145550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140521 2572 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140524 2572 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140527 2572 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140530 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140533 2572 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140536 2572 flags.go:64] FLAG: --system-cgroups=""
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140539 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140545 2572 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140548 2572 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140551 2572 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140556 2572 flags.go:64] FLAG: --tls-min-version=""
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140559 2572 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140562 2572 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140565 2572 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140568 2572 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140571 2572 flags.go:64] FLAG: --v="2"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140576 2572 flags.go:64] FLAG: --version="false"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140584 2572 flags.go:64] FLAG: --vmodule=""
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140589 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.140592 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140686 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140690 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140694 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140697 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:26:04.146186 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140700 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140703 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140706 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140710 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140714 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140717 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140719 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140723 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140726 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140728 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140731 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140733 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140736 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140739 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140742 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140744 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140747 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140749 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140752 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140754 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:26:04.146775 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140758 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140762 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140765 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140768 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140771 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140775 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140778 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140780 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140782 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140785 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140788 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140791 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140793 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140796 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140799 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140803 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140805 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140808 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140811 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140813 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:26:04.147296 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140816 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140819 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140821 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140824 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140826 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140829 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140832 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140834 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140837 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140839 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140842 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140844 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140847 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140849 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140852 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140855 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140858 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140861 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140864 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140866 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:26:04.147783 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140869 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140872 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140875 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140877 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140880 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140882 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140885 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140891 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140894 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140898 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140900 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140903 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140906 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140909 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140912 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140914 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140917 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140919 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140922 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:26:04.148352 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140924 2572 feature_gate.go:328] unrecognized feature
gate: IngressControllerLBSubnetsAWS Apr 17 14:26:04.148823 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140927 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:26:04.148823 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.140929 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 14:26:04.148823 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.141697 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 14:26:04.149125 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.149104 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 14:26:04.149157 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.149127 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 14:26:04.149189 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149174 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:26:04.149189 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149180 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 14:26:04.149189 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149183 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:26:04.149189 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149186 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 14:26:04.149189 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149190 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149194 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149197 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149199 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149202 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149205 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149208 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149211 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149214 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149216 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 
14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149219 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149221 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149225 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149227 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149230 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149233 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149236 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149238 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149241 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:26:04.149310 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149244 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149247 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149250 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149252 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149255 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149258 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149260 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149262 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149266 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149269 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149271 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149274 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149277 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 14:26:04.149784 ip-10-0-129-199 
kubenswrapper[2572]: W0417 14:26:04.149279 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149282 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149285 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149288 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149291 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149293 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 14:26:04.149784 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149296 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149299 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149301 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149304 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149307 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149310 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149312 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149315 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149317 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149322 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
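The "unrecognized feature gate" storm above is noisy but, as far as the kubelet is concerned, benign: the rejected names (GatewayAPI, PinnedImages, ManagedBootImages, and so on) look like OpenShift cluster-level feature gates that the kubelet's own gate registry does not know, so it warns and ignores them, while the gates that actually take effect are the ones summarized in the feature_gate.go:384 lines. A minimal triage sketch in Python, assuming this journal excerpt has been saved to a local file named kubelet.log (a hypothetical path), to confirm the storm is one list re-parsed rather than many distinct problems:

    # Count distinct rejected gates; equal counts across names mean the same
    # configuration was parsed repeatedly, not that new failures appeared.
    import re
    from collections import Counter

    pattern = re.compile(r"unrecognized feature gate:\s+(\S+)")
    with open("kubelet.log") as f:   # hypothetical local copy of this excerpt
        counts = Counter(pattern.findall(f.read()))

    for gate, n in counts.most_common():
        print(f"{n:3d}  {gate}")

If every gate shows the same count, the warnings are a single list logged once per parse pass, which matches the identical feature-gates summaries logged at 14:26:04.141697 and 14:26:04.149424.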
Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149326 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149329 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149332 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149335 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149338 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149340 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149343 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149346 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149348 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 14:26:04.150279 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149351 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149353 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149356 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149360 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149362 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149365 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149368 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149370 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149373 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149377 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149379 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149382 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149385 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149387 2572 feature_gate.go:328] unrecognized 
feature gate: ClusterAPIInstallIBMCloud Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149390 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149393 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149395 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149397 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149400 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149402 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 14:26:04.150744 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149406 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149410 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149413 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149416 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149419 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.149424 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149527 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149531 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149533 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149536 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149539 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149541 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149544 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149546 2572 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149549 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149552 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:26:04.151239 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149555 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149558 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149561 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149563 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149566 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149568 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149571 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149574 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149576 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149579 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149581 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149584 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149587 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149589 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149592 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149594 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149597 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149601 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
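Only the map in the feature_gate.go:384 summary is live in this kubelet; everything warned about is dropped. That summary uses Go's map syntax, which is awkward to diff across restarts, so here is a small parsing sketch; the map below is copied verbatim from the summary logged at 14:26:04.141697 (the same map repeats at 14:26:04.149424 and 14:26:04.149759):

    # Parse the kubelet's effective feature-gate summary into a dict
    # and print the gates that are actually enabled.
    import re

    summary = (
        "feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false "
        "ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false "
        "MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false "
        "ProcMountType:true RouteExternalCertificate:true SELinuxMount:false "
        "ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true "
        "TranslateStreamCloseWebsocketRequests:false "
        "UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true "
        "VolumeAttributesClass:false]}"
    )

    inner = re.search(r"map\[(.+)\]", summary).group(1)
    gates = {name: val == "true"
             for name, val in (pair.split(":") for pair in inner.split())}
    print(sorted(name for name, enabled in gates.items() if enabled))

Note what the map shows: the deprecated KMSv1 gate and the GA ServiceAccountTokenNodeBinding gate are both explicitly set to true, which is why the kubelet re-announces those two warnings on every parse pass.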
Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149605 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:26:04.151633 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149607 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149610 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149613 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149616 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149619 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149621 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149624 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149626 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149629 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149631 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149634 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149636 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149639 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149641 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149644 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149647 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149649 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149652 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149654 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149657 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:26:04.152114 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149660 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149662 2572 
feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149664 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149667 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149669 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149672 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149674 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149677 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149679 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149682 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149684 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149687 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149690 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149692 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149695 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149697 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149700 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149702 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149705 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:26:04.152623 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149707 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149710 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149713 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149717 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149720 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149722 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149725 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149727 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149731 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149734 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149736 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149739 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149741 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149744 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149747 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149749 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149752 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:26:04.153093 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:04.149754 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:26:04.153503 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.149759 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 14:26:04.153503 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.150560 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 14:26:04.153633 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.153619 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 14:26:04.154523 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.154512 2572 server.go:1019] "Starting client certificate rotation" Apr 17 14:26:04.154626 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.154611 2572 certificate_manager.go:422] 
"Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 14:26:04.154656 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.154644 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 14:26:04.180049 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.180031 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 14:26:04.183545 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.183530 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 14:26:04.196472 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.196451 2572 log.go:25] "Validated CRI v1 runtime API" Apr 17 14:26:04.206449 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.206429 2572 log.go:25] "Validated CRI v1 image API" Apr 17 14:26:04.206449 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.206438 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 14:26:04.208426 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.208409 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 14:26:04.213215 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.213191 2572 fs.go:135] Filesystem UUIDs: map[7657fd2d-c3ef-4ea5-bc90-38af1b964f8a:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 ac35f4f8-7837-47fd-9055-20db7f18eb65:/dev/nvme0n1p4] Apr 17 14:26:04.213301 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.213216 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 14:26:04.218840 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.218724 2572 manager.go:217] Machine: {Timestamp:2026-04-17 14:26:04.216916339 +0000 UTC m=+0.408515451 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098930 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec294888bd352e2b11ea4f30f23e328a SystemUUID:ec294888-bd35-2e2b-11ea-4f30f23e328a BootID:7c91e410-db81-4fe0-b66b-4d6c3eb712e2 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 
Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1b:08:72:29:89 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1b:08:72:29:89 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:86:d6:5c:ce:4d:e4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 14:26:04.218840 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.218827 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 17 14:26:04.219020 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.218968 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 14:26:04.219933 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.219909 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 14:26:04.220108 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.219935 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-129-199.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 14:26:04.220184 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.220123 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 14:26:04.220184 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.220134 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 14:26:04.220184 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.220152 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:26:04.221580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.221567 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:26:04.223152 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.223140 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:26:04.223434 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.223421 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 14:26:04.225616 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.225605 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 17 14:26:04.225686 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.225624 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 14:26:04.225686 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.225644 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 14:26:04.225686 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.225659 2572 kubelet.go:397] "Adding apiserver pod source" Apr 17 14:26:04.225686 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.225672 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 14:26:04.227238 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.227225 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 
14:26:04.227299 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.227248 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:26:04.231604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.231590 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 14:26:04.232904 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.232887 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 14:26:04.234635 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234622 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 14:26:04.234704 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234642 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 14:26:04.234704 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234651 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 14:26:04.234704 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234661 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 14:26:04.234704 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234670 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 14:26:04.234704 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234679 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 14:26:04.234704 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234689 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 14:26:04.234704 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234698 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 14:26:04.234931 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234708 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 14:26:04.234931 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234717 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 14:26:04.234931 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234731 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 14:26:04.234931 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.234746 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 14:26:04.237726 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.237714 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 14:26:04.237791 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.237730 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 14:26:04.238030 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.238010 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 14:26:04.238102 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.238009 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-199.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" 
in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 14:26:04.241236 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.241221 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 14:26:04.241321 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.241263 2572 server.go:1295] "Started kubelet" Apr 17 14:26:04.241373 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.241344 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 14:26:04.241512 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.241464 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 14:26:04.241559 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.241536 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 14:26:04.242183 ip-10-0-129-199 systemd[1]: Started Kubernetes Kubelet. Apr 17 14:26:04.242565 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.242404 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 17 14:26:04.242713 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.242555 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 14:26:04.247152 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.247134 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 14:26:04.247731 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.247712 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-199.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 14:26:04.247821 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.247723 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 14:26:04.253900 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.253876 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 14:26:04.254067 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254035 2572 factory.go:55] Registering systemd factory Apr 17 14:26:04.254172 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254099 2572 factory.go:223] Registration of the systemd container factory successfully Apr 17 14:26:04.254363 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.254342 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 14:26:04.254432 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254418 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9xskg" Apr 17 14:26:04.254510 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254498 2572 factory.go:153] Registering CRI-O factory Apr 17 14:26:04.254564 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254513 2572 factory.go:223] Registration of the crio container factory successfully Apr 17 14:26:04.254616 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254586 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 14:26:04.254616 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254602 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 14:26:04.254616 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254609 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 14:26:04.254754 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254638 2572 factory.go:103] Registering Raw factory Apr 17 14:26:04.254754 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254655 2572 manager.go:1196] Started watching for new ooms in manager Apr 17 14:26:04.254754 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254736 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 17 14:26:04.254754 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.254743 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 17 14:26:04.254932 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.254499 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:04.255250 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.255236 2572 manager.go:319] Starting recovery of all containers Apr 17 14:26:04.256714 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.247788 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-199.ec2.internal.18a72b18f6d3ca49 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-199.ec2.internal,UID:ip-10-0-129-199.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-199.ec2.internal,},FirstTimestamp:2026-04-17 14:26:04.241234505 +0000 UTC m=+0.432833620,LastTimestamp:2026-04-17 14:26:04.241234505 +0000 UTC m=+0.432833620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-199.ec2.internal,}" Apr 17 14:26:04.258704 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.258678 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9xskg" Apr 17 14:26:04.263642 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.263624 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:26:04.266036 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.266015 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-199.ec2.internal\" not found" node="ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.266304 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.266284 2572 manager.go:324] Recovery completed Apr 17 14:26:04.270672 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.270660 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:26:04.273225 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.273209 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:26:04.273306 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.273235 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:26:04.273306 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.273247 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:26:04.273717 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.273705 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 14:26:04.273717 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.273716 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 14:26:04.273793 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.273731 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:26:04.276222 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.276210 2572 policy_none.go:49] "None policy: Start" Apr 17 14:26:04.276261 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.276224 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 14:26:04.276261 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.276233 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 17 14:26:04.308576 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.308558 2572 manager.go:341] "Starting Device Plugin manager" Apr 17 14:26:04.308687 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.308626 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 14:26:04.308687 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.308636 2572 server.go:85] "Starting device plugin registration server" Apr 17 14:26:04.308906 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.308893 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 14:26:04.308961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.308909 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 14:26:04.309142 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.309118 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 14:26:04.309232 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.309196 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 14:26:04.309232 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.309205 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 14:26:04.309616 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.309597 2572 
eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 14:26:04.309693 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.309632 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:04.397268 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.397177 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 14:26:04.398430 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.398408 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 14:26:04.398526 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.398442 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 14:26:04.398526 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.398464 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 14:26:04.398526 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.398472 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 14:26:04.398526 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.398512 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 14:26:04.400916 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.400891 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:26:04.409812 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.409792 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:26:04.410692 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.410672 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:26:04.410760 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.410706 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:26:04.410760 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.410718 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:26:04.410760 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.410741 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.419825 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.419811 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.419884 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.419832 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-199.ec2.internal\": node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:04.437425 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.437404 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:04.498754 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.498709 2572 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal"] Apr 17 14:26:04.498856 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.498808 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:26:04.500182 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.500166 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:26:04.500253 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.500196 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:26:04.500253 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.500206 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:26:04.501632 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.501619 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:26:04.501760 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.501742 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.501860 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.501776 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:26:04.503243 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.503230 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:26:04.503329 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.503258 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:26:04.503329 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.503273 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:26:04.503611 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.503592 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:26:04.503659 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.503613 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:26:04.503659 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.503623 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:26:04.504428 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.504412 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.504510 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.504442 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:26:04.505100 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.505084 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:26:04.505177 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.505105 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:26:04.505177 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.505116 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:26:04.526641 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.526620 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-199.ec2.internal\" not found" node="ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.531091 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.531075 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-199.ec2.internal\" not found" node="ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.538433 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.538419 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:04.556376 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.556350 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.556471 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.556378 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.556471 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.556395 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/87d4ca475b53fa90f2c794fc65d796bc-config\") pod \"kube-apiserver-proxy-ip-10-0-129-199.ec2.internal\" (UID: \"87d4ca475b53fa90f2c794fc65d796bc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.639034 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.639004 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:04.657402 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.657335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.657402 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.657365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.657402 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.657377 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.657402 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.657397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/87d4ca475b53fa90f2c794fc65d796bc-config\") pod \"kube-apiserver-proxy-ip-10-0-129-199.ec2.internal\" (UID: \"87d4ca475b53fa90f2c794fc65d796bc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.657588 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.657430 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/87d4ca475b53fa90f2c794fc65d796bc-config\") pod \"kube-apiserver-proxy-ip-10-0-129-199.ec2.internal\" (UID: \"87d4ca475b53fa90f2c794fc65d796bc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.657588 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.657441 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/416f09eead2ff50b4bdd823b81a2a56e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal\" (UID: \"416f09eead2ff50b4bdd823b81a2a56e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.739763 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.739738 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:04.828311 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.828282 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.834289 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:04.834273 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 17 14:26:04.840566 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.840548 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:04.941188 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:04.941115 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:05.041611 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:05.041583 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:05.114036 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.114011 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:26:05.141710 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:05.141688 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:05.154952 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.154928 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 14:26:05.155081 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.155066 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:26:05.155125 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.155081 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:26:05.155125 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.155069 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 14:26:05.242738 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:05.242658 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-199.ec2.internal\" not found" Apr 17 14:26:05.247843 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.247824 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 14:26:05.261501 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.261464 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 14:21:04 +0000 UTC" deadline="2027-10-26 22:56:31.593619081 +0000 UTC" Apr 17 14:26:05.261501 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.261493 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13376h30m26.33212986s" Apr 17 14:26:05.263357 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.263339 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" 
type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 14:26:05.285155 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.285138 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:26:05.285233 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.285157 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8x5nh" Apr 17 14:26:05.292557 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.292538 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8x5nh" Apr 17 14:26:05.353722 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.353698 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" Apr 17 14:26:05.367125 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.367105 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:26:05.368176 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.367970 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" Apr 17 14:26:05.375692 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.375671 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 14:26:05.491771 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:05.491732 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87d4ca475b53fa90f2c794fc65d796bc.slice/crio-bb3addaef46effb4b65b2f682aefce8e10eb35a9440a331636c8e644652e3e8b WatchSource:0}: Error finding container bb3addaef46effb4b65b2f682aefce8e10eb35a9440a331636c8e644652e3e8b: Status 404 returned error can't find the container with id bb3addaef46effb4b65b2f682aefce8e10eb35a9440a331636c8e644652e3e8b Apr 17 14:26:05.492193 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:05.492152 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod416f09eead2ff50b4bdd823b81a2a56e.slice/crio-20b6d53c8064bbc5c5f992d8a51ec495a3b0f81149184efd24287bdfef0d4317 WatchSource:0}: Error finding container 20b6d53c8064bbc5c5f992d8a51ec495a3b0f81149184efd24287bdfef0d4317: Status 404 returned error can't find the container with id 20b6d53c8064bbc5c5f992d8a51ec495a3b0f81149184efd24287bdfef0d4317 Apr 17 14:26:05.496341 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:05.496315 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:26:06.227698 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.227412 2572 apiserver.go:52] "Watching apiserver" Apr 17 14:26:06.236085 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.235899 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 14:26:06.237304 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.237221 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-7sxz5","kube-system/global-pull-secret-syncer-dxl24","kube-system/konnectivity-agent-7n9xk","kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal","openshift-cluster-node-tuning-operator/tuned-hgwmv","openshift-dns/node-resolver-l87rd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal","openshift-multus/multus-additional-cni-plugins-lw47q","openshift-multus/network-metrics-daemon-tr5b8","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw","openshift-image-registry/node-ca-h8bbt","openshift-multus/multus-7rgwd","openshift-network-diagnostics/network-check-target-fg9xk","openshift-network-operator/iptables-alerter-k5l86"] Apr 17 14:26:06.242433 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.242411 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.244758 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.244732 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7n9xk" Apr 17 14:26:06.245069 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.245048 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 14:26:06.245161 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.245141 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 14:26:06.245217 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.245165 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 14:26:06.245367 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.245317 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-sfssv\"" Apr 17 14:26:06.245995 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.245899 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 14:26:06.246739 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.246710 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 14:26:06.248666 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.248645 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.250013 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.248829 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-k5l86" Apr 17 14:26:06.250013 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.249646 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 14:26:06.250013 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.249768 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dw7qc\"" Apr 17 14:26:06.251260 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.250839 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 14:26:06.251872 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.251354 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 14:26:06.251872 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.251477 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xfn8s\"" Apr 17 14:26:06.251872 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.251622 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 14:26:06.251872 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.251741 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 14:26:06.252162 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.252003 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l6hmw\"" Apr 17 14:26:06.252684 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.252468 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 14:26:06.252684 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.252560 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 14:26:06.252684 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.252543 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 14:26:06.253345 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.253324 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 14:26:06.253432 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.253326 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:26:06.255025 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.255007 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.257093 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.256906 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:26:06.257093 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.257023 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 14:26:06.257093 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.257044 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8qdf9\"" Apr 17 14:26:06.257729 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.257355 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l87rd" Apr 17 14:26:06.259821 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.259584 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 14:26:06.259821 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.259630 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 14:26:06.259969 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.259867 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s8hdz\"" Apr 17 14:26:06.260118 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.260102 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:06.260207 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.260185 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:06.262519 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.262495 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lw47q" Apr 17 14:26:06.264539 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264515 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlwlp\" (UniqueName: \"kubernetes.io/projected/4fe5daab-b27c-4fc1-88c0-72d89cdccb01-kube-api-access-zlwlp\") pod \"node-resolver-l87rd\" (UID: \"4fe5daab-b27c-4fc1-88c0-72d89cdccb01\") " pod="openshift-dns/node-resolver-l87rd" Apr 17 14:26:06.264539 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264552 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-run-netns\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.264710 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264578 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-run-ovn-kubernetes\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.264710 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-sysctl-d\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.264710 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-run-k8s-cni-cncf-io\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.264710 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264651 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-lib-modules\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.264710 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264701 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-var-lib-kubelet\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264743 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264747 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-etc-kubernetes\") 
pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264777 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1391e92c-990f-482b-b724-0f08633e4abf-agent-certs\") pod \"konnectivity-agent-7n9xk\" (UID: \"1391e92c-990f-482b-b724-0f08633e4abf\") " pod="kube-system/konnectivity-agent-7n9xk" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264793 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nzr4f\"" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-sys\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-tuned\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264847 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264860 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264880 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-socket-dir-parent\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rsgg\" (UniqueName: \"kubernetes.io/projected/a660c0da-e639-42a0-9f51-34051bd62e42-kube-api-access-7rsgg\") pod \"iptables-alerter-k5l86\" (UID: \"a660c0da-e639-42a0-9f51-34051bd62e42\") " pod="openshift-network-operator/iptables-alerter-k5l86" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264939 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-sysconfig\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.264939 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:06.264954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.264962 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-systemd\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265016 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a660c0da-e639-42a0-9f51-34051bd62e42-iptables-alerter-script\") pod \"iptables-alerter-k5l86\" (UID: \"a660c0da-e639-42a0-9f51-34051bd62e42\") " pod="openshift-network-operator/iptables-alerter-k5l86" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265057 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-log-socket\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxt6q\" (UniqueName: \"kubernetes.io/projected/049f13c9-ae95-43ce-9dac-c99d4c07f258-kube-api-access-rxt6q\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-sysctl-conf\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265145 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92g4s\" (UniqueName: \"kubernetes.io/projected/f1db9491-104f-4aed-8ac3-5c92d3bf1803-kube-api-access-92g4s\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265170 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/049f13c9-ae95-43ce-9dac-c99d4c07f258-ovnkube-config\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-host\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265214 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw9bk\" (UniqueName: \"kubernetes.io/projected/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-kube-api-access-vw9bk\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265249 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4fe5daab-b27c-4fc1-88c0-72d89cdccb01-tmp-dir\") pod \"node-resolver-l87rd\" (UID: \"4fe5daab-b27c-4fc1-88c0-72d89cdccb01\") " pod="openshift-dns/node-resolver-l87rd" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-cni-dir\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-run-netns\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265327 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-var-lib-cni-bin\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265351 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-kubelet\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-slash\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265396 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-run-systemd\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265418 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-var-lib-openvswitch\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.265604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265441 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-os-release\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265467 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-var-lib-cni-multus\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265497 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-run-openvswitch\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265524 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/049f13c9-ae95-43ce-9dac-c99d4c07f258-ovn-node-metrics-cert\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265553 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-modprobe-d\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-kubernetes\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265619 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-tmp\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265644 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-hostroot\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-cnibin\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-run-multus-certs\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265736 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-etc-openvswitch\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/049f13c9-ae95-43ce-9dac-c99d4c07f258-env-overrides\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-run\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-var-lib-kubelet\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-system-cni-dir\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265903 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-conf-dir\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-cni-netd\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.266508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.265969 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-daemon-config\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.267188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.266007 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-cni-bin\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.267188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.266030 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4fe5daab-b27c-4fc1-88c0-72d89cdccb01-hosts-file\") pod \"node-resolver-l87rd\" (UID: \"4fe5daab-b27c-4fc1-88c0-72d89cdccb01\") " pod="openshift-dns/node-resolver-l87rd" Apr 17 14:26:06.267188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.266052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f1db9491-104f-4aed-8ac3-5c92d3bf1803-cni-binary-copy\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.267188 ip-10-0-129-199 kubenswrapper[2572]: I0417 
14:26:06.266089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a660c0da-e639-42a0-9f51-34051bd62e42-host-slash\") pod \"iptables-alerter-k5l86\" (UID: \"a660c0da-e639-42a0-9f51-34051bd62e42\") " pod="openshift-network-operator/iptables-alerter-k5l86"
Apr 17 14:26:06.267188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.266128 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.267188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.266155 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1391e92c-990f-482b-b724-0f08633e4abf-konnectivity-ca\") pod \"konnectivity-agent-7n9xk\" (UID: \"1391e92c-990f-482b-b724-0f08633e4abf\") " pod="kube-system/konnectivity-agent-7n9xk"
Apr 17 14:26:06.267188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.266194 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-systemd-units\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.267188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.266221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/049f13c9-ae95-43ce-9dac-c99d4c07f258-ovnkube-script-lib\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.267188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.266280 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-run-ovn\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.267188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.266305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-node-log\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.267680 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.267628 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.267735 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.267712 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h8bbt"
Apr 17 14:26:06.270170 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.269826 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 14:26:06.270170 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.269873 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 14:26:06.270170 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.269900 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24"
Apr 17 14:26:06.270170 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.269959 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b"
Apr 17 14:26:06.270170 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.270026 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 14:26:06.270419 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.270213 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 14:26:06.270419 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.270226 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-bj4fz\""
Apr 17 14:26:06.270419 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.270254 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 14:26:06.270648 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.270446 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 14:26:06.270648 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.270529 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gkxh9\""
Apr 17 14:26:06.295204 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.295177 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:21:05 +0000 UTC" deadline="2027-11-17 04:24:22.809885385 +0000 UTC"
Apr 17 14:26:06.295204 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.295203 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13885h58m16.514686163s"
Apr 17 14:26:06.355723 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.355698 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 14:26:06.366792 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.366765 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.366924 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.366800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-socket-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.366924 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.366830 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn477\" (UniqueName: \"kubernetes.io/projected/9244d507-858e-449a-af7a-1e8df83a35ae-kube-api-access-fn477\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.366924 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.366862 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4fe5daab-b27c-4fc1-88c0-72d89cdccb01-hosts-file\") pod \"node-resolver-l87rd\" (UID: \"4fe5daab-b27c-4fc1-88c0-72d89cdccb01\") " pod="openshift-dns/node-resolver-l87rd"
Apr 17 14:26:06.366924 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.366888 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f1db9491-104f-4aed-8ac3-5c92d3bf1803-cni-binary-copy\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.366924 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.366911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a660c0da-e639-42a0-9f51-34051bd62e42-host-slash\") pod \"iptables-alerter-k5l86\" (UID: \"a660c0da-e639-42a0-9f51-34051bd62e42\") " pod="openshift-network-operator/iptables-alerter-k5l86"
Apr 17 14:26:06.367188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.366936 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.367188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.366935 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4fe5daab-b27c-4fc1-88c0-72d89cdccb01-hosts-file\") pod \"node-resolver-l87rd\" (UID: \"4fe5daab-b27c-4fc1-88c0-72d89cdccb01\") " pod="openshift-dns/node-resolver-l87rd"
Apr 17 14:26:06.367188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.366964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8"
Apr 17 14:26:06.367188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367005 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppkrw\" (UniqueName: \"kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw\") pod \"network-check-target-fg9xk\" (UID: \"8ce8568a-309e-41ba-9f5d-bd4386d9dcea\") " pod="openshift-network-diagnostics/network-check-target-fg9xk"
Apr 17 14:26:06.367188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367014 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a660c0da-e639-42a0-9f51-34051bd62e42-host-slash\") pod \"iptables-alerter-k5l86\" (UID: \"a660c0da-e639-42a0-9f51-34051bd62e42\") " pod="openshift-network-operator/iptables-alerter-k5l86"
Apr 17 14:26:06.367188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367032 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1391e92c-990f-482b-b724-0f08633e4abf-konnectivity-ca\") pod \"konnectivity-agent-7n9xk\" (UID: \"1391e92c-990f-482b-b724-0f08633e4abf\") " pod="kube-system/konnectivity-agent-7n9xk"
Apr 17 14:26:06.367188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-systemd-units\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.367188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367080 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/049f13c9-ae95-43ce-9dac-c99d4c07f258-ovnkube-script-lib\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.367188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367104 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-run-ovn\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.367188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367125 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-node-log\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.367188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlwlp\" (UniqueName: \"kubernetes.io/projected/4fe5daab-b27c-4fc1-88c0-72d89cdccb01-kube-api-access-zlwlp\") pod \"node-resolver-l87rd\" (UID: \"4fe5daab-b27c-4fc1-88c0-72d89cdccb01\") " pod="openshift-dns/node-resolver-l87rd"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-run-netns\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367219 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-cnibin\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-run-ovn-kubernetes\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367273 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-sysctl-d\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-run-k8s-cni-cncf-io\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-os-release\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-lib-modules\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367415 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-var-lib-kubelet\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-etc-kubernetes\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1391e92c-990f-482b-b724-0f08633e4abf-agent-certs\") pod \"konnectivity-agent-7n9xk\" (UID: \"1391e92c-990f-482b-b724-0f08633e4abf\") " pod="kube-system/konnectivity-agent-7n9xk"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367531 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-sys\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-tuned\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367579 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-socket-dir-parent\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rsgg\" (UniqueName: \"kubernetes.io/projected/a660c0da-e639-42a0-9f51-34051bd62e42-kube-api-access-7rsgg\") pod \"iptables-alerter-k5l86\" (UID: \"a660c0da-e639-42a0-9f51-34051bd62e42\") " pod="openshift-network-operator/iptables-alerter-k5l86"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367640 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cb28525-d291-4976-8923-d29d0747a7dd-host\") pod \"node-ca-h8bbt\" (UID: \"6cb28525-d291-4976-8923-d29d0747a7dd\") " pod="openshift-image-registry/node-ca-h8bbt"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367642 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.367720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367665 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-sysconfig\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367693 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-systemd\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367717 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a660c0da-e639-42a0-9f51-34051bd62e42-iptables-alerter-script\") pod \"iptables-alerter-k5l86\" (UID: \"a660c0da-e639-42a0-9f51-34051bd62e42\") " pod="openshift-network-operator/iptables-alerter-k5l86"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-log-socket\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367766 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxt6q\" (UniqueName: \"kubernetes.io/projected/049f13c9-ae95-43ce-9dac-c99d4c07f258-kube-api-access-rxt6q\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367791 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367848 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-kubelet-config\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367890 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-registration-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367906 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-sysctl-conf\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92g4s\" (UniqueName: \"kubernetes.io/projected/f1db9491-104f-4aed-8ac3-5c92d3bf1803-kube-api-access-92g4s\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367958 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/049f13c9-ae95-43ce-9dac-c99d4c07f258-ovnkube-config\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6cb28525-d291-4976-8923-d29d0747a7dd-serviceca\") pod \"node-ca-h8bbt\" (UID: \"6cb28525-d291-4976-8923-d29d0747a7dd\") " pod="openshift-image-registry/node-ca-h8bbt"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368025 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-host\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw9bk\" (UniqueName: \"kubernetes.io/projected/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-kube-api-access-vw9bk\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4fe5daab-b27c-4fc1-88c0-72d89cdccb01-tmp-dir\") pod \"node-resolver-l87rd\" (UID: \"4fe5daab-b27c-4fc1-88c0-72d89cdccb01\") " pod="openshift-dns/node-resolver-l87rd"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368122 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-sysconfig\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.368580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1391e92c-990f-482b-b724-0f08633e4abf-konnectivity-ca\") pod \"konnectivity-agent-7n9xk\" (UID: \"1391e92c-990f-482b-b724-0f08633e4abf\") " pod="kube-system/konnectivity-agent-7n9xk"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368177 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-systemd\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-run-ovn-kubernetes\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368232 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-systemd-units\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368227 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-cni-dir\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368265 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-sysctl-d\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368285 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-run-ovn\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-run-k8s-cni-cncf-io\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-lib-modules\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.367578 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f1db9491-104f-4aed-8ac3-5c92d3bf1803-cni-binary-copy\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368648 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a660c0da-e639-42a0-9f51-34051bd62e42-iptables-alerter-script\") pod \"iptables-alerter-k5l86\" (UID: \"a660c0da-e639-42a0-9f51-34051bd62e42\") " pod="openshift-network-operator/iptables-alerter-k5l86"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368694 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-log-socket\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368729 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/049f13c9-ae95-43ce-9dac-c99d4c07f258-ovnkube-script-lib\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-node-log\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368917 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4fe5daab-b27c-4fc1-88c0-72d89cdccb01-tmp-dir\") pod \"node-resolver-l87rd\" (UID: \"4fe5daab-b27c-4fc1-88c0-72d89cdccb01\") " pod="openshift-dns/node-resolver-l87rd"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-cni-dir\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.368995 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-run-netns\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.369354 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369004 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-run-netns\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369032 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-var-lib-cni-bin\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369060 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-kubelet\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369049 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-run-netns\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369087 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-slash\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369113 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-run-systemd\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369122 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-var-lib-cni-bin\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-var-lib-openvswitch\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369164 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-sysctl-conf\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369166 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-os-release\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369207 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-var-lib-cni-multus\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-os-release\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-run-openvswitch\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369259 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/049f13c9-ae95-43ce-9dac-c99d4c07f258-ovn-node-metrics-cert\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369277 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-kubelet\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369289 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6pb8\" (UniqueName: \"kubernetes.io/projected/7cbba96f-e882-4326-b2b6-5a647245170e-kube-api-access-g6pb8\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369317 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-etc-selinux\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.370147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369321 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/049f13c9-ae95-43ce-9dac-c99d4c07f258-ovnkube-config\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369343 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkq4m\" (UniqueName: \"kubernetes.io/projected/6cb28525-d291-4976-8923-d29d0747a7dd-kube-api-access-lkq4m\") pod \"node-ca-h8bbt\" (UID: \"6cb28525-d291-4976-8923-d29d0747a7dd\") " pod="openshift-image-registry/node-ca-h8bbt"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369349 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-run-systemd\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-modprobe-d\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369393 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-var-lib-cni-multus\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369411 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-var-lib-openvswitch\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369431 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-kubernetes\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-run-openvswitch\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369457 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-sys\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-var-lib-kubelet\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-kubernetes\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369539 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-modprobe-d\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369551 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-socket-dir-parent\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369317 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-slash\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369571 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-host\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369590 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-tmp\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-etc-kubernetes\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-hostroot\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.370850 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369642 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw8jn\" (UniqueName: \"kubernetes.io/projected/85225238-59be-4245-9db5-4b8fa923bf28-kube-api-access-tw8jn\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-device-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-hostroot\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-cnibin\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369734 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-run-multus-certs\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-cnibin\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369759 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-etc-openvswitch\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369775 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-host-run-multus-certs\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/049f13c9-ae95-43ce-9dac-c99d4c07f258-env-overrides\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369805 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-etc-openvswitch\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-dbus\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369861 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-system-cni-dir\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-run\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-var-lib-kubelet\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-system-cni-dir\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-conf-dir\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370225 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/049f13c9-ae95-43ce-9dac-c99d4c07f258-env-overrides\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370239 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-cni-netd\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.371678 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370268 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7cbba96f-e882-4326-b2b6-5a647245170e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370293 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-sys-fs\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370295 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-var-lib-kubelet\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.369995 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-run\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370306 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-conf-dir\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-daemon-config\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370375 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-cni-netd\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370379 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1db9491-104f-4aed-8ac3-5c92d3bf1803-system-cni-dir\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-cni-bin\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370452 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7cbba96f-e882-4326-b2b6-5a647245170e-cni-binary-copy\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/049f13c9-ae95-43ce-9dac-c99d4c07f258-host-cni-bin\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370522 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7cbba96f-e882-4326-b2b6-5a647245170e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.370827 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f1db9491-104f-4aed-8ac3-5c92d3bf1803-multus-daemon-config\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.372535 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.372413 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-etc-tuned\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.373218 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.372595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-tmp\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.373218 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.372936 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/049f13c9-ae95-43ce-9dac-c99d4c07f258-ovn-node-metrics-cert\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.373218 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.373031 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1391e92c-990f-482b-b724-0f08633e4abf-agent-certs\") pod \"konnectivity-agent-7n9xk\" (UID: \"1391e92c-990f-482b-b724-0f08633e4abf\") " pod="kube-system/konnectivity-agent-7n9xk"
Apr 17 14:26:06.378097 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.377970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw9bk\" (UniqueName: \"kubernetes.io/projected/6cef70aa-7a75-4dad-8f4c-b08582ff5b85-kube-api-access-vw9bk\") pod \"tuned-hgwmv\" (UID: \"6cef70aa-7a75-4dad-8f4c-b08582ff5b85\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwmv"
Apr 17 14:26:06.378599 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.378461 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rsgg\" (UniqueName: \"kubernetes.io/projected/a660c0da-e639-42a0-9f51-34051bd62e42-kube-api-access-7rsgg\") pod \"iptables-alerter-k5l86\" (UID: \"a660c0da-e639-42a0-9f51-34051bd62e42\") " pod="openshift-network-operator/iptables-alerter-k5l86"
Apr 17 14:26:06.379167 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.378954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92g4s\" (UniqueName: \"kubernetes.io/projected/f1db9491-104f-4aed-8ac3-5c92d3bf1803-kube-api-access-92g4s\") pod \"multus-7rgwd\" (UID: \"f1db9491-104f-4aed-8ac3-5c92d3bf1803\") " pod="openshift-multus/multus-7rgwd"
Apr 17 14:26:06.379392 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.379341 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlwlp\" (UniqueName: \"kubernetes.io/projected/4fe5daab-b27c-4fc1-88c0-72d89cdccb01-kube-api-access-zlwlp\") pod \"node-resolver-l87rd\" (UID: \"4fe5daab-b27c-4fc1-88c0-72d89cdccb01\") " pod="openshift-dns/node-resolver-l87rd"
Apr 17 14:26:06.380390 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.380367 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxt6q\" (UniqueName: \"kubernetes.io/projected/049f13c9-ae95-43ce-9dac-c99d4c07f258-kube-api-access-rxt6q\") pod \"ovnkube-node-7sxz5\" (UID: \"049f13c9-ae95-43ce-9dac-c99d4c07f258\") " pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5"
Apr 17 14:26:06.405302 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.405237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" event={"ID":"416f09eead2ff50b4bdd823b81a2a56e","Type":"ContainerStarted","Data":"20b6d53c8064bbc5c5f992d8a51ec495a3b0f81149184efd24287bdfef0d4317"}
Apr 17 14:26:06.406321 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.406295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" event={"ID":"87d4ca475b53fa90f2c794fc65d796bc","Type":"ContainerStarted","Data":"bb3addaef46effb4b65b2f682aefce8e10eb35a9440a331636c8e644652e3e8b"}
Apr 17 14:26:06.461728 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.461698 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:26:06.471729 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.471702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-dbus\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24"
Apr 17 14:26:06.471862 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.471745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-system-cni-dir\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.471862 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.471851 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-system-cni-dir\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.471961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.471893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-dbus\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24"
Apr 17 14:26:06.472031 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.471897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7cbba96f-e882-4326-b2b6-5a647245170e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.472031 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-sys-fs\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.472031 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7cbba96f-e882-4326-b2b6-5a647245170e-cni-binary-copy\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.472172 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472048 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7cbba96f-e882-4326-b2b6-5a647245170e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.472172 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472069 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.472172 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472085 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-socket-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.472172 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472101 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fn477\" (UniqueName: \"kubernetes.io/projected/9244d507-858e-449a-af7a-1e8df83a35ae-kube-api-access-fn477\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.472172 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472127 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8"
Apr 17 14:26:06.472172 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkrw\" (UniqueName: \"kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw\") pod \"network-check-target-fg9xk\" (UID: \"8ce8568a-309e-41ba-9f5d-bd4386d9dcea\") " pod="openshift-network-diagnostics/network-check-target-fg9xk"
Apr 17 14:26:06.472172 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-cnibin\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472208 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-os-release\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472229 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-sys-fs\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cb28525-d291-4976-8923-d29d0747a7dd-host\") pod \"node-ca-h8bbt\" (UID: \"6cb28525-d291-4976-8923-d29d0747a7dd\") " pod="openshift-image-registry/node-ca-h8bbt"
Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472240 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-socket-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw"
Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24"
Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-kubelet-config\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24"
Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-os-release\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q"
Apr
17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q" Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-registration-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472385 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6cb28525-d291-4976-8923-d29d0747a7dd-serviceca\") pod \"node-ca-h8bbt\" (UID: \"6cb28525-d291-4976-8923-d29d0747a7dd\") " pod="openshift-image-registry/node-ca-h8bbt" Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472421 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6pb8\" (UniqueName: \"kubernetes.io/projected/7cbba96f-e882-4326-b2b6-5a647245170e-kube-api-access-g6pb8\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q" Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-etc-selinux\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkq4m\" (UniqueName: \"kubernetes.io/projected/6cb28525-d291-4976-8923-d29d0747a7dd-kube-api-access-lkq4m\") pod \"node-ca-h8bbt\" (UID: \"6cb28525-d291-4976-8923-d29d0747a7dd\") " pod="openshift-image-registry/node-ca-h8bbt" Apr 17 14:26:06.472490 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472475 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-cnibin\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw8jn\" (UniqueName: \"kubernetes.io/projected/85225238-59be-4245-9db5-4b8fa923bf28-kube-api-access-tw8jn\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7cbba96f-e882-4326-b2b6-5a647245170e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472536 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7cbba96f-e882-4326-b2b6-5a647245170e-cni-binary-copy\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.472560 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472572 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-registration-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cb28525-d291-4976-8923-d29d0747a7dd-host\") pod \"node-ca-h8bbt\" (UID: \"6cb28525-d291-4976-8923-d29d0747a7dd\") " pod="openshift-image-registry/node-ca-h8bbt" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-kubelet-config\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472640 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-etc-selinux\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7cbba96f-e882-4326-b2b6-5a647245170e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.472645 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs podName:85225238-59be-4245-9db5-4b8fa923bf28 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:06.97261274 +0000 UTC m=+3.164211843 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs") pod "network-metrics-daemon-tr5b8" (UID: "85225238-59be-4245-9db5-4b8fa923bf28") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472675 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7cbba96f-e882-4326-b2b6-5a647245170e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472698 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-device-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.472764 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.472778 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9244d507-858e-449a-af7a-1e8df83a35ae-device-dir\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.472855 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret podName:118312db-c4a8-48d8-a9c7-85eeef8a0b7b nodeName:}" failed. No retries permitted until 2026-04-17 14:26:06.972830977 +0000 UTC m=+3.164430089 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret") pod "global-pull-secret-syncer-dxl24" (UID: "118312db-c4a8-48d8-a9c7-85eeef8a0b7b") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:06.473193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.473033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6cb28525-d291-4976-8923-d29d0747a7dd-serviceca\") pod \"node-ca-h8bbt\" (UID: \"6cb28525-d291-4976-8923-d29d0747a7dd\") " pod="openshift-image-registry/node-ca-h8bbt" Apr 17 14:26:06.484666 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.484549 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn477\" (UniqueName: \"kubernetes.io/projected/9244d507-858e-449a-af7a-1e8df83a35ae-kube-api-access-fn477\") pod \"aws-ebs-csi-driver-node-dg7bw\" (UID: \"9244d507-858e-449a-af7a-1e8df83a35ae\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" Apr 17 14:26:06.488828 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.488805 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:26:06.488828 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.488832 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:26:06.489003 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.488848 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ppkrw for pod openshift-network-diagnostics/network-check-target-fg9xk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:06.489003 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.488914 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw podName:8ce8568a-309e-41ba-9f5d-bd4386d9dcea nodeName:}" failed. No retries permitted until 2026-04-17 14:26:06.988898025 +0000 UTC m=+3.180497126 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ppkrw" (UniqueName: "kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw") pod "network-check-target-fg9xk" (UID: "8ce8568a-309e-41ba-9f5d-bd4386d9dcea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:06.491377 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.491354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6pb8\" (UniqueName: \"kubernetes.io/projected/7cbba96f-e882-4326-b2b6-5a647245170e-kube-api-access-g6pb8\") pod \"multus-additional-cni-plugins-lw47q\" (UID: \"7cbba96f-e882-4326-b2b6-5a647245170e\") " pod="openshift-multus/multus-additional-cni-plugins-lw47q" Apr 17 14:26:06.491454 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.491388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkq4m\" (UniqueName: \"kubernetes.io/projected/6cb28525-d291-4976-8923-d29d0747a7dd-kube-api-access-lkq4m\") pod \"node-ca-h8bbt\" (UID: \"6cb28525-d291-4976-8923-d29d0747a7dd\") " pod="openshift-image-registry/node-ca-h8bbt" Apr 17 14:26:06.491622 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.491597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw8jn\" (UniqueName: \"kubernetes.io/projected/85225238-59be-4245-9db5-4b8fa923bf28-kube-api-access-tw8jn\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:06.548170 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.548137 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:26:06.558119 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.558098 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7rgwd" Apr 17 14:26:06.566791 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.566771 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7n9xk" Apr 17 14:26:06.575557 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.575535 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:06.583122 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.583101 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k5l86" Apr 17 14:26:06.591670 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.591651 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" Apr 17 14:26:06.597199 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.597180 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l87rd" Apr 17 14:26:06.605806 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.605788 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lw47q" Apr 17 14:26:06.613428 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.613408 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" Apr 17 14:26:06.623120 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.623098 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h8bbt" Apr 17 14:26:06.975485 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.975397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:06.975485 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:06.975447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:06.975682 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.975550 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:06.975682 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.975611 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs podName:85225238-59be-4245-9db5-4b8fa923bf28 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:07.975596594 +0000 UTC m=+4.167195692 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs") pod "network-metrics-daemon-tr5b8" (UID: "85225238-59be-4245-9db5-4b8fa923bf28") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:06.975682 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.975550 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:06.975803 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:06.975718 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret podName:118312db-c4a8-48d8-a9c7-85eeef8a0b7b nodeName:}" failed. No retries permitted until 2026-04-17 14:26:07.975698002 +0000 UTC m=+4.167297125 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret") pod "global-pull-secret-syncer-dxl24" (UID: "118312db-c4a8-48d8-a9c7-85eeef8a0b7b") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:07.076739 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.076710 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkrw\" (UniqueName: \"kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw\") pod \"network-check-target-fg9xk\" (UID: \"8ce8568a-309e-41ba-9f5d-bd4386d9dcea\") " pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:07.076907 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:07.076884 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:26:07.076972 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:07.076911 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:26:07.076972 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:07.076925 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ppkrw for pod openshift-network-diagnostics/network-check-target-fg9xk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:07.077076 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:07.077009 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw podName:8ce8568a-309e-41ba-9f5d-bd4386d9dcea nodeName:}" failed. No retries permitted until 2026-04-17 14:26:08.076967501 +0000 UTC m=+4.268566613 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ppkrw" (UniqueName: "kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw") pod "network-check-target-fg9xk" (UID: "8ce8568a-309e-41ba-9f5d-bd4386d9dcea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:07.286844 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:07.286809 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod049f13c9_ae95_43ce_9dac_c99d4c07f258.slice/crio-3c1570006f5a233b1f2a67fe58b4151c8745d179b28f0f92e6dae820f6b389eb WatchSource:0}: Error finding container 3c1570006f5a233b1f2a67fe58b4151c8745d179b28f0f92e6dae820f6b389eb: Status 404 returned error can't find the container with id 3c1570006f5a233b1f2a67fe58b4151c8745d179b28f0f92e6dae820f6b389eb Apr 17 14:26:07.288148 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:07.288121 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fe5daab_b27c_4fc1_88c0_72d89cdccb01.slice/crio-4f57402961c7546abb73fc76a596ce52af72df2945eef495baf0ee88c1a316d6 WatchSource:0}: Error finding container 4f57402961c7546abb73fc76a596ce52af72df2945eef495baf0ee88c1a316d6: Status 404 returned error can't find the container with id 4f57402961c7546abb73fc76a596ce52af72df2945eef495baf0ee88c1a316d6 Apr 17 14:26:07.289287 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:07.289241 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cef70aa_7a75_4dad_8f4c_b08582ff5b85.slice/crio-e1da7bc866e3c9483349a5c8db2d6f61e5db1cc8ec42067d7f450b6a7b859a4a WatchSource:0}: Error finding container e1da7bc866e3c9483349a5c8db2d6f61e5db1cc8ec42067d7f450b6a7b859a4a: Status 404 returned error can't find the container with id e1da7bc866e3c9483349a5c8db2d6f61e5db1cc8ec42067d7f450b6a7b859a4a Apr 17 14:26:07.292440 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:07.292417 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cbba96f_e882_4326_b2b6_5a647245170e.slice/crio-c11aae0c4936a2f71eb8430506627cb88172d5260d416d5408498f9517401e37 WatchSource:0}: Error finding container c11aae0c4936a2f71eb8430506627cb88172d5260d416d5408498f9517401e37: Status 404 returned error can't find the container with id c11aae0c4936a2f71eb8430506627cb88172d5260d416d5408498f9517401e37 Apr 17 14:26:07.293309 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:07.293289 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1db9491_104f_4aed_8ac3_5c92d3bf1803.slice/crio-06e5ef889449169d89ba1e7d82313c7b0f8fbbff1ffeb25c36e2632d7c35b472 WatchSource:0}: Error finding container 06e5ef889449169d89ba1e7d82313c7b0f8fbbff1ffeb25c36e2632d7c35b472: Status 404 returned error can't find the container with id 06e5ef889449169d89ba1e7d82313c7b0f8fbbff1ffeb25c36e2632d7c35b472 Apr 17 14:26:07.294078 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:07.294046 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda660c0da_e639_42a0_9f51_34051bd62e42.slice/crio-183a85dd971718675c02ca9d150a2e0a6e2db8b0c59d6efa61050c76c4b8a2e8 WatchSource:0}: Error finding 
container 183a85dd971718675c02ca9d150a2e0a6e2db8b0c59d6efa61050c76c4b8a2e8: Status 404 returned error can't find the container with id 183a85dd971718675c02ca9d150a2e0a6e2db8b0c59d6efa61050c76c4b8a2e8 Apr 17 14:26:07.296195 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:07.295115 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1391e92c_990f_482b_b724_0f08633e4abf.slice/crio-a590f8bec21d552b417512241abe6cc61ac59655ee08a61ea6af268b64fcf964 WatchSource:0}: Error finding container a590f8bec21d552b417512241abe6cc61ac59655ee08a61ea6af268b64fcf964: Status 404 returned error can't find the container with id a590f8bec21d552b417512241abe6cc61ac59655ee08a61ea6af268b64fcf964 Apr 17 14:26:07.296195 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.295388 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:21:05 +0000 UTC" deadline="2027-09-25 21:53:10.980724148 +0000 UTC" Apr 17 14:26:07.296195 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.295414 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12631h27m3.685313551s" Apr 17 14:26:07.297145 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:07.296634 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cb28525_d291_4976_8923_d29d0747a7dd.slice/crio-bf282c51ec635ab0701be96a5f484f74f26752f53242203e8797556991504d34 WatchSource:0}: Error finding container bf282c51ec635ab0701be96a5f484f74f26752f53242203e8797556991504d34: Status 404 returned error can't find the container with id bf282c51ec635ab0701be96a5f484f74f26752f53242203e8797556991504d34 Apr 17 14:26:07.297445 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:26:07.297403 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9244d507_858e_449a_af7a_1e8df83a35ae.slice/crio-ca8b0ec9a6009ce126d533e4d9404e73435a27e45986c38e74dc26a2f53e6eb4 WatchSource:0}: Error finding container ca8b0ec9a6009ce126d533e4d9404e73435a27e45986c38e74dc26a2f53e6eb4: Status 404 returned error can't find the container with id ca8b0ec9a6009ce126d533e4d9404e73435a27e45986c38e74dc26a2f53e6eb4 Apr 17 14:26:07.408415 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.408387 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" event={"ID":"9244d507-858e-449a-af7a-1e8df83a35ae","Type":"ContainerStarted","Data":"ca8b0ec9a6009ce126d533e4d9404e73435a27e45986c38e74dc26a2f53e6eb4"} Apr 17 14:26:07.409324 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.409302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7rgwd" event={"ID":"f1db9491-104f-4aed-8ac3-5c92d3bf1803","Type":"ContainerStarted","Data":"06e5ef889449169d89ba1e7d82313c7b0f8fbbff1ffeb25c36e2632d7c35b472"} Apr 17 14:26:07.410110 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.410091 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lw47q" event={"ID":"7cbba96f-e882-4326-b2b6-5a647245170e","Type":"ContainerStarted","Data":"c11aae0c4936a2f71eb8430506627cb88172d5260d416d5408498f9517401e37"} Apr 17 14:26:07.410927 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.410910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" event={"ID":"049f13c9-ae95-43ce-9dac-c99d4c07f258","Type":"ContainerStarted","Data":"3c1570006f5a233b1f2a67fe58b4151c8745d179b28f0f92e6dae820f6b389eb"} Apr 17 14:26:07.411780 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.411764 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h8bbt" event={"ID":"6cb28525-d291-4976-8923-d29d0747a7dd","Type":"ContainerStarted","Data":"bf282c51ec635ab0701be96a5f484f74f26752f53242203e8797556991504d34"} Apr 17 14:26:07.412600 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.412583 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7n9xk" event={"ID":"1391e92c-990f-482b-b724-0f08633e4abf","Type":"ContainerStarted","Data":"a590f8bec21d552b417512241abe6cc61ac59655ee08a61ea6af268b64fcf964"} Apr 17 14:26:07.413439 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.413413 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k5l86" event={"ID":"a660c0da-e639-42a0-9f51-34051bd62e42","Type":"ContainerStarted","Data":"183a85dd971718675c02ca9d150a2e0a6e2db8b0c59d6efa61050c76c4b8a2e8"} Apr 17 14:26:07.414222 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.414199 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" event={"ID":"6cef70aa-7a75-4dad-8f4c-b08582ff5b85","Type":"ContainerStarted","Data":"e1da7bc866e3c9483349a5c8db2d6f61e5db1cc8ec42067d7f450b6a7b859a4a"} Apr 17 14:26:07.415087 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.415067 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l87rd" event={"ID":"4fe5daab-b27c-4fc1-88c0-72d89cdccb01","Type":"ContainerStarted","Data":"4f57402961c7546abb73fc76a596ce52af72df2945eef495baf0ee88c1a316d6"} Apr 17 14:26:07.416370 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.416351 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" event={"ID":"87d4ca475b53fa90f2c794fc65d796bc","Type":"ContainerStarted","Data":"0e6cf1b9b8a023d985f727154e44b5a9ce361eef95c56c5d81c9f7f7c01eee8b"} Apr 17 14:26:07.432087 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.432044 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-199.ec2.internal" podStartSLOduration=2.432033085 podStartE2EDuration="2.432033085s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:26:07.431679643 +0000 UTC m=+3.623278762" watchObservedRunningTime="2026-04-17 14:26:07.432033085 +0000 UTC m=+3.623632205" Apr 17 14:26:07.918048 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.918015 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:26:07.983689 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.983637 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:07.983854 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:07.983723 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:07.983854 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:07.983846 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:07.983966 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:07.983909 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret podName:118312db-c4a8-48d8-a9c7-85eeef8a0b7b nodeName:}" failed. No retries permitted until 2026-04-17 14:26:09.983891805 +0000 UTC m=+6.175490908 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret") pod "global-pull-secret-syncer-dxl24" (UID: "118312db-c4a8-48d8-a9c7-85eeef8a0b7b") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:07.984378 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:07.984333 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:07.984478 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:07.984385 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs podName:85225238-59be-4245-9db5-4b8fa923bf28 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:09.984370242 +0000 UTC m=+6.175969344 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs") pod "network-metrics-daemon-tr5b8" (UID: "85225238-59be-4245-9db5-4b8fa923bf28") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:08.084341 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:08.084291 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkrw\" (UniqueName: \"kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw\") pod \"network-check-target-fg9xk\" (UID: \"8ce8568a-309e-41ba-9f5d-bd4386d9dcea\") " pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:08.084516 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:08.084497 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:26:08.084582 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:08.084517 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:26:08.084582 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:08.084530 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ppkrw for pod openshift-network-diagnostics/network-check-target-fg9xk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:08.084672 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:08.084588 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw podName:8ce8568a-309e-41ba-9f5d-bd4386d9dcea nodeName:}" failed. No retries permitted until 2026-04-17 14:26:10.084570364 +0000 UTC m=+6.276169478 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ppkrw" (UniqueName: "kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw") pod "network-check-target-fg9xk" (UID: "8ce8568a-309e-41ba-9f5d-bd4386d9dcea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:08.402765 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:08.401816 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:08.402765 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:08.401953 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:08.402765 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:08.402389 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:08.402765 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:08.402481 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:08.402765 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:08.402556 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:08.402765 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:08.402626 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:08.434804 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:08.434735 2572 generic.go:358] "Generic (PLEG): container finished" podID="416f09eead2ff50b4bdd823b81a2a56e" containerID="7b571d0867fcab45958c46512ea9e5019dd53eb12ee01053a6fbaa75c2cc02b1" exitCode=0 Apr 17 14:26:08.435666 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:08.435637 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" event={"ID":"416f09eead2ff50b4bdd823b81a2a56e","Type":"ContainerDied","Data":"7b571d0867fcab45958c46512ea9e5019dd53eb12ee01053a6fbaa75c2cc02b1"} Apr 17 14:26:09.450260 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:09.450220 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" event={"ID":"416f09eead2ff50b4bdd823b81a2a56e","Type":"ContainerStarted","Data":"a7ec42e2bf44bd8346376dc3f9283a349bba1710a41991f044a59f75ca46d6a3"} Apr 17 14:26:09.466575 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:09.466524 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-199.ec2.internal" podStartSLOduration=4.466507321 podStartE2EDuration="4.466507321s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:26:09.465172064 +0000 UTC m=+5.656771186" watchObservedRunningTime="2026-04-17 14:26:09.466507321 +0000 UTC m=+5.658106442" Apr 17 14:26:10.003156 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:10.002406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:10.003156 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:10.002481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:10.003156 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:10.002597 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:10.003156 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:10.002660 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret podName:118312db-c4a8-48d8-a9c7-85eeef8a0b7b nodeName:}" failed. No retries permitted until 2026-04-17 14:26:14.002642777 +0000 UTC m=+10.194241877 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret") pod "global-pull-secret-syncer-dxl24" (UID: "118312db-c4a8-48d8-a9c7-85eeef8a0b7b") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:10.003156 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:10.003061 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:10.003156 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:10.003111 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs podName:85225238-59be-4245-9db5-4b8fa923bf28 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:14.003095616 +0000 UTC m=+10.194694720 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs") pod "network-metrics-daemon-tr5b8" (UID: "85225238-59be-4245-9db5-4b8fa923bf28") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:10.103938 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:10.103335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkrw\" (UniqueName: \"kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw\") pod \"network-check-target-fg9xk\" (UID: \"8ce8568a-309e-41ba-9f5d-bd4386d9dcea\") " pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:10.103938 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:10.103521 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:26:10.103938 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:10.103542 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:26:10.103938 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:10.103556 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ppkrw for pod openshift-network-diagnostics/network-check-target-fg9xk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:10.103938 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:10.103613 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw podName:8ce8568a-309e-41ba-9f5d-bd4386d9dcea nodeName:}" failed. No retries permitted until 2026-04-17 14:26:14.103595076 +0000 UTC m=+10.295194201 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ppkrw" (UniqueName: "kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw") pod "network-check-target-fg9xk" (UID: "8ce8568a-309e-41ba-9f5d-bd4386d9dcea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:10.400272 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:10.400102 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:10.400272 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:10.400244 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:10.400482 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:10.400454 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:10.400540 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:10.400522 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:10.400782 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:10.400628 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:10.400782 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:10.400702 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:12.399234 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:12.399198 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:12.399681 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:12.399338 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:12.399681 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:12.399339 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:12.399681 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:12.399413 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:12.399681 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:12.399564 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:12.399681 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:12.399638 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:14.039251 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:14.039039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:14.039251 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:14.039117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:14.039251 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:14.039192 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:14.039251 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:14.039206 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:14.039251 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:14.039263 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs podName:85225238-59be-4245-9db5-4b8fa923bf28 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:22.039244244 +0000 UTC m=+18.230843362 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs") pod "network-metrics-daemon-tr5b8" (UID: "85225238-59be-4245-9db5-4b8fa923bf28") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:14.039917 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:14.039281 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret podName:118312db-c4a8-48d8-a9c7-85eeef8a0b7b nodeName:}" failed. No retries permitted until 2026-04-17 14:26:22.039272824 +0000 UTC m=+18.230871922 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret") pod "global-pull-secret-syncer-dxl24" (UID: "118312db-c4a8-48d8-a9c7-85eeef8a0b7b") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:14.140208 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:14.140171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkrw\" (UniqueName: \"kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw\") pod \"network-check-target-fg9xk\" (UID: \"8ce8568a-309e-41ba-9f5d-bd4386d9dcea\") " pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:14.140401 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:14.140347 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:26:14.140401 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:14.140372 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:26:14.140401 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:14.140384 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ppkrw for pod openshift-network-diagnostics/network-check-target-fg9xk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:14.140547 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:14.140444 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw podName:8ce8568a-309e-41ba-9f5d-bd4386d9dcea nodeName:}" failed. No retries permitted until 2026-04-17 14:26:22.140424307 +0000 UTC m=+18.332023413 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ppkrw" (UniqueName: "kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw") pod "network-check-target-fg9xk" (UID: "8ce8568a-309e-41ba-9f5d-bd4386d9dcea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:14.399497 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:14.399400 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:14.399667 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:14.399533 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:14.399667 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:14.399547 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:14.399667 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:14.399515 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:14.399667 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:14.399651 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:14.399859 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:14.399741 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:16.399606 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:16.399562 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:16.400086 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:16.399563 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:16.400086 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:16.399711 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:16.400086 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:16.399793 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:16.400086 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:16.399563 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:16.400086 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:16.399886 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:18.399372 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:18.399337 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:18.399823 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:18.399337 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:18.399823 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:18.399488 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:18.399823 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:18.399337 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:18.399823 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:18.399564 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:18.399823 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:18.399611 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:20.399393 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:20.399361 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:20.399838 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:20.399370 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:20.399838 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:20.399497 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:20.399838 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:20.399545 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:20.399838 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:20.399648 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:20.399838 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:20.399729 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:22.100636 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:22.100594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:22.101105 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:22.100666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:22.101105 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:22.100741 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:22.101105 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:22.100762 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:22.101105 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:22.100831 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs podName:85225238-59be-4245-9db5-4b8fa923bf28 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:38.100807874 +0000 UTC m=+34.292406989 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs") pod "network-metrics-daemon-tr5b8" (UID: "85225238-59be-4245-9db5-4b8fa923bf28") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:22.101105 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:22.100856 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret podName:118312db-c4a8-48d8-a9c7-85eeef8a0b7b nodeName:}" failed. 
No retries permitted until 2026-04-17 14:26:38.1008461 +0000 UTC m=+34.292445198 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret") pod "global-pull-secret-syncer-dxl24" (UID: "118312db-c4a8-48d8-a9c7-85eeef8a0b7b") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:22.201556 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:22.201512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkrw\" (UniqueName: \"kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw\") pod \"network-check-target-fg9xk\" (UID: \"8ce8568a-309e-41ba-9f5d-bd4386d9dcea\") " pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:22.201736 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:22.201701 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:26:22.201736 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:22.201728 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:26:22.201842 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:22.201739 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ppkrw for pod openshift-network-diagnostics/network-check-target-fg9xk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:22.201842 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:22.201801 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw podName:8ce8568a-309e-41ba-9f5d-bd4386d9dcea nodeName:}" failed. No retries permitted until 2026-04-17 14:26:38.201781511 +0000 UTC m=+34.393380626 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ppkrw" (UniqueName: "kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw") pod "network-check-target-fg9xk" (UID: "8ce8568a-309e-41ba-9f5d-bd4386d9dcea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:22.399556 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:22.399472 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:22.399702 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:22.399475 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:22.399702 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:22.399613 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:22.399799 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:22.399488 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:22.399799 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:22.399728 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:22.399891 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:22.399825 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:24.399814 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:24.399662 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:24.400129 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:24.399873 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:24.400129 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:24.399750 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:24.400129 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:24.399929 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:24.400129 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:24.399734 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:24.400129 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:24.400007 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:25.478007 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.477752 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h8bbt" event={"ID":"6cb28525-d291-4976-8923-d29d0747a7dd","Type":"ContainerStarted","Data":"a1cd849b001158498392d67c4944c7942d7c0d60d084a00b00e56c8b67a63960"} Apr 17 14:26:25.479442 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.479420 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7n9xk" event={"ID":"1391e92c-990f-482b-b724-0f08633e4abf","Type":"ContainerStarted","Data":"e6fc6c91a79fdd40e5f3c0b418601fb3b0634d852016369c0b913b1aef1d3021"} Apr 17 14:26:25.480731 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.480700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" event={"ID":"6cef70aa-7a75-4dad-8f4c-b08582ff5b85","Type":"ContainerStarted","Data":"61e7a14b4efc7940c8ddf9dd39bab0aa2f50528ef65f1a6a3976c3d89f8bd8df"} Apr 17 14:26:25.481993 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.481946 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l87rd" event={"ID":"4fe5daab-b27c-4fc1-88c0-72d89cdccb01","Type":"ContainerStarted","Data":"53f26f836e72e69b14f41e6d096c3ae78ffda28889eb0e4d52bcf5f40498a0fa"} Apr 17 14:26:25.483336 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.483316 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" event={"ID":"9244d507-858e-449a-af7a-1e8df83a35ae","Type":"ContainerStarted","Data":"394c6e71cfc8afe3d9e3e348c1c76da1b700082c3ac4040a942a8ea8dd8f9d88"} Apr 17 14:26:25.484601 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.484579 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7rgwd" event={"ID":"f1db9491-104f-4aed-8ac3-5c92d3bf1803","Type":"ContainerStarted","Data":"2769fec8e7a0ba67e3114b1ad4304cca51b05b67a6487a948af0d39cdf79fedb"} Apr 17 14:26:25.485933 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.485914 2572 generic.go:358] "Generic (PLEG): container finished" podID="7cbba96f-e882-4326-b2b6-5a647245170e" containerID="57bfd1ac8f2136cf3b4c6cae3145d97d8fe3db49fc11684d616feaba48631d46" exitCode=0 Apr 17 14:26:25.486073 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.485971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lw47q" event={"ID":"7cbba96f-e882-4326-b2b6-5a647245170e","Type":"ContainerDied","Data":"57bfd1ac8f2136cf3b4c6cae3145d97d8fe3db49fc11684d616feaba48631d46"} Apr 17 14:26:25.488642 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.488622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" event={"ID":"049f13c9-ae95-43ce-9dac-c99d4c07f258","Type":"ContainerStarted","Data":"629bf3c7383fa849ddf001d7e53e1bbd0a5a7ac7360a8e9ccdf91d874f83739a"} Apr 17 14:26:25.488642 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.488646 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" event={"ID":"049f13c9-ae95-43ce-9dac-c99d4c07f258","Type":"ContainerStarted","Data":"90efad20902551bf1b091ace818189e46a5e6e1b4dd90909e0c6e35c3f65eebd"} Apr 17 14:26:25.488778 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.488661 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" event={"ID":"049f13c9-ae95-43ce-9dac-c99d4c07f258","Type":"ContainerStarted","Data":"a0f730e670d5ceba8a15d9d9dc671a0ff7bcc17573243ba30a86cccff1149e80"} Apr 17 14:26:25.488778 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.488672 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" event={"ID":"049f13c9-ae95-43ce-9dac-c99d4c07f258","Type":"ContainerStarted","Data":"823f942a318301ed63a39700567238bcabf5fe43b203439e1d8dcabaaf0c5666"} Apr 17 14:26:25.488778 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.488682 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" event={"ID":"049f13c9-ae95-43ce-9dac-c99d4c07f258","Type":"ContainerStarted","Data":"4b8d8645257e3ebcedf1f36c2f328559a510b78778335099650228e1dbb268b0"} Apr 17 14:26:25.491773 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.491729 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-h8bbt" podStartSLOduration=12.526092773 podStartE2EDuration="21.491718672s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" firstStartedPulling="2026-04-17 14:26:07.29863137 +0000 UTC m=+3.490230481" lastFinishedPulling="2026-04-17 14:26:16.264257267 +0000 UTC m=+12.455856380" observedRunningTime="2026-04-17 14:26:25.491229188 +0000 UTC m=+21.682828303" watchObservedRunningTime="2026-04-17 14:26:25.491718672 +0000 UTC m=+21.683317793" Apr 17 14:26:25.526716 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.526664 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7rgwd" podStartSLOduration=4.214321494 podStartE2EDuration="21.526644823s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" firstStartedPulling="2026-04-17 14:26:07.295288503 +0000 UTC m=+3.486887602" lastFinishedPulling="2026-04-17 14:26:24.607611819 +0000 UTC m=+20.799210931" observedRunningTime="2026-04-17 14:26:25.526044674 +0000 UTC m=+21.717643795" watchObservedRunningTime="2026-04-17 14:26:25.526644823 +0000 UTC m=+21.718243963" Apr 17 14:26:25.564962 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.564912 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7n9xk" podStartSLOduration=4.301223859 podStartE2EDuration="21.564896091s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" firstStartedPulling="2026-04-17 14:26:07.298583117 +0000 UTC m=+3.490182215" lastFinishedPulling="2026-04-17 14:26:24.562255332 +0000 UTC m=+20.753854447" observedRunningTime="2026-04-17 14:26:25.542731845 +0000 UTC m=+21.734330965" watchObservedRunningTime="2026-04-17 14:26:25.564896091 +0000 UTC m=+21.756495211" Apr 17 14:26:25.565093 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.565063 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hgwmv" podStartSLOduration=4.291373415 podStartE2EDuration="21.565056615s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" firstStartedPulling="2026-04-17 14:26:07.291183272 +0000 UTC m=+3.482782372" lastFinishedPulling="2026-04-17 14:26:24.564866461 +0000 UTC m=+20.756465572" observedRunningTime="2026-04-17 14:26:25.565049875 +0000 UTC m=+21.756648998" watchObservedRunningTime="2026-04-17 14:26:25.565056615 +0000 UTC m=+21.756655735" Apr 17 14:26:25.583484 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.583432 2572 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-dns/node-resolver-l87rd" podStartSLOduration=4.35557777 podStartE2EDuration="21.583412953s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" firstStartedPulling="2026-04-17 14:26:07.290216865 +0000 UTC m=+3.481815963" lastFinishedPulling="2026-04-17 14:26:24.518052044 +0000 UTC m=+20.709651146" observedRunningTime="2026-04-17 14:26:25.583037697 +0000 UTC m=+21.774636814" watchObservedRunningTime="2026-04-17 14:26:25.583412953 +0000 UTC m=+21.775012076" Apr 17 14:26:25.708357 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:25.708185 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 14:26:26.324004 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:26.323820 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T14:26:25.708353122Z","UUID":"c7bb79b0-e636-4965-bf41-52f19e309c8f","Handler":null,"Name":"","Endpoint":""} Apr 17 14:26:26.326640 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:26.326614 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 14:26:26.326782 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:26.326648 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 14:26:26.399443 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:26.399412 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:26.399690 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:26.399470 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:26.399690 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:26.399588 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:26.399832 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:26.399707 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:26.399832 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:26.399735 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:26.399832 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:26.399820 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:26.492790 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:26.492583 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k5l86" event={"ID":"a660c0da-e639-42a0-9f51-34051bd62e42","Type":"ContainerStarted","Data":"28681634e9dd30cd73502923597e591e6d19b585acc8197bea29ca0ee3e0d66d"} Apr 17 14:26:26.495253 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:26.495224 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" event={"ID":"9244d507-858e-449a-af7a-1e8df83a35ae","Type":"ContainerStarted","Data":"94c59135f7ef7195010b7f31a1879e445e6f85a0e955c14ca3d2986781b75153"} Apr 17 14:26:26.499432 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:26.499049 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" event={"ID":"049f13c9-ae95-43ce-9dac-c99d4c07f258","Type":"ContainerStarted","Data":"81a4bf4911916a67b477a515ee1b4fb91338481d8bdb8b5211e3f284e85133ae"} Apr 17 14:26:26.505994 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:26.505936 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-k5l86" podStartSLOduration=5.240925816 podStartE2EDuration="22.505923505s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" firstStartedPulling="2026-04-17 14:26:07.296923795 +0000 UTC m=+3.488522909" lastFinishedPulling="2026-04-17 14:26:24.561921495 +0000 UTC m=+20.753520598" observedRunningTime="2026-04-17 14:26:26.505922889 +0000 UTC m=+22.697522020" watchObservedRunningTime="2026-04-17 14:26:26.505923505 +0000 UTC m=+22.697522637" Apr 17 14:26:27.503619 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:27.503568 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" event={"ID":"9244d507-858e-449a-af7a-1e8df83a35ae","Type":"ContainerStarted","Data":"768741c288fa9d8c0e2a5692d2f99678b78aa409efe1e39312a48feddd814e12"} Apr 17 14:26:28.399669 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:28.399630 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:28.399669 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:28.399650 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:28.399909 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:28.399630 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:28.399909 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:28.399786 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:28.399909 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:28.399817 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:28.399909 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:28.399879 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:28.508459 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:28.508424 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" event={"ID":"049f13c9-ae95-43ce-9dac-c99d4c07f258","Type":"ContainerStarted","Data":"08cfa66b980cedccc58083a9a9ce331b01b3e0ecf2612528666095244d9f6502"} Apr 17 14:26:28.623072 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:28.623042 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7n9xk" Apr 17 14:26:28.623631 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:28.623607 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7n9xk" Apr 17 14:26:28.637347 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:28.637300 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dg7bw" podStartSLOduration=5.499266166 podStartE2EDuration="24.63728629s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" firstStartedPulling="2026-04-17 14:26:07.299213035 +0000 UTC m=+3.490812145" lastFinishedPulling="2026-04-17 14:26:26.43723317 +0000 UTC m=+22.628832269" observedRunningTime="2026-04-17 14:26:27.519269944 +0000 UTC m=+23.710869065" watchObservedRunningTime="2026-04-17 14:26:28.63728629 +0000 UTC m=+24.828885409" Apr 17 14:26:29.510776 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:29.510747 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7n9xk" Apr 17 14:26:29.511477 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:29.511459 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7n9xk" Apr 17 14:26:30.398866 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.398648 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:30.399045 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.398702 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:30.399045 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:30.398892 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:30.399045 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:30.398962 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:30.399045 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.398721 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:30.399255 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:30.399059 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:30.514390 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.514360 2572 generic.go:358] "Generic (PLEG): container finished" podID="7cbba96f-e882-4326-b2b6-5a647245170e" containerID="707a84abe8ea5704858428353127f3f7f27d1ae1a64cb1bda8df1548862959ba" exitCode=0 Apr 17 14:26:30.514915 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.514430 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lw47q" event={"ID":"7cbba96f-e882-4326-b2b6-5a647245170e","Type":"ContainerDied","Data":"707a84abe8ea5704858428353127f3f7f27d1ae1a64cb1bda8df1548862959ba"} Apr 17 14:26:30.517773 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.517750 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" event={"ID":"049f13c9-ae95-43ce-9dac-c99d4c07f258","Type":"ContainerStarted","Data":"6001ef34a191f5b42709d72efc60924149d2b91b09265d8069be23873ceb3adf"} Apr 17 14:26:30.518003 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.517953 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:30.518003 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.517998 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:30.518128 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.518011 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:30.532683 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.532663 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:30.532806 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.532728 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:26:30.560529 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:30.560480 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" podStartSLOduration=8.805189859 podStartE2EDuration="26.560466544s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" firstStartedPulling="2026-04-17 14:26:07.288791233 +0000 UTC m=+3.480390332" lastFinishedPulling="2026-04-17 14:26:25.044067918 +0000 UTC m=+21.235667017" observedRunningTime="2026-04-17 14:26:30.559036438 +0000 UTC m=+26.750635553" watchObservedRunningTime="2026-04-17 14:26:30.560466544 +0000 UTC m=+26.752065663" Apr 17 14:26:31.833556 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:31.833354 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tr5b8"] Apr 17 14:26:31.834034 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:31.833610 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:31.834034 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:31.833693 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:31.835864 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:31.835840 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fg9xk"] Apr 17 14:26:31.836004 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:31.835949 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:31.836076 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:31.836059 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:31.846635 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:31.846616 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dxl24"] Apr 17 14:26:31.846706 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:31.846696 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:31.846774 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:31.846761 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:32.524599 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:32.524566 2572 generic.go:358] "Generic (PLEG): container finished" podID="7cbba96f-e882-4326-b2b6-5a647245170e" containerID="c86938c3bf8e64152b8529ddee2cd3d819910b39ae98d9e4803320051ff18222" exitCode=0 Apr 17 14:26:32.524792 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:32.524652 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lw47q" event={"ID":"7cbba96f-e882-4326-b2b6-5a647245170e","Type":"ContainerDied","Data":"c86938c3bf8e64152b8529ddee2cd3d819910b39ae98d9e4803320051ff18222"} Apr 17 14:26:33.399647 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:33.399554 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:33.400179 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:33.399553 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:33.400179 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:33.399681 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:33.400179 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:33.399552 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:33.400179 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:33.399729 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:33.400179 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:33.399793 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:33.528432 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:33.528395 2572 generic.go:358] "Generic (PLEG): container finished" podID="7cbba96f-e882-4326-b2b6-5a647245170e" containerID="9b1e4a0e362b153cc5603a81f8ae935b9793b7a0990bba7aed0fe3c8467cfd6c" exitCode=0 Apr 17 14:26:33.528558 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:33.528462 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lw47q" event={"ID":"7cbba96f-e882-4326-b2b6-5a647245170e","Type":"ContainerDied","Data":"9b1e4a0e362b153cc5603a81f8ae935b9793b7a0990bba7aed0fe3c8467cfd6c"} Apr 17 14:26:35.398905 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:35.398863 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:35.399647 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:35.398996 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:35.399647 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:35.399009 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:35.399647 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:35.399022 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:35.399647 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:35.399117 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:35.399647 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:35.399194 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:37.399373 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.399148 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:37.399796 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.399233 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:37.399796 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:37.399471 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxl24" podUID="118312db-c4a8-48d8-a9c7-85eeef8a0b7b" Apr 17 14:26:37.399796 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:37.399542 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:26:37.399796 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.399242 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:37.399796 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:37.399624 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fg9xk" podUID="8ce8568a-309e-41ba-9f5d-bd4386d9dcea" Apr 17 14:26:37.658668 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.658580 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-199.ec2.internal" event="NodeReady" Apr 17 14:26:37.658845 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.658729 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 14:26:37.704118 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.704087 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qcjlc"] Apr 17 14:26:37.729433 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.729406 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cwbck"] Apr 17 14:26:37.729604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.729548 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:37.731698 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.731678 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 14:26:37.732069 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.732044 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p2qzh\"" Apr 17 14:26:37.732197 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.732099 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 14:26:37.748746 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.748727 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cwbck"] Apr 17 14:26:37.748746 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.748748 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qcjlc"] Apr 17 14:26:37.748903 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.748840 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:26:37.751237 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.751213 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 14:26:37.751356 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.751334 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 14:26:37.751460 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.751207 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 14:26:37.751624 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.751604 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qz9tk\"" Apr 17 14:26:37.822033 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.821968 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:37.822213 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.822055 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7611fae-10f2-4c93-a3ad-2f23c7e94788-tmp-dir\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:37.822213 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.822099 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7611fae-10f2-4c93-a3ad-2f23c7e94788-config-volume\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:37.822213 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.822130 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ds5j\" (UniqueName: 
\"kubernetes.io/projected/a7611fae-10f2-4c93-a3ad-2f23c7e94788-kube-api-access-5ds5j\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:37.923335 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.923247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7611fae-10f2-4c93-a3ad-2f23c7e94788-tmp-dir\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:37.923335 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.923326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7611fae-10f2-4c93-a3ad-2f23c7e94788-config-volume\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:37.923546 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.923361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ds5j\" (UniqueName: \"kubernetes.io/projected/a7611fae-10f2-4c93-a3ad-2f23c7e94788-kube-api-access-5ds5j\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:37.923546 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.923406 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sj8s\" (UniqueName: \"kubernetes.io/projected/57f5fb50-c788-4c19-a249-06bd9abaa213-kube-api-access-5sj8s\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:26:37.923546 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.923480 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:37.923546 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.923502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:26:37.923714 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:37.923607 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:26:37.923714 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:37.923658 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls podName:a7611fae-10f2-4c93-a3ad-2f23c7e94788 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:38.423640595 +0000 UTC m=+34.615239694 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls") pod "dns-default-qcjlc" (UID: "a7611fae-10f2-4c93-a3ad-2f23c7e94788") : secret "dns-default-metrics-tls" not found Apr 17 14:26:37.923714 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.923670 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7611fae-10f2-4c93-a3ad-2f23c7e94788-tmp-dir\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:37.924024 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.924003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7611fae-10f2-4c93-a3ad-2f23c7e94788-config-volume\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:37.935214 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:37.935189 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ds5j\" (UniqueName: \"kubernetes.io/projected/a7611fae-10f2-4c93-a3ad-2f23c7e94788-kube-api-access-5ds5j\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:38.024219 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:38.024182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:26:38.024456 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:38.024257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sj8s\" (UniqueName: \"kubernetes.io/projected/57f5fb50-c788-4c19-a249-06bd9abaa213-kube-api-access-5sj8s\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:26:38.024456 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.024355 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:26:38.024456 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.024429 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert podName:57f5fb50-c788-4c19-a249-06bd9abaa213 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:38.524407662 +0000 UTC m=+34.716006800 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert") pod "ingress-canary-cwbck" (UID: "57f5fb50-c788-4c19-a249-06bd9abaa213") : secret "canary-serving-cert" not found Apr 17 14:26:38.032784 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:38.032757 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sj8s\" (UniqueName: \"kubernetes.io/projected/57f5fb50-c788-4c19-a249-06bd9abaa213-kube-api-access-5sj8s\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:26:38.124614 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:38.124571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:38.124818 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:38.124635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:38.124818 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.124746 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:38.124818 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.124811 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs podName:85225238-59be-4245-9db5-4b8fa923bf28 nodeName:}" failed. No retries permitted until 2026-04-17 14:27:10.124797842 +0000 UTC m=+66.316396949 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs") pod "network-metrics-daemon-tr5b8" (UID: "85225238-59be-4245-9db5-4b8fa923bf28") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:26:38.124996 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.124747 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:38.124996 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.124934 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret podName:118312db-c4a8-48d8-a9c7-85eeef8a0b7b nodeName:}" failed. No retries permitted until 2026-04-17 14:27:10.124917163 +0000 UTC m=+66.316516277 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret") pod "global-pull-secret-syncer-dxl24" (UID: "118312db-c4a8-48d8-a9c7-85eeef8a0b7b") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:26:38.225478 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:38.225389 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkrw\" (UniqueName: \"kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw\") pod \"network-check-target-fg9xk\" (UID: \"8ce8568a-309e-41ba-9f5d-bd4386d9dcea\") " pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:38.225623 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.225555 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:26:38.225623 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.225578 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:26:38.225623 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.225588 2572 projected.go:194] Error preparing data for projected volume kube-api-access-ppkrw for pod openshift-network-diagnostics/network-check-target-fg9xk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:38.225755 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.225652 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw podName:8ce8568a-309e-41ba-9f5d-bd4386d9dcea nodeName:}" failed. No retries permitted until 2026-04-17 14:27:10.225631699 +0000 UTC m=+66.417230800 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ppkrw" (UniqueName: "kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw") pod "network-check-target-fg9xk" (UID: "8ce8568a-309e-41ba-9f5d-bd4386d9dcea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:26:38.427118 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:38.427080 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:38.427527 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.427227 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:26:38.427527 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.427299 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls podName:a7611fae-10f2-4c93-a3ad-2f23c7e94788 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:39.427275294 +0000 UTC m=+35.618874397 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls") pod "dns-default-qcjlc" (UID: "a7611fae-10f2-4c93-a3ad-2f23c7e94788") : secret "dns-default-metrics-tls" not found Apr 17 14:26:38.527825 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:38.527727 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:26:38.528019 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.527895 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:26:38.528019 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:38.527969 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert podName:57f5fb50-c788-4c19-a249-06bd9abaa213 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:39.527947794 +0000 UTC m=+35.719546893 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert") pod "ingress-canary-cwbck" (UID: "57f5fb50-c788-4c19-a249-06bd9abaa213") : secret "canary-serving-cert" not found Apr 17 14:26:39.398831 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:39.398796 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:26:39.399012 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:39.398801 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:26:39.399012 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:39.398801 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:26:39.401255 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:39.401236 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 14:26:39.402171 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:39.402147 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 14:26:39.402171 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:39.402157 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lqr8w\"" Apr 17 14:26:39.402390 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:39.402182 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 14:26:39.402390 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:39.402250 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w222h\"" Apr 17 14:26:39.402390 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:39.402257 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:26:39.435464 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:39.435441 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:39.435742 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:39.435528 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:26:39.435742 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:39.435574 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls podName:a7611fae-10f2-4c93-a3ad-2f23c7e94788 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:41.435561046 +0000 UTC m=+37.627160144 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls") pod "dns-default-qcjlc" (UID: "a7611fae-10f2-4c93-a3ad-2f23c7e94788") : secret "dns-default-metrics-tls" not found Apr 17 14:26:39.536137 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:39.536105 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:26:39.536304 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:39.536203 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:26:39.536304 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:39.536258 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert podName:57f5fb50-c788-4c19-a249-06bd9abaa213 nodeName:}" failed. 
No retries permitted until 2026-04-17 14:26:41.536244811 +0000 UTC m=+37.727843910 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert") pod "ingress-canary-cwbck" (UID: "57f5fb50-c788-4c19-a249-06bd9abaa213") : secret "canary-serving-cert" not found Apr 17 14:26:40.545314 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:40.545283 2572 generic.go:358] "Generic (PLEG): container finished" podID="7cbba96f-e882-4326-b2b6-5a647245170e" containerID="9f62ed7f45abeb8b03bf8771722e9c2052b9dd71b8e9c907d2429a58962f5f2f" exitCode=0 Apr 17 14:26:40.545745 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:40.545331 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lw47q" event={"ID":"7cbba96f-e882-4326-b2b6-5a647245170e","Type":"ContainerDied","Data":"9f62ed7f45abeb8b03bf8771722e9c2052b9dd71b8e9c907d2429a58962f5f2f"} Apr 17 14:26:41.449438 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:41.449403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:41.449555 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:41.449519 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:26:41.449596 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:41.449571 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls podName:a7611fae-10f2-4c93-a3ad-2f23c7e94788 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:45.449557576 +0000 UTC m=+41.641156674 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls") pod "dns-default-qcjlc" (UID: "a7611fae-10f2-4c93-a3ad-2f23c7e94788") : secret "dns-default-metrics-tls" not found Apr 17 14:26:41.549738 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:41.549706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:26:41.550148 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:41.549823 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:26:41.550148 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:41.549871 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert podName:57f5fb50-c788-4c19-a249-06bd9abaa213 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:45.54985375 +0000 UTC m=+41.741452848 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert") pod "ingress-canary-cwbck" (UID: "57f5fb50-c788-4c19-a249-06bd9abaa213") : secret "canary-serving-cert" not found Apr 17 14:26:41.550148 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:41.550115 2572 generic.go:358] "Generic (PLEG): container finished" podID="7cbba96f-e882-4326-b2b6-5a647245170e" containerID="617d87b7b2e9418f9b25a7428e6f1e56a95e29bb3d42a3cb523f663b57907c9f" exitCode=0 Apr 17 14:26:41.550317 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:41.550154 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lw47q" event={"ID":"7cbba96f-e882-4326-b2b6-5a647245170e","Type":"ContainerDied","Data":"617d87b7b2e9418f9b25a7428e6f1e56a95e29bb3d42a3cb523f663b57907c9f"} Apr 17 14:26:42.554506 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:42.554469 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lw47q" event={"ID":"7cbba96f-e882-4326-b2b6-5a647245170e","Type":"ContainerStarted","Data":"704622c7ed94ab2239a720e333559f33d19072cd5afe34971f35ab2fe9da0b9c"} Apr 17 14:26:42.576061 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:42.576013 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lw47q" podStartSLOduration=6.273086703 podStartE2EDuration="38.575999724s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" firstStartedPulling="2026-04-17 14:26:07.294579826 +0000 UTC m=+3.486178927" lastFinishedPulling="2026-04-17 14:26:39.597492836 +0000 UTC m=+35.789091948" observedRunningTime="2026-04-17 14:26:42.57582654 +0000 UTC m=+38.767425659" watchObservedRunningTime="2026-04-17 14:26:42.575999724 +0000 UTC m=+38.767598868" Apr 17 14:26:45.479145 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:45.479106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:45.479542 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:45.479221 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:26:45.479542 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:45.479270 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls podName:a7611fae-10f2-4c93-a3ad-2f23c7e94788 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:53.479257414 +0000 UTC m=+49.670856512 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls") pod "dns-default-qcjlc" (UID: "a7611fae-10f2-4c93-a3ad-2f23c7e94788") : secret "dns-default-metrics-tls" not found Apr 17 14:26:45.579624 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:45.579593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:26:45.579789 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:45.579700 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:26:45.579789 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:45.579748 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert podName:57f5fb50-c788-4c19-a249-06bd9abaa213 nodeName:}" failed. No retries permitted until 2026-04-17 14:26:53.579734897 +0000 UTC m=+49.771333995 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert") pod "ingress-canary-cwbck" (UID: "57f5fb50-c788-4c19-a249-06bd9abaa213") : secret "canary-serving-cert" not found Apr 17 14:26:53.533721 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:53.533684 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:26:53.534135 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:53.533828 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:26:53.534135 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:53.533894 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls podName:a7611fae-10f2-4c93-a3ad-2f23c7e94788 nodeName:}" failed. No retries permitted until 2026-04-17 14:27:09.533877501 +0000 UTC m=+65.725476599 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls") pod "dns-default-qcjlc" (UID: "a7611fae-10f2-4c93-a3ad-2f23c7e94788") : secret "dns-default-metrics-tls" not found Apr 17 14:26:53.634834 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:26:53.634795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:26:53.635027 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:53.634943 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:26:53.635027 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:26:53.635026 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert podName:57f5fb50-c788-4c19-a249-06bd9abaa213 nodeName:}" failed. No retries permitted until 2026-04-17 14:27:09.635010473 +0000 UTC m=+65.826609571 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert") pod "ingress-canary-cwbck" (UID: "57f5fb50-c788-4c19-a249-06bd9abaa213") : secret "canary-serving-cert" not found Apr 17 14:27:02.546505 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:02.546476 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7sxz5" Apr 17 14:27:09.542039 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:09.541960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:27:09.542416 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:09.542125 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:27:09.542416 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:09.542213 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls podName:a7611fae-10f2-4c93-a3ad-2f23c7e94788 nodeName:}" failed. No retries permitted until 2026-04-17 14:27:41.542194162 +0000 UTC m=+97.733793264 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls") pod "dns-default-qcjlc" (UID: "a7611fae-10f2-4c93-a3ad-2f23c7e94788") : secret "dns-default-metrics-tls" not found Apr 17 14:27:09.642494 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:09.642448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:27:09.642675 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:09.642553 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:27:09.642675 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:09.642624 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert podName:57f5fb50-c788-4c19-a249-06bd9abaa213 nodeName:}" failed. No retries permitted until 2026-04-17 14:27:41.64260905 +0000 UTC m=+97.834208147 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert") pod "ingress-canary-cwbck" (UID: "57f5fb50-c788-4c19-a249-06bd9abaa213") : secret "canary-serving-cert" not found Apr 17 14:27:10.146426 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.146388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:27:10.146621 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.146454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:27:10.149008 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.148967 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:27:10.149008 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.148992 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 14:27:10.156963 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:10.156939 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 14:27:10.157034 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:10.157021 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs podName:85225238-59be-4245-9db5-4b8fa923bf28 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:14.156999934 +0000 UTC m=+130.348599032 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs") pod "network-metrics-daemon-tr5b8" (UID: "85225238-59be-4245-9db5-4b8fa923bf28") : secret "metrics-daemon-secret" not found Apr 17 14:27:10.159947 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.159930 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/118312db-c4a8-48d8-a9c7-85eeef8a0b7b-original-pull-secret\") pod \"global-pull-secret-syncer-dxl24\" (UID: \"118312db-c4a8-48d8-a9c7-85eeef8a0b7b\") " pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:27:10.247423 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.247389 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkrw\" (UniqueName: \"kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw\") pod \"network-check-target-fg9xk\" (UID: \"8ce8568a-309e-41ba-9f5d-bd4386d9dcea\") " pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:27:10.249840 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.249824 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 14:27:10.260534 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.260517 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 14:27:10.271635 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.271607 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppkrw\" (UniqueName: \"kubernetes.io/projected/8ce8568a-309e-41ba-9f5d-bd4386d9dcea-kube-api-access-ppkrw\") pod \"network-check-target-fg9xk\" (UID: \"8ce8568a-309e-41ba-9f5d-bd4386d9dcea\") " pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:27:10.311594 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.311574 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lqr8w\"" Apr 17 14:27:10.313349 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.313333 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxl24" Apr 17 14:27:10.320469 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.320450 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:27:10.489213 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.489184 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fg9xk"] Apr 17 14:27:10.493609 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:27:10.493579 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ce8568a_309e_41ba_9f5d_bd4386d9dcea.slice/crio-fec88b30e1b445737892ea5325814d9fc6ad80a4dabc540a4a503f45c00b942a WatchSource:0}: Error finding container fec88b30e1b445737892ea5325814d9fc6ad80a4dabc540a4a503f45c00b942a: Status 404 returned error can't find the container with id fec88b30e1b445737892ea5325814d9fc6ad80a4dabc540a4a503f45c00b942a Apr 17 14:27:10.498885 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.498863 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dxl24"] Apr 17 14:27:10.502497 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:27:10.502474 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod118312db_c4a8_48d8_a9c7_85eeef8a0b7b.slice/crio-66b3d73f107b7c5d3dc85bf0c2fdf6de5e6ee8ae471113d971a4ef5fa3d37f08 WatchSource:0}: Error finding container 66b3d73f107b7c5d3dc85bf0c2fdf6de5e6ee8ae471113d971a4ef5fa3d37f08: Status 404 returned error can't find the container with id 66b3d73f107b7c5d3dc85bf0c2fdf6de5e6ee8ae471113d971a4ef5fa3d37f08 Apr 17 14:27:10.608044 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.608010 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fg9xk" event={"ID":"8ce8568a-309e-41ba-9f5d-bd4386d9dcea","Type":"ContainerStarted","Data":"fec88b30e1b445737892ea5325814d9fc6ad80a4dabc540a4a503f45c00b942a"} Apr 17 14:27:10.609047 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:10.609023 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dxl24" event={"ID":"118312db-c4a8-48d8-a9c7-85eeef8a0b7b","Type":"ContainerStarted","Data":"66b3d73f107b7c5d3dc85bf0c2fdf6de5e6ee8ae471113d971a4ef5fa3d37f08"} Apr 17 14:27:15.621140 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:15.621098 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fg9xk" event={"ID":"8ce8568a-309e-41ba-9f5d-bd4386d9dcea","Type":"ContainerStarted","Data":"c61eda9eb5757f9f0a3be4dfa2b78eaa91d1b1305327879af9317ce61a84c910"} Apr 17 14:27:15.621651 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:15.621232 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:27:15.622325 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:15.622298 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dxl24" event={"ID":"118312db-c4a8-48d8-a9c7-85eeef8a0b7b","Type":"ContainerStarted","Data":"2508fb82327427336308110840e1d738d0250f78c7f2117945343ccf99539b8b"} Apr 17 14:27:15.636295 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:15.636254 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fg9xk" podStartSLOduration=67.312237841 podStartE2EDuration="1m11.636243765s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" 
firstStartedPulling="2026-04-17 14:27:10.495388085 +0000 UTC m=+66.686987183" lastFinishedPulling="2026-04-17 14:27:14.819394006 +0000 UTC m=+71.010993107" observedRunningTime="2026-04-17 14:27:15.635617466 +0000 UTC m=+71.827216587" watchObservedRunningTime="2026-04-17 14:27:15.636243765 +0000 UTC m=+71.827842884" Apr 17 14:27:15.648328 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:15.648291 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dxl24" podStartSLOduration=66.330183569 podStartE2EDuration="1m10.648280337s" podCreationTimestamp="2026-04-17 14:26:05 +0000 UTC" firstStartedPulling="2026-04-17 14:27:10.504021241 +0000 UTC m=+66.695620342" lastFinishedPulling="2026-04-17 14:27:14.822118011 +0000 UTC m=+71.013717110" observedRunningTime="2026-04-17 14:27:15.648227793 +0000 UTC m=+71.839826910" watchObservedRunningTime="2026-04-17 14:27:15.648280337 +0000 UTC m=+71.839879456" Apr 17 14:27:41.569844 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:41.569814 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:27:41.570254 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:41.569930 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:27:41.570254 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:41.570002 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls podName:a7611fae-10f2-4c93-a3ad-2f23c7e94788 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:45.56996936 +0000 UTC m=+161.761568457 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls") pod "dns-default-qcjlc" (UID: "a7611fae-10f2-4c93-a3ad-2f23c7e94788") : secret "dns-default-metrics-tls" not found Apr 17 14:27:41.670771 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:41.670680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:27:41.670907 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:41.670772 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:27:41.670907 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:41.670853 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert podName:57f5fb50-c788-4c19-a249-06bd9abaa213 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:45.670836282 +0000 UTC m=+161.862435381 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert") pod "ingress-canary-cwbck" (UID: "57f5fb50-c788-4c19-a249-06bd9abaa213") : secret "canary-serving-cert" not found Apr 17 14:27:46.625989 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:46.625940 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fg9xk" Apr 17 14:27:58.732867 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.732829 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hzkc8"] Apr 17 14:27:58.735641 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.735619 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hzkc8" Apr 17 14:27:58.736234 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.736214 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5449587f4d-66vdl"] Apr 17 14:27:58.738125 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.738099 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:27:58.738225 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.738145 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-cx8gv\"" Apr 17 14:27:58.738225 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.738157 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 14:27:58.738830 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.738817 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.740858 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.740841 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-v4zl2\"" Apr 17 14:27:58.741321 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.741089 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 14:27:58.741321 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.741106 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 14:27:58.741321 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.741178 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 14:27:58.741321 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.741093 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 14:27:58.741321 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.741110 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 14:27:58.741603 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.741438 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 14:27:58.744630 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.744613 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hzkc8"] Apr 17 14:27:58.751491 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.751460 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5449587f4d-66vdl"] Apr 17 14:27:58.782679 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.782654 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj424\" (UniqueName: \"kubernetes.io/projected/5bf7af28-4310-4da2-a1f2-e10c3b867cc6-kube-api-access-zj424\") pod \"volume-data-source-validator-7c6cbb6c87-hzkc8\" (UID: \"5bf7af28-4310-4da2-a1f2-e10c3b867cc6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hzkc8" Apr 17 14:27:58.843295 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.843264 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn"] Apr 17 14:27:58.849197 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.849170 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk"] Apr 17 14:27:58.849325 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.849307 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:27:58.851644 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.851618 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:27:58.851644 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.851634 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-g8nn4\"" Apr 17 14:27:58.852029 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.852014 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-666d95ffc5-vtfqx"] Apr 17 14:27:58.852100 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.852084 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 14:27:58.852170 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.852157 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" Apr 17 14:27:58.852355 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.852333 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 14:27:58.854654 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.854607 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:58.855340 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.855322 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 14:27:58.855585 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.855568 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 14:27:58.855674 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.855616 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:27:58.855746 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.855729 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-sd25t\"" Apr 17 14:27:58.855797 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.855783 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 14:27:58.857848 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.857832 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 14:27:58.859176 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.859156 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 14:27:58.859431 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.859415 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 14:27:58.859534 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.859512 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk"] Apr 17 14:27:58.860213 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.860071 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gbrc2\"" Apr 17 14:27:58.860388 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.860361 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn"] Apr 17 14:27:58.863949 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.863929 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 14:27:58.869647 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.869628 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-666d95ffc5-vtfqx"] Apr 17 14:27:58.882932 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.882914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-stats-auth\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.883039 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.882947 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj424\" (UniqueName: \"kubernetes.io/projected/5bf7af28-4310-4da2-a1f2-e10c3b867cc6-kube-api-access-zj424\") pod \"volume-data-source-validator-7c6cbb6c87-hzkc8\" (UID: \"5bf7af28-4310-4da2-a1f2-e10c3b867cc6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hzkc8" Apr 17 14:27:58.883039 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.882968 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-default-certificate\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.883039 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.883004 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjsbr\" (UniqueName: \"kubernetes.io/projected/40d17327-7330-4bba-8349-541f00eae77e-kube-api-access-tjsbr\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.883139 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.883038 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.883139 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.883067 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.899285 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.899263 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj424\" (UniqueName: \"kubernetes.io/projected/5bf7af28-4310-4da2-a1f2-e10c3b867cc6-kube-api-access-zj424\") pod \"volume-data-source-validator-7c6cbb6c87-hzkc8\" (UID: \"5bf7af28-4310-4da2-a1f2-e10c3b867cc6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hzkc8" Apr 17 14:27:58.937761 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.937735 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-ppng7"] Apr 17 14:27:58.940676 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.940662 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ppng7" Apr 17 14:27:58.942818 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.942802 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-rsnvz\"" Apr 17 14:27:58.948495 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.948475 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-ppng7"] Apr 17 14:27:58.984199 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e8de3fc-6fef-4f2f-983d-9b545b473bf5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-297rk\" (UID: \"7e8de3fc-6fef-4f2f-983d-9b545b473bf5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" Apr 17 14:27:58.984199 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984184 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tfwqn\" (UID: \"54b6db51-e6d2-4085-892d-0926bc9c6cea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:27:58.984325 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-certificates\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:58.984325 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984232 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-trusted-ca\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " 
pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:58.984325 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-stats-auth\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.984325 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984308 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-image-registry-private-configuration\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:58.984459 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984336 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-bound-sa-token\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:58.984459 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984388 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:58.984459 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-default-certificate\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.984459 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjsbr\" (UniqueName: \"kubernetes.io/projected/40d17327-7330-4bba-8349-541f00eae77e-kube-api-access-tjsbr\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.984576 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984470 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.984620 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:58.984592 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle podName:40d17327-7330-4bba-8349-541f00eae77e nodeName:}" failed. No retries permitted until 2026-04-17 14:27:59.48457314 +0000 UTC m=+115.676172254 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle") pod "router-default-5449587f4d-66vdl" (UID: "40d17327-7330-4bba-8349-541f00eae77e") : configmap references non-existent config key: service-ca.crt Apr 17 14:27:58.984620 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.984722 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984646 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-installation-pull-secrets\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:58.984722 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0589e90a-065b-40ad-b760-71ddd18d5f83-ca-trust-extracted\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:58.984820 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:58.984716 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 14:27:58.984820 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzwsb\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-kube-api-access-xzwsb\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:58.984820 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:58.984767 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs podName:40d17327-7330-4bba-8349-541f00eae77e nodeName:}" failed. No retries permitted until 2026-04-17 14:27:59.484752164 +0000 UTC m=+115.676351261 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs") pod "router-default-5449587f4d-66vdl" (UID: "40d17327-7330-4bba-8349-541f00eae77e") : secret "router-metrics-certs-default" not found Apr 17 14:27:58.984820 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984790 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cltdq\" (UniqueName: \"kubernetes.io/projected/7e8de3fc-6fef-4f2f-983d-9b545b473bf5-kube-api-access-cltdq\") pod \"kube-storage-version-migrator-operator-6769c5d45-297rk\" (UID: \"7e8de3fc-6fef-4f2f-983d-9b545b473bf5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" Apr 17 14:27:58.984820 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7j72\" (UniqueName: \"kubernetes.io/projected/54b6db51-e6d2-4085-892d-0926bc9c6cea-kube-api-access-w7j72\") pod \"cluster-samples-operator-6dc5bdb6b4-tfwqn\" (UID: \"54b6db51-e6d2-4085-892d-0926bc9c6cea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:27:58.985041 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.984839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e8de3fc-6fef-4f2f-983d-9b545b473bf5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-297rk\" (UID: \"7e8de3fc-6fef-4f2f-983d-9b545b473bf5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" Apr 17 14:27:58.986594 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.986578 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-default-certificate\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.986720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.986704 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-stats-auth\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:58.995120 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:58.995101 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjsbr\" (UniqueName: \"kubernetes.io/projected/40d17327-7330-4bba-8349-541f00eae77e-kube-api-access-tjsbr\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:59.046014 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.045960 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hzkc8" Apr 17 14:27:59.085710 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.085679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0589e90a-065b-40ad-b760-71ddd18d5f83-ca-trust-extracted\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.085710 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.085713 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzwsb\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-kube-api-access-xzwsb\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.085954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.085733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cltdq\" (UniqueName: \"kubernetes.io/projected/7e8de3fc-6fef-4f2f-983d-9b545b473bf5-kube-api-access-cltdq\") pod \"kube-storage-version-migrator-operator-6769c5d45-297rk\" (UID: \"7e8de3fc-6fef-4f2f-983d-9b545b473bf5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" Apr 17 14:27:59.085954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.085760 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7j72\" (UniqueName: \"kubernetes.io/projected/54b6db51-e6d2-4085-892d-0926bc9c6cea-kube-api-access-w7j72\") pod \"cluster-samples-operator-6dc5bdb6b4-tfwqn\" (UID: \"54b6db51-e6d2-4085-892d-0926bc9c6cea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:27:59.085954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.085944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e8de3fc-6fef-4f2f-983d-9b545b473bf5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-297rk\" (UID: \"7e8de3fc-6fef-4f2f-983d-9b545b473bf5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" Apr 17 14:27:59.086098 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e8de3fc-6fef-4f2f-983d-9b545b473bf5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-297rk\" (UID: \"7e8de3fc-6fef-4f2f-983d-9b545b473bf5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" Apr 17 14:27:59.086098 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086082 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tfwqn\" (UID: \"54b6db51-e6d2-4085-892d-0926bc9c6cea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:27:59.086202 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086096 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0589e90a-065b-40ad-b760-71ddd18d5f83-ca-trust-extracted\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.086202 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086121 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmzlq\" (UniqueName: \"kubernetes.io/projected/8064c346-7dfa-4cd6-b958-bda83c725a8f-kube-api-access-xmzlq\") pod \"network-check-source-8894fc9bd-ppng7\" (UID: \"8064c346-7dfa-4cd6-b958-bda83c725a8f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ppng7" Apr 17 14:27:59.086202 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086177 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-certificates\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.086202 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.086183 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:27:59.086394 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-trusted-ca\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.086394 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.086247 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls podName:54b6db51-e6d2-4085-892d-0926bc9c6cea nodeName:}" failed. No retries permitted until 2026-04-17 14:27:59.586229418 +0000 UTC m=+115.777828528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tfwqn" (UID: "54b6db51-e6d2-4085-892d-0926bc9c6cea") : secret "samples-operator-tls" not found Apr 17 14:27:59.086394 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-image-registry-private-configuration\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.086394 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-bound-sa-token\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.086394 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.086643 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086421 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-installation-pull-secrets\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.086643 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086539 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e8de3fc-6fef-4f2f-983d-9b545b473bf5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-297rk\" (UID: \"7e8de3fc-6fef-4f2f-983d-9b545b473bf5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" Apr 17 14:27:59.086643 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.086632 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:27:59.086643 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.086645 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-666d95ffc5-vtfqx: secret "image-registry-tls" not found Apr 17 14:27:59.086835 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.086691 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls podName:0589e90a-065b-40ad-b760-71ddd18d5f83 nodeName:}" failed. No retries permitted until 2026-04-17 14:27:59.586673352 +0000 UTC m=+115.778272461 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls") pod "image-registry-666d95ffc5-vtfqx" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83") : secret "image-registry-tls" not found Apr 17 14:27:59.086924 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.086903 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-certificates\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.087392 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.087370 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-trusted-ca\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.088703 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.088682 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e8de3fc-6fef-4f2f-983d-9b545b473bf5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-297rk\" (UID: \"7e8de3fc-6fef-4f2f-983d-9b545b473bf5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" Apr 17 14:27:59.088799 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.088734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-installation-pull-secrets\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.089065 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.089044 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-image-registry-private-configuration\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.096063 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.096039 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-bound-sa-token\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.096143 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.096073 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cltdq\" (UniqueName: \"kubernetes.io/projected/7e8de3fc-6fef-4f2f-983d-9b545b473bf5-kube-api-access-cltdq\") pod \"kube-storage-version-migrator-operator-6769c5d45-297rk\" (UID: \"7e8de3fc-6fef-4f2f-983d-9b545b473bf5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" Apr 17 14:27:59.096280 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.096262 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xzwsb\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-kube-api-access-xzwsb\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.097165 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.097140 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7j72\" (UniqueName: \"kubernetes.io/projected/54b6db51-e6d2-4085-892d-0926bc9c6cea-kube-api-access-w7j72\") pod \"cluster-samples-operator-6dc5bdb6b4-tfwqn\" (UID: \"54b6db51-e6d2-4085-892d-0926bc9c6cea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:27:59.167574 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.167538 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" Apr 17 14:27:59.177130 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.177107 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hzkc8"] Apr 17 14:27:59.179240 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:27:59.179207 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bf7af28_4310_4da2_a1f2_e10c3b867cc6.slice/crio-e4f1f4690c5d304f0ab83beebfac132dec038c8f06fd0304fc817fe52b327cbb WatchSource:0}: Error finding container e4f1f4690c5d304f0ab83beebfac132dec038c8f06fd0304fc817fe52b327cbb: Status 404 returned error can't find the container with id e4f1f4690c5d304f0ab83beebfac132dec038c8f06fd0304fc817fe52b327cbb Apr 17 14:27:59.187627 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.187595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmzlq\" (UniqueName: \"kubernetes.io/projected/8064c346-7dfa-4cd6-b958-bda83c725a8f-kube-api-access-xmzlq\") pod \"network-check-source-8894fc9bd-ppng7\" (UID: \"8064c346-7dfa-4cd6-b958-bda83c725a8f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ppng7" Apr 17 14:27:59.194726 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.194702 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmzlq\" (UniqueName: \"kubernetes.io/projected/8064c346-7dfa-4cd6-b958-bda83c725a8f-kube-api-access-xmzlq\") pod \"network-check-source-8894fc9bd-ppng7\" (UID: \"8064c346-7dfa-4cd6-b958-bda83c725a8f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ppng7" Apr 17 14:27:59.249587 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.249555 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ppng7" Apr 17 14:27:59.281523 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.281494 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk"] Apr 17 14:27:59.284466 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:27:59.284437 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e8de3fc_6fef_4f2f_983d_9b545b473bf5.slice/crio-e1bca0ccf3796cb3c472b570d2dee882121ab785d87a1ea72e6efc965f19b067 WatchSource:0}: Error finding container e1bca0ccf3796cb3c472b570d2dee882121ab785d87a1ea72e6efc965f19b067: Status 404 returned error can't find the container with id e1bca0ccf3796cb3c472b570d2dee882121ab785d87a1ea72e6efc965f19b067 Apr 17 14:27:59.362837 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.362808 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-ppng7"] Apr 17 14:27:59.365399 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:27:59.365376 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8064c346_7dfa_4cd6_b958_bda83c725a8f.slice/crio-8daea6f57cf0efa665a3db50a78f11143100d968160f90925bc5c0c7da5e7b63 WatchSource:0}: Error finding container 8daea6f57cf0efa665a3db50a78f11143100d968160f90925bc5c0c7da5e7b63: Status 404 returned error can't find the container with id 8daea6f57cf0efa665a3db50a78f11143100d968160f90925bc5c0c7da5e7b63 Apr 17 14:27:59.490365 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.490266 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:59.490365 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.490307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:27:59.490581 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.490456 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle podName:40d17327-7330-4bba-8349-541f00eae77e nodeName:}" failed. No retries permitted until 2026-04-17 14:28:00.490432216 +0000 UTC m=+116.682031321 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle") pod "router-default-5449587f4d-66vdl" (UID: "40d17327-7330-4bba-8349-541f00eae77e") : configmap references non-existent config key: service-ca.crt Apr 17 14:27:59.490581 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.490458 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 14:27:59.490581 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.490506 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs podName:40d17327-7330-4bba-8349-541f00eae77e nodeName:}" failed. No retries permitted until 2026-04-17 14:28:00.490494201 +0000 UTC m=+116.682093299 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs") pod "router-default-5449587f4d-66vdl" (UID: "40d17327-7330-4bba-8349-541f00eae77e") : secret "router-metrics-certs-default" not found Apr 17 14:27:59.591430 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.591392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tfwqn\" (UID: \"54b6db51-e6d2-4085-892d-0926bc9c6cea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:27:59.591613 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.591469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:27:59.591613 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.591542 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:27:59.591613 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.591608 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls podName:54b6db51-e6d2-4085-892d-0926bc9c6cea nodeName:}" failed. No retries permitted until 2026-04-17 14:28:00.591592085 +0000 UTC m=+116.783191187 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tfwqn" (UID: "54b6db51-e6d2-4085-892d-0926bc9c6cea") : secret "samples-operator-tls" not found Apr 17 14:27:59.591613 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.591608 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:27:59.591802 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.591621 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-666d95ffc5-vtfqx: secret "image-registry-tls" not found Apr 17 14:27:59.591802 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:27:59.591663 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls podName:0589e90a-065b-40ad-b760-71ddd18d5f83 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:00.591652079 +0000 UTC m=+116.783251177 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls") pod "image-registry-666d95ffc5-vtfqx" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83") : secret "image-registry-tls" not found Apr 17 14:27:59.711796 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.711755 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ppng7" event={"ID":"8064c346-7dfa-4cd6-b958-bda83c725a8f","Type":"ContainerStarted","Data":"3cf57da7b6030ba819d93462f82599401724e28badc6f0fa91bc5ff4a4a4675d"} Apr 17 14:27:59.711796 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.711796 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ppng7" event={"ID":"8064c346-7dfa-4cd6-b958-bda83c725a8f","Type":"ContainerStarted","Data":"8daea6f57cf0efa665a3db50a78f11143100d968160f90925bc5c0c7da5e7b63"} Apr 17 14:27:59.712816 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.712793 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hzkc8" event={"ID":"5bf7af28-4310-4da2-a1f2-e10c3b867cc6","Type":"ContainerStarted","Data":"e4f1f4690c5d304f0ab83beebfac132dec038c8f06fd0304fc817fe52b327cbb"} Apr 17 14:27:59.713803 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.713781 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" event={"ID":"7e8de3fc-6fef-4f2f-983d-9b545b473bf5","Type":"ContainerStarted","Data":"e1bca0ccf3796cb3c472b570d2dee882121ab785d87a1ea72e6efc965f19b067"} Apr 17 14:27:59.727611 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:27:59.727561 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ppng7" podStartSLOduration=1.727545341 podStartE2EDuration="1.727545341s" podCreationTimestamp="2026-04-17 14:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:27:59.725953633 +0000 UTC m=+115.917552752" watchObservedRunningTime="2026-04-17 14:27:59.727545341 +0000 UTC m=+115.919144456" Apr 
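The MountVolume failures above follow the kubelet's doubling retry for failed mounts: the durationBeforeRetry values in this capture step from 500ms to 1s, 2s, 4s, and 8s while the referenced secrets stay missing, and a pod that has been waiting far longer (network-metrics-daemon-tr5b8, later in the log) is already backed off to 2m2s. Each error names the exact object the kubelet is waiting for, so the quickest check is to query those objects directly. A sketch, assuming a logged-in oc client and the namespaces/object names exactly as they appear in these messages (the configmap is assumed to share the service-ca-bundle volume's name):

  # Do the secrets the kubelet is retrying on exist yet?
  oc get secret router-metrics-certs-default -n openshift-ingress
  oc get secret image-registry-tls -n openshift-image-registry
  oc get secret samples-operator-tls -n openshift-cluster-samples-operator
  # service-ca-bundle resolves but lacks the service-ca.crt key; inspect its data
  oc get configmap service-ca-bundle -n openshift-ingress -o yaml

Once the service-ca and operator pods publish these objects, the mounts succeed on the next retry, which is what the later "MountVolume.SetUp succeeded" lines at 14:28:14 show.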
Apr 17 14:28:00.500578 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:00.499745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl"
Apr 17 14:28:00.500578 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:00.499799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl"
Apr 17 14:28:00.500578 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:00.499943 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 14:28:00.500578 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:00.500190 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle podName:40d17327-7330-4bba-8349-541f00eae77e nodeName:}" failed. No retries permitted until 2026-04-17 14:28:02.500168424 +0000 UTC m=+118.691767539 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle") pod "router-default-5449587f4d-66vdl" (UID: "40d17327-7330-4bba-8349-541f00eae77e") : configmap references non-existent config key: service-ca.crt
Apr 17 14:28:00.501299 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:00.501269 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs podName:40d17327-7330-4bba-8349-541f00eae77e nodeName:}" failed. No retries permitted until 2026-04-17 14:28:02.501240807 +0000 UTC m=+118.692839910 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs") pod "router-default-5449587f4d-66vdl" (UID: "40d17327-7330-4bba-8349-541f00eae77e") : secret "router-metrics-certs-default" not found Apr 17 14:28:00.600214 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:00.600185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:28:00.600337 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:00.600304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tfwqn\" (UID: \"54b6db51-e6d2-4085-892d-0926bc9c6cea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:28:00.600415 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:00.600368 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:28:00.600415 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:00.600398 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-666d95ffc5-vtfqx: secret "image-registry-tls" not found Apr 17 14:28:00.600516 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:00.600431 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:28:00.600516 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:00.600477 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls podName:0589e90a-065b-40ad-b760-71ddd18d5f83 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:02.60045398 +0000 UTC m=+118.792053088 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls") pod "image-registry-666d95ffc5-vtfqx" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83") : secret "image-registry-tls" not found Apr 17 14:28:00.600516 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:00.600498 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls podName:54b6db51-e6d2-4085-892d-0926bc9c6cea nodeName:}" failed. No retries permitted until 2026-04-17 14:28:02.600488478 +0000 UTC m=+118.792087591 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tfwqn" (UID: "54b6db51-e6d2-4085-892d-0926bc9c6cea") : secret "samples-operator-tls" not found Apr 17 14:28:00.717545 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:00.717500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hzkc8" event={"ID":"5bf7af28-4310-4da2-a1f2-e10c3b867cc6","Type":"ContainerStarted","Data":"46afb80b829fc36c8e20ca2778d53421ca60a432cd45e4a923118f702b19b647"} Apr 17 14:28:00.734075 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:00.734021 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-hzkc8" podStartSLOduration=1.355519144 podStartE2EDuration="2.73400279s" podCreationTimestamp="2026-04-17 14:27:58 +0000 UTC" firstStartedPulling="2026-04-17 14:27:59.180853382 +0000 UTC m=+115.372452479" lastFinishedPulling="2026-04-17 14:28:00.559337024 +0000 UTC m=+116.750936125" observedRunningTime="2026-04-17 14:28:00.732303616 +0000 UTC m=+116.923902738" watchObservedRunningTime="2026-04-17 14:28:00.73400279 +0000 UTC m=+116.925601912" Apr 17 14:28:01.720420 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:01.720386 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" event={"ID":"7e8de3fc-6fef-4f2f-983d-9b545b473bf5","Type":"ContainerStarted","Data":"544af9d819921936defa1c0443de27f6ecc666b59066aaa9c52d6719e4bb14ab"} Apr 17 14:28:01.735366 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:01.735310 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" podStartSLOduration=1.889104551 podStartE2EDuration="3.735296554s" podCreationTimestamp="2026-04-17 14:27:58 +0000 UTC" firstStartedPulling="2026-04-17 14:27:59.286413941 +0000 UTC m=+115.478013052" lastFinishedPulling="2026-04-17 14:28:01.13260594 +0000 UTC m=+117.324205055" observedRunningTime="2026-04-17 14:28:01.733858963 +0000 UTC m=+117.925458083" watchObservedRunningTime="2026-04-17 14:28:01.735296554 +0000 UTC m=+117.926895673" Apr 17 14:28:02.518381 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:02.518328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:02.518381 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:02.518394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:02.518606 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:02.518496 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle 
podName:40d17327-7330-4bba-8349-541f00eae77e nodeName:}" failed. No retries permitted until 2026-04-17 14:28:06.518477001 +0000 UTC m=+122.710076119 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle") pod "router-default-5449587f4d-66vdl" (UID: "40d17327-7330-4bba-8349-541f00eae77e") : configmap references non-existent config key: service-ca.crt Apr 17 14:28:02.518606 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:02.518532 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 14:28:02.518606 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:02.518588 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs podName:40d17327-7330-4bba-8349-541f00eae77e nodeName:}" failed. No retries permitted until 2026-04-17 14:28:06.518572409 +0000 UTC m=+122.710171514 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs") pod "router-default-5449587f4d-66vdl" (UID: "40d17327-7330-4bba-8349-541f00eae77e") : secret "router-metrics-certs-default" not found Apr 17 14:28:02.619050 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:02.618998 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tfwqn\" (UID: \"54b6db51-e6d2-4085-892d-0926bc9c6cea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:28:02.619248 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:02.619078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:28:02.619248 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:02.619096 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:28:02.619248 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:02.619174 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls podName:54b6db51-e6d2-4085-892d-0926bc9c6cea nodeName:}" failed. No retries permitted until 2026-04-17 14:28:06.619154742 +0000 UTC m=+122.810753841 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tfwqn" (UID: "54b6db51-e6d2-4085-892d-0926bc9c6cea") : secret "samples-operator-tls" not found Apr 17 14:28:02.619248 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:02.619198 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:28:02.619248 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:02.619212 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-666d95ffc5-vtfqx: secret "image-registry-tls" not found Apr 17 14:28:02.619488 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:02.619260 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls podName:0589e90a-065b-40ad-b760-71ddd18d5f83 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:06.619246085 +0000 UTC m=+122.810845213 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls") pod "image-registry-666d95ffc5-vtfqx" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83") : secret "image-registry-tls" not found Apr 17 14:28:06.552294 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:06.552255 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:06.552294 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:06.552294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:06.552706 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:06.552391 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 14:28:06.552706 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:06.552437 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle podName:40d17327-7330-4bba-8349-541f00eae77e nodeName:}" failed. No retries permitted until 2026-04-17 14:28:14.552417619 +0000 UTC m=+130.744016728 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle") pod "router-default-5449587f4d-66vdl" (UID: "40d17327-7330-4bba-8349-541f00eae77e") : configmap references non-existent config key: service-ca.crt Apr 17 14:28:06.552706 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:06.552461 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs podName:40d17327-7330-4bba-8349-541f00eae77e nodeName:}" failed. 
No retries permitted until 2026-04-17 14:28:14.552454047 +0000 UTC m=+130.744053145 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs") pod "router-default-5449587f4d-66vdl" (UID: "40d17327-7330-4bba-8349-541f00eae77e") : secret "router-metrics-certs-default" not found Apr 17 14:28:06.652635 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:06.652601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tfwqn\" (UID: \"54b6db51-e6d2-4085-892d-0926bc9c6cea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:28:06.652799 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:06.652657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:28:06.652799 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:06.652770 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 14:28:06.652926 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:06.652834 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:28:06.652926 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:06.652845 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-666d95ffc5-vtfqx: secret "image-registry-tls" not found Apr 17 14:28:06.652926 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:06.652847 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls podName:54b6db51-e6d2-4085-892d-0926bc9c6cea nodeName:}" failed. No retries permitted until 2026-04-17 14:28:14.652825671 +0000 UTC m=+130.844424770 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-tfwqn" (UID: "54b6db51-e6d2-4085-892d-0926bc9c6cea") : secret "samples-operator-tls" not found Apr 17 14:28:06.652926 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:06.652882 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls podName:0589e90a-065b-40ad-b760-71ddd18d5f83 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:14.652869349 +0000 UTC m=+130.844468447 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls") pod "image-registry-666d95ffc5-vtfqx" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83") : secret "image-registry-tls" not found Apr 17 14:28:06.863453 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:06.863427 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l87rd_4fe5daab-b27c-4fc1-88c0-72d89cdccb01/dns-node-resolver/0.log" Apr 17 14:28:07.663801 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:07.663772 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h8bbt_6cb28525-d291-4976-8923-d29d0747a7dd/node-ca/0.log" Apr 17 14:28:09.466179 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:09.466148 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-297rk_7e8de3fc-6fef-4f2f-983d-9b545b473bf5/kube-storage-version-migrator-operator/0.log" Apr 17 14:28:14.216572 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.216541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:28:14.217009 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:14.216679 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 14:28:14.217009 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:14.216740 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs podName:85225238-59be-4245-9db5-4b8fa923bf28 nodeName:}" failed. No retries permitted until 2026-04-17 14:30:16.21672465 +0000 UTC m=+252.408323748 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs") pod "network-metrics-daemon-tr5b8" (UID: "85225238-59be-4245-9db5-4b8fa923bf28") : secret "metrics-daemon-secret" not found Apr 17 14:28:14.619217 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.619189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:14.619217 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.619222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:14.619785 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.619762 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40d17327-7330-4bba-8349-541f00eae77e-service-ca-bundle\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:14.621629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.621606 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40d17327-7330-4bba-8349-541f00eae77e-metrics-certs\") pod \"router-default-5449587f4d-66vdl\" (UID: \"40d17327-7330-4bba-8349-541f00eae77e\") " pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:14.654760 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.654738 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-v4zl2\"" Apr 17 14:28:14.662655 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.662641 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:14.719690 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.719655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:28:14.719842 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.719785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tfwqn\" (UID: \"54b6db51-e6d2-4085-892d-0926bc9c6cea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:28:14.722804 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.722727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/54b6db51-e6d2-4085-892d-0926bc9c6cea-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-tfwqn\" (UID: \"54b6db51-e6d2-4085-892d-0926bc9c6cea\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:28:14.722804 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.722729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls\") pod \"image-registry-666d95ffc5-vtfqx\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:28:14.763056 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.763032 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-g8nn4\"" Apr 17 14:28:14.770993 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.770951 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" Apr 17 14:28:14.775674 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.775657 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gbrc2\"" Apr 17 14:28:14.784066 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.784042 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:28:14.784741 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.784640 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5449587f4d-66vdl"] Apr 17 14:28:14.788897 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:28:14.788872 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40d17327_7330_4bba_8349_541f00eae77e.slice/crio-f5184653f0693f970e1635a8f6d33a79a7827920943de517b9df34822c694f0e WatchSource:0}: Error finding container f5184653f0693f970e1635a8f6d33a79a7827920943de517b9df34822c694f0e: Status 404 returned error can't find the container with id f5184653f0693f970e1635a8f6d33a79a7827920943de517b9df34822c694f0e Apr 17 14:28:14.901395 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.901251 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn"] Apr 17 14:28:14.920853 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:14.920820 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-666d95ffc5-vtfqx"] Apr 17 14:28:14.923298 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:28:14.923268 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0589e90a_065b_40ad_b760_71ddd18d5f83.slice/crio-457236770e1c25de78eebb4297c1e51ae372f2ce4a0b7af2326e9ff48285b33e WatchSource:0}: Error finding container 457236770e1c25de78eebb4297c1e51ae372f2ce4a0b7af2326e9ff48285b33e: Status 404 returned error can't find the container with id 457236770e1c25de78eebb4297c1e51ae372f2ce4a0b7af2326e9ff48285b33e Apr 17 14:28:15.749185 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:15.749133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5449587f4d-66vdl" event={"ID":"40d17327-7330-4bba-8349-541f00eae77e","Type":"ContainerStarted","Data":"6c9c279d7587863e7d9f1201a0d165d4044b892bdffb0852ffd05182dd3a2e31"} Apr 17 14:28:15.749185 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:15.749190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5449587f4d-66vdl" event={"ID":"40d17327-7330-4bba-8349-541f00eae77e","Type":"ContainerStarted","Data":"f5184653f0693f970e1635a8f6d33a79a7827920943de517b9df34822c694f0e"} Apr 17 14:28:15.750517 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:15.750488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" event={"ID":"0589e90a-065b-40ad-b760-71ddd18d5f83","Type":"ContainerStarted","Data":"116e34a911231f3fe88af22c42cef216a06c0cd9137ff6e4cadf2e69d8546f0e"} Apr 17 14:28:15.750646 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:15.750523 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" event={"ID":"0589e90a-065b-40ad-b760-71ddd18d5f83","Type":"ContainerStarted","Data":"457236770e1c25de78eebb4297c1e51ae372f2ce4a0b7af2326e9ff48285b33e"} Apr 17 14:28:15.750646 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:15.750614 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:28:15.751548 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:15.751524 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" event={"ID":"54b6db51-e6d2-4085-892d-0926bc9c6cea","Type":"ContainerStarted","Data":"2027b365aef7ff0404a34295c65be5c5a65fce3043b8ae7d3ec005bd1a503745"} Apr 17 14:28:15.766671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:15.766632 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5449587f4d-66vdl" podStartSLOduration=17.766622464 podStartE2EDuration="17.766622464s" podCreationTimestamp="2026-04-17 14:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:28:15.766072637 +0000 UTC m=+131.957671762" watchObservedRunningTime="2026-04-17 14:28:15.766622464 +0000 UTC m=+131.958221624" Apr 17 14:28:15.783902 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:15.783867 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" podStartSLOduration=17.783856672 podStartE2EDuration="17.783856672s" podCreationTimestamp="2026-04-17 14:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:28:15.783071695 +0000 UTC m=+131.974670816" watchObservedRunningTime="2026-04-17 14:28:15.783856672 +0000 UTC m=+131.975455769" Apr 17 14:28:16.663202 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:16.663168 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:16.666150 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:16.666128 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:16.755087 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:16.755057 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:16.756409 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:16.756378 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5449587f4d-66vdl" Apr 17 14:28:17.758469 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:17.758429 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" event={"ID":"54b6db51-e6d2-4085-892d-0926bc9c6cea","Type":"ContainerStarted","Data":"4120a230c2242f43352f78f0c6b87e061b812e78d1084a71699945a860f4969a"} Apr 17 14:28:17.758469 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:17.758473 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" event={"ID":"54b6db51-e6d2-4085-892d-0926bc9c6cea","Type":"ContainerStarted","Data":"fb5f08a6a4bd430c6bb6410012eceb4a37f4449537889dfba29f5e146fbaed3f"} Apr 17 14:28:17.774330 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:17.774280 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-tfwqn" podStartSLOduration=17.854946034 podStartE2EDuration="19.774265506s" podCreationTimestamp="2026-04-17 14:27:58 +0000 UTC" firstStartedPulling="2026-04-17 14:28:14.9381474 +0000 UTC m=+131.129746501" lastFinishedPulling="2026-04-17 14:28:16.857466872 
+0000 UTC m=+133.049065973" observedRunningTime="2026-04-17 14:28:17.773211852 +0000 UTC m=+133.964810975" watchObservedRunningTime="2026-04-17 14:28:17.774265506 +0000 UTC m=+133.965864626" Apr 17 14:28:27.557270 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.557229 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-r9rw5"] Apr 17 14:28:27.561317 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.561300 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.563909 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.563876 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 14:28:27.563909 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.563889 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 14:28:27.564074 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.563908 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 14:28:27.564894 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.564881 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wp7x7\"" Apr 17 14:28:27.565005 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.564967 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 14:28:27.574594 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.574568 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-r9rw5"] Apr 17 14:28:27.624934 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.624908 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-666d95ffc5-vtfqx"] Apr 17 14:28:27.661893 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.661862 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-d97df5bf7-xlprx"] Apr 17 14:28:27.666056 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.666040 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.674990 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.674961 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d97df5bf7-xlprx"] Apr 17 14:28:27.716341 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.716313 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-data-volume\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.716482 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.716374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtb5w\" (UniqueName: \"kubernetes.io/projected/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-kube-api-access-dtb5w\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.716538 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.716492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.716580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.716536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.716580 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.716559 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-crio-socket\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.817365 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-crio-socket\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.817365 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817308 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bff4a2a5-e4e7-440f-8284-b551ea59635d-registry-tls\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.817365 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817344 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bff4a2a5-e4e7-440f-8284-b551ea59635d-registry-certificates\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.817365 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-data-volume\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.817671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44r7v\" (UniqueName: \"kubernetes.io/projected/bff4a2a5-e4e7-440f-8284-b551ea59635d-kube-api-access-44r7v\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.817671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bff4a2a5-e4e7-440f-8284-b551ea59635d-installation-pull-secrets\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.817671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817415 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-crio-socket\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.817671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817464 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtb5w\" (UniqueName: \"kubernetes.io/projected/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-kube-api-access-dtb5w\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.817671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817491 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bff4a2a5-e4e7-440f-8284-b551ea59635d-image-registry-private-configuration\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.817671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bff4a2a5-e4e7-440f-8284-b551ea59635d-trusted-ca\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.817671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817538 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bff4a2a5-e4e7-440f-8284-b551ea59635d-ca-trust-extracted\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.817671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817558 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.817671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.817671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817640 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bff4a2a5-e4e7-440f-8284-b551ea59635d-bound-sa-token\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.818048 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.817751 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-data-volume\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.818082 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.818063 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.819769 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.819753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.826889 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.826870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtb5w\" (UniqueName: \"kubernetes.io/projected/7ba19e6f-9ed4-48ef-b5e5-6417d0306a13-kube-api-access-dtb5w\") pod \"insights-runtime-extractor-r9rw5\" (UID: \"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13\") " pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.871038 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.870998 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-r9rw5" Apr 17 14:28:27.918893 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.918854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bff4a2a5-e4e7-440f-8284-b551ea59635d-ca-trust-extracted\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.919039 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.918901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bff4a2a5-e4e7-440f-8284-b551ea59635d-bound-sa-token\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.919039 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.918931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bff4a2a5-e4e7-440f-8284-b551ea59635d-registry-tls\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.919039 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.918970 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bff4a2a5-e4e7-440f-8284-b551ea59635d-registry-certificates\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.919039 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.919032 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44r7v\" (UniqueName: \"kubernetes.io/projected/bff4a2a5-e4e7-440f-8284-b551ea59635d-kube-api-access-44r7v\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.919282 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.919078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bff4a2a5-e4e7-440f-8284-b551ea59635d-installation-pull-secrets\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.919282 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.919122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bff4a2a5-e4e7-440f-8284-b551ea59635d-image-registry-private-configuration\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.919282 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.919156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bff4a2a5-e4e7-440f-8284-b551ea59635d-trusted-ca\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " 
pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.919443 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.919365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bff4a2a5-e4e7-440f-8284-b551ea59635d-ca-trust-extracted\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.920003 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.919965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bff4a2a5-e4e7-440f-8284-b551ea59635d-trusted-ca\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.920398 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.920279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bff4a2a5-e4e7-440f-8284-b551ea59635d-registry-certificates\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.922782 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.922735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bff4a2a5-e4e7-440f-8284-b551ea59635d-registry-tls\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.923009 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.922963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bff4a2a5-e4e7-440f-8284-b551ea59635d-installation-pull-secrets\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.923091 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.923063 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bff4a2a5-e4e7-440f-8284-b551ea59635d-image-registry-private-configuration\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.926877 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.926857 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bff4a2a5-e4e7-440f-8284-b551ea59635d-bound-sa-token\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.927628 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.927597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44r7v\" (UniqueName: \"kubernetes.io/projected/bff4a2a5-e4e7-440f-8284-b551ea59635d-kube-api-access-44r7v\") pod \"image-registry-d97df5bf7-xlprx\" (UID: \"bff4a2a5-e4e7-440f-8284-b551ea59635d\") " pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.974309 ip-10-0-129-199 
kubenswrapper[2572]: I0417 14:28:27.974280 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:27.989612 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:27.989585 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-r9rw5"] Apr 17 14:28:27.994027 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:28:27.993993 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ba19e6f_9ed4_48ef_b5e5_6417d0306a13.slice/crio-26673bb8604d85ebc20f85fe5f4998973a739d7ac3e291db401a1a6d20866e18 WatchSource:0}: Error finding container 26673bb8604d85ebc20f85fe5f4998973a739d7ac3e291db401a1a6d20866e18: Status 404 returned error can't find the container with id 26673bb8604d85ebc20f85fe5f4998973a739d7ac3e291db401a1a6d20866e18 Apr 17 14:28:28.118204 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:28.118104 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d97df5bf7-xlprx"] Apr 17 14:28:28.120443 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:28:28.120416 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff4a2a5_e4e7_440f_8284_b551ea59635d.slice/crio-8879266b2fb8160a9437d2a7c784ebf7a159f4519d1c76484ac48d5867c50e7c WatchSource:0}: Error finding container 8879266b2fb8160a9437d2a7c784ebf7a159f4519d1c76484ac48d5867c50e7c: Status 404 returned error can't find the container with id 8879266b2fb8160a9437d2a7c784ebf7a159f4519d1c76484ac48d5867c50e7c Apr 17 14:28:28.790138 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:28.790110 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" event={"ID":"bff4a2a5-e4e7-440f-8284-b551ea59635d","Type":"ContainerStarted","Data":"079b7978723d2446da9821a7e72bef2237effe5763a825a82da8adb448260112"} Apr 17 14:28:28.790449 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:28.790148 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" event={"ID":"bff4a2a5-e4e7-440f-8284-b551ea59635d","Type":"ContainerStarted","Data":"8879266b2fb8160a9437d2a7c784ebf7a159f4519d1c76484ac48d5867c50e7c"} Apr 17 14:28:28.790449 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:28.790167 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:28.791323 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:28.791304 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r9rw5" event={"ID":"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13","Type":"ContainerStarted","Data":"c44d0dfe6fe112981b1a68393245548a8e70a5252ae9bbd4ff2919fbd805f04b"} Apr 17 14:28:28.791380 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:28.791329 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r9rw5" event={"ID":"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13","Type":"ContainerStarted","Data":"26673bb8604d85ebc20f85fe5f4998973a739d7ac3e291db401a1a6d20866e18"} Apr 17 14:28:28.808449 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:28.808402 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" 
podStartSLOduration=1.80838604 podStartE2EDuration="1.80838604s" podCreationTimestamp="2026-04-17 14:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:28:28.807288095 +0000 UTC m=+144.998887216" watchObservedRunningTime="2026-04-17 14:28:28.80838604 +0000 UTC m=+144.999985142" Apr 17 14:28:29.795349 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:29.795311 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r9rw5" event={"ID":"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13","Type":"ContainerStarted","Data":"bdcd27b8cd6f934f18cd0cb6f8319af3b06d3129e953c3a66c4dbb9e0497d451"} Apr 17 14:28:30.799103 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:30.799071 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r9rw5" event={"ID":"7ba19e6f-9ed4-48ef-b5e5-6417d0306a13","Type":"ContainerStarted","Data":"caf076acca80c0289bbd96656fbf1d4f87e3a34e4392ace7adc1cc78b09af1b4"} Apr 17 14:28:30.815562 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:30.815517 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-r9rw5" podStartSLOduration=1.648009816 podStartE2EDuration="3.815503374s" podCreationTimestamp="2026-04-17 14:28:27 +0000 UTC" firstStartedPulling="2026-04-17 14:28:28.080542459 +0000 UTC m=+144.272141557" lastFinishedPulling="2026-04-17 14:28:30.248036004 +0000 UTC m=+146.439635115" observedRunningTime="2026-04-17 14:28:30.814399615 +0000 UTC m=+147.005998747" watchObservedRunningTime="2026-04-17 14:28:30.815503374 +0000 UTC m=+147.007102494" Apr 17 14:28:37.632043 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:37.632007 2572 patch_prober.go:28] interesting pod/image-registry-666d95ffc5-vtfqx container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 14:28:37.632439 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:37.632070 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" podUID="0589e90a-065b-40ad-b760-71ddd18d5f83" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 14:28:39.902411 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.902381 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-htmfm"] Apr 17 14:28:39.907110 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.907089 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qm6nf"] Apr 17 14:28:39.907248 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.907230 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:39.910516 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.910494 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:39.911225 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.911194 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 14:28:39.911225 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.911208 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 14:28:39.911388 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.911208 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 14:28:39.911388 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.911255 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-w4wvp\"" Apr 17 14:28:39.911388 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.911270 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 14:28:39.911388 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.911205 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 14:28:39.911702 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.911686 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 14:28:39.913677 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.913656 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 14:28:39.913677 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.913671 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 14:28:39.913928 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.913910 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2m46t\"" Apr 17 14:28:39.914056 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.913929 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 14:28:39.923195 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:39.923177 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-htmfm"] Apr 17 14:28:40.014508 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014482 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6dad2fc-6268-43d6-8662-8a6c266e601f-metrics-client-ca\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.014668 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014514 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.014668 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014533 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-wtmp\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.014668 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014555 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f6dad2fc-6268-43d6-8662-8a6c266e601f-sys\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.014668 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.014668 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014626 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6mpr\" (UniqueName: \"kubernetes.io/projected/f6dad2fc-6268-43d6-8662-8a6c266e601f-kube-api-access-x6mpr\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.014827 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-accelerators-collector-config\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.014827 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014749 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.014827 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014770 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.014827 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014793 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/d9c2287a-52d3-4ad1-91d5-cb34daf99223-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.014827 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-textfile\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.015000 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f6dad2fc-6268-43d6-8662-8a6c266e601f-root\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.015000 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-tls\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.015000 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9c2287a-52d3-4ad1-91d5-cb34daf99223-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.015000 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.014915 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhp4h\" (UniqueName: \"kubernetes.io/projected/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-api-access-vhp4h\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.116296 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116267 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9c2287a-52d3-4ad1-91d5-cb34daf99223-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.116296 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhp4h\" (UniqueName: \"kubernetes.io/projected/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-api-access-vhp4h\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.116480 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f6dad2fc-6268-43d6-8662-8a6c266e601f-metrics-client-ca\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.116480 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.116480 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:40.116440 2572 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 14:28:40.116600 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:40.116505 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-tls podName:d9c2287a-52d3-4ad1-91d5-cb34daf99223 nodeName:}" failed. No retries permitted until 2026-04-17 14:28:40.616487053 +0000 UTC m=+156.808086155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-htmfm" (UID: "d9c2287a-52d3-4ad1-91d5-cb34daf99223") : secret "kube-state-metrics-tls" not found Apr 17 14:28:40.116600 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-wtmp\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.116600 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f6dad2fc-6268-43d6-8662-8a6c266e601f-sys\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.116600 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.116847 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6mpr\" (UniqueName: \"kubernetes.io/projected/f6dad2fc-6268-43d6-8662-8a6c266e601f-kube-api-access-x6mpr\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.116847 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116661 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f6dad2fc-6268-43d6-8662-8a6c266e601f-sys\") pod \"node-exporter-qm6nf\" (UID: 
\"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.116847 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116673 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-accelerators-collector-config\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.116847 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116713 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-wtmp\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.116847 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.116847 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.116847 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d9c2287a-52d3-4ad1-91d5-cb34daf99223-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.117229 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-textfile\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.117229 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f6dad2fc-6268-43d6-8662-8a6c266e601f-root\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.117229 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116931 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6dad2fc-6268-43d6-8662-8a6c266e601f-metrics-client-ca\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.117229 
ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.116940 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-tls\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.117229 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.117063 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9c2287a-52d3-4ad1-91d5-cb34daf99223-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.117229 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.117133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f6dad2fc-6268-43d6-8662-8a6c266e601f-root\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.117229 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.117183 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-accelerators-collector-config\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.117494 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.117277 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-textfile\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.117494 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.117295 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.117494 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.117306 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d9c2287a-52d3-4ad1-91d5-cb34daf99223-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.119182 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.119154 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.119276 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.119219 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f6dad2fc-6268-43d6-8662-8a6c266e601f-node-exporter-tls\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.119342 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.119325 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.127771 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.127730 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhp4h\" (UniqueName: \"kubernetes.io/projected/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-api-access-vhp4h\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.128652 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.128629 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6mpr\" (UniqueName: \"kubernetes.io/projected/f6dad2fc-6268-43d6-8662-8a6c266e601f-kube-api-access-x6mpr\") pod \"node-exporter-qm6nf\" (UID: \"f6dad2fc-6268-43d6-8662-8a6c266e601f\") " pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.224303 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.224220 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qm6nf" Apr 17 14:28:40.233769 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:28:40.233742 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6dad2fc_6268_43d6_8662_8a6c266e601f.slice/crio-3617d2ebe8009bb7433e75006e94c7379c2541f1913e2d5e459b7a1879a9fbe3 WatchSource:0}: Error finding container 3617d2ebe8009bb7433e75006e94c7379c2541f1913e2d5e459b7a1879a9fbe3: Status 404 returned error can't find the container with id 3617d2ebe8009bb7433e75006e94c7379c2541f1913e2d5e459b7a1879a9fbe3 Apr 17 14:28:40.620345 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.620314 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.622455 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.622436 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9c2287a-52d3-4ad1-91d5-cb34daf99223-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-htmfm\" (UID: \"d9c2287a-52d3-4ad1-91d5-cb34daf99223\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.741287 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:40.741240 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-qcjlc" 
podUID="a7611fae-10f2-4c93-a3ad-2f23c7e94788" Apr 17 14:28:40.759350 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:40.759313 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cwbck" podUID="57f5fb50-c788-4c19-a249-06bd9abaa213" Apr 17 14:28:40.817728 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.817693 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" Apr 17 14:28:40.825439 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.825414 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qm6nf" event={"ID":"f6dad2fc-6268-43d6-8662-8a6c266e601f","Type":"ContainerStarted","Data":"3617d2ebe8009bb7433e75006e94c7379c2541f1913e2d5e459b7a1879a9fbe3"} Apr 17 14:28:40.825564 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.825426 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qcjlc" Apr 17 14:28:40.942226 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:40.942191 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-htmfm"] Apr 17 14:28:40.994069 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:28:40.994040 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c2287a_52d3_4ad1_91d5_cb34daf99223.slice/crio-4326f371b132051769ce97c146a8d3fd3a47dc9d52611fcaa4e084dde0e6904a WatchSource:0}: Error finding container 4326f371b132051769ce97c146a8d3fd3a47dc9d52611fcaa4e084dde0e6904a: Status 404 returned error can't find the container with id 4326f371b132051769ce97c146a8d3fd3a47dc9d52611fcaa4e084dde0e6904a Apr 17 14:28:41.829695 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:41.829661 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" event={"ID":"d9c2287a-52d3-4ad1-91d5-cb34daf99223","Type":"ContainerStarted","Data":"4326f371b132051769ce97c146a8d3fd3a47dc9d52611fcaa4e084dde0e6904a"} Apr 17 14:28:41.830875 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:41.830851 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6dad2fc-6268-43d6-8662-8a6c266e601f" containerID="04a321e1e2cd45ef1e66275b621ab7f34d22037d1164fd9c712b8fe80ab10bfa" exitCode=0 Apr 17 14:28:41.830994 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:41.830886 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qm6nf" event={"ID":"f6dad2fc-6268-43d6-8662-8a6c266e601f","Type":"ContainerDied","Data":"04a321e1e2cd45ef1e66275b621ab7f34d22037d1164fd9c712b8fe80ab10bfa"} Apr 17 14:28:42.417955 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:28:42.417918 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-tr5b8" podUID="85225238-59be-4245-9db5-4b8fa923bf28" Apr 17 14:28:42.835826 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:42.835793 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qm6nf" 
event={"ID":"f6dad2fc-6268-43d6-8662-8a6c266e601f","Type":"ContainerStarted","Data":"9b2e94e94cb7896494530e91df0eb7ee18cf7cee061f2fd0616f33fe24e6f93b"} Apr 17 14:28:42.835826 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:42.835829 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qm6nf" event={"ID":"f6dad2fc-6268-43d6-8662-8a6c266e601f","Type":"ContainerStarted","Data":"0aa183925a07eb4cee273095ff74674d5883d296cbac74d0076bb36fce701775"} Apr 17 14:28:42.837742 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:42.837717 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" event={"ID":"d9c2287a-52d3-4ad1-91d5-cb34daf99223","Type":"ContainerStarted","Data":"286a60cbe0755bf6c407d9c404c18efe983abf9429fa505d8d76a4747dc65e8c"} Apr 17 14:28:42.837860 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:42.837748 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" event={"ID":"d9c2287a-52d3-4ad1-91d5-cb34daf99223","Type":"ContainerStarted","Data":"c3f352a25041a34b50a3b413cfe05085bcc2ffdf8e27457b8bc91be4481e852e"} Apr 17 14:28:42.837860 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:42.837757 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" event={"ID":"d9c2287a-52d3-4ad1-91d5-cb34daf99223","Type":"ContainerStarted","Data":"abef10d3a79bc284eee9589ca93dc20c6358ba9e7d449d4d13aedc666f78c1bb"} Apr 17 14:28:42.854803 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:42.854755 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qm6nf" podStartSLOduration=3.05297872 podStartE2EDuration="3.854742044s" podCreationTimestamp="2026-04-17 14:28:39 +0000 UTC" firstStartedPulling="2026-04-17 14:28:40.235342821 +0000 UTC m=+156.426941918" lastFinishedPulling="2026-04-17 14:28:41.037106144 +0000 UTC m=+157.228705242" observedRunningTime="2026-04-17 14:28:42.85327018 +0000 UTC m=+159.044869300" watchObservedRunningTime="2026-04-17 14:28:42.854742044 +0000 UTC m=+159.046341164" Apr 17 14:28:42.872242 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:42.872186 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-htmfm" podStartSLOduration=2.691338076 podStartE2EDuration="3.872169602s" podCreationTimestamp="2026-04-17 14:28:39 +0000 UTC" firstStartedPulling="2026-04-17 14:28:40.995879354 +0000 UTC m=+157.187478452" lastFinishedPulling="2026-04-17 14:28:42.17671088 +0000 UTC m=+158.368309978" observedRunningTime="2026-04-17 14:28:42.871438455 +0000 UTC m=+159.063037610" watchObservedRunningTime="2026-04-17 14:28:42.872169602 +0000 UTC m=+159.063768722" Apr 17 14:28:45.660014 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:45.659959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls\") pod \"dns-default-qcjlc\" (UID: \"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:28:45.662232 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:45.662211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7611fae-10f2-4c93-a3ad-2f23c7e94788-metrics-tls\") pod \"dns-default-qcjlc\" (UID: 
\"a7611fae-10f2-4c93-a3ad-2f23c7e94788\") " pod="openshift-dns/dns-default-qcjlc" Apr 17 14:28:45.760676 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:45.760641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:28:45.763027 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:45.763004 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f5fb50-c788-4c19-a249-06bd9abaa213-cert\") pod \"ingress-canary-cwbck\" (UID: \"57f5fb50-c788-4c19-a249-06bd9abaa213\") " pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:28:45.928806 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:45.928728 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p2qzh\"" Apr 17 14:28:45.937591 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:45.937575 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qcjlc" Apr 17 14:28:46.054090 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.054060 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qcjlc"] Apr 17 14:28:46.057461 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:28:46.057436 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7611fae_10f2_4c93_a3ad_2f23c7e94788.slice/crio-22908983a5752276801eb081158fe77cf63861e0f623bbf8a8b616a8cfff1d58 WatchSource:0}: Error finding container 22908983a5752276801eb081158fe77cf63861e0f623bbf8a8b616a8cfff1d58: Status 404 returned error can't find the container with id 22908983a5752276801eb081158fe77cf63861e0f623bbf8a8b616a8cfff1d58 Apr 17 14:28:46.480478 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.480444 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-2zg74"] Apr 17 14:28:46.483588 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.483566 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-2zg74" Apr 17 14:28:46.485938 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.485913 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-hzzql\"" Apr 17 14:28:46.486086 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.486067 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 14:28:46.486159 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.486084 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 14:28:46.493109 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.493089 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-2zg74"] Apr 17 14:28:46.567252 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.567210 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpfbc\" (UniqueName: \"kubernetes.io/projected/ed4f570a-52ab-4296-95f4-b904558a1925-kube-api-access-zpfbc\") pod \"downloads-6bcc868b7-2zg74\" (UID: \"ed4f570a-52ab-4296-95f4-b904558a1925\") " pod="openshift-console/downloads-6bcc868b7-2zg74" Apr 17 14:28:46.667929 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.667880 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpfbc\" (UniqueName: \"kubernetes.io/projected/ed4f570a-52ab-4296-95f4-b904558a1925-kube-api-access-zpfbc\") pod \"downloads-6bcc868b7-2zg74\" (UID: \"ed4f570a-52ab-4296-95f4-b904558a1925\") " pod="openshift-console/downloads-6bcc868b7-2zg74" Apr 17 14:28:46.677403 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.677376 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpfbc\" (UniqueName: \"kubernetes.io/projected/ed4f570a-52ab-4296-95f4-b904558a1925-kube-api-access-zpfbc\") pod \"downloads-6bcc868b7-2zg74\" (UID: \"ed4f570a-52ab-4296-95f4-b904558a1925\") " pod="openshift-console/downloads-6bcc868b7-2zg74" Apr 17 14:28:46.795225 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.795153 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-2zg74" Apr 17 14:28:46.850079 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:46.850035 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qcjlc" event={"ID":"a7611fae-10f2-4c93-a3ad-2f23c7e94788","Type":"ContainerStarted","Data":"22908983a5752276801eb081158fe77cf63861e0f623bbf8a8b616a8cfff1d58"} Apr 17 14:28:47.275740 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:47.275633 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-2zg74"] Apr 17 14:28:47.278750 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:28:47.278724 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded4f570a_52ab_4296_95f4_b904558a1925.slice/crio-47bbad4c199faebd4ddec0c1cc2b2599e4c5ebf8e5723ae3294139be5bc4d8bd WatchSource:0}: Error finding container 47bbad4c199faebd4ddec0c1cc2b2599e4c5ebf8e5723ae3294139be5bc4d8bd: Status 404 returned error can't find the container with id 47bbad4c199faebd4ddec0c1cc2b2599e4c5ebf8e5723ae3294139be5bc4d8bd Apr 17 14:28:47.629834 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:47.629808 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:28:47.854353 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:47.854322 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qcjlc" event={"ID":"a7611fae-10f2-4c93-a3ad-2f23c7e94788","Type":"ContainerStarted","Data":"64c21cadc576e175504800ae2373cecc229d65da4d155a8d53ec2128f4c294ad"} Apr 17 14:28:47.854353 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:47.854355 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qcjlc" event={"ID":"a7611fae-10f2-4c93-a3ad-2f23c7e94788","Type":"ContainerStarted","Data":"20064b463e63437afc26cd6e03b451154433041e0e5a013b64457933243ddb5b"} Apr 17 14:28:47.854851 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:47.854444 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qcjlc" Apr 17 14:28:47.855319 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:47.855301 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-2zg74" event={"ID":"ed4f570a-52ab-4296-95f4-b904558a1925","Type":"ContainerStarted","Data":"47bbad4c199faebd4ddec0c1cc2b2599e4c5ebf8e5723ae3294139be5bc4d8bd"} Apr 17 14:28:47.871027 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:47.870960 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qcjlc" podStartSLOduration=129.728492108 podStartE2EDuration="2m10.870944499s" podCreationTimestamp="2026-04-17 14:26:37 +0000 UTC" firstStartedPulling="2026-04-17 14:28:46.059168692 +0000 UTC m=+162.250767794" lastFinishedPulling="2026-04-17 14:28:47.201621087 +0000 UTC m=+163.393220185" observedRunningTime="2026-04-17 14:28:47.870502732 +0000 UTC m=+164.062101862" watchObservedRunningTime="2026-04-17 14:28:47.870944499 +0000 UTC m=+164.062543621" Apr 17 14:28:49.799703 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:49.799675 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-d97df5bf7-xlprx" Apr 17 14:28:52.644301 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.644236 2572 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" podUID="0589e90a-065b-40ad-b760-71ddd18d5f83" containerName="registry" containerID="cri-o://116e34a911231f3fe88af22c42cef216a06c0cd9137ff6e4cadf2e69d8546f0e" gracePeriod=30 Apr 17 14:28:52.870752 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.870718 2572 generic.go:358] "Generic (PLEG): container finished" podID="0589e90a-065b-40ad-b760-71ddd18d5f83" containerID="116e34a911231f3fe88af22c42cef216a06c0cd9137ff6e4cadf2e69d8546f0e" exitCode=0 Apr 17 14:28:52.870882 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.870787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" event={"ID":"0589e90a-065b-40ad-b760-71ddd18d5f83","Type":"ContainerDied","Data":"116e34a911231f3fe88af22c42cef216a06c0cd9137ff6e4cadf2e69d8546f0e"} Apr 17 14:28:52.870882 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.870824 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" event={"ID":"0589e90a-065b-40ad-b760-71ddd18d5f83","Type":"ContainerDied","Data":"457236770e1c25de78eebb4297c1e51ae372f2ce4a0b7af2326e9ff48285b33e"} Apr 17 14:28:52.870882 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.870836 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="457236770e1c25de78eebb4297c1e51ae372f2ce4a0b7af2326e9ff48285b33e" Apr 17 14:28:52.880175 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.880152 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:28:52.924874 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.924767 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzwsb\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-kube-api-access-xzwsb\") pod \"0589e90a-065b-40ad-b760-71ddd18d5f83\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " Apr 17 14:28:52.924874 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.924807 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-installation-pull-secrets\") pod \"0589e90a-065b-40ad-b760-71ddd18d5f83\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " Apr 17 14:28:52.924874 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.924844 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-image-registry-private-configuration\") pod \"0589e90a-065b-40ad-b760-71ddd18d5f83\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " Apr 17 14:28:52.925148 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.924896 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-bound-sa-token\") pod \"0589e90a-065b-40ad-b760-71ddd18d5f83\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " Apr 17 14:28:52.925148 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.924920 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls\") pod 
\"0589e90a-065b-40ad-b760-71ddd18d5f83\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " Apr 17 14:28:52.925148 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.924944 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0589e90a-065b-40ad-b760-71ddd18d5f83-ca-trust-extracted\") pod \"0589e90a-065b-40ad-b760-71ddd18d5f83\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " Apr 17 14:28:52.925148 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.924969 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-certificates\") pod \"0589e90a-065b-40ad-b760-71ddd18d5f83\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " Apr 17 14:28:52.925148 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.925027 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-trusted-ca\") pod \"0589e90a-065b-40ad-b760-71ddd18d5f83\" (UID: \"0589e90a-065b-40ad-b760-71ddd18d5f83\") " Apr 17 14:28:52.925664 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.925615 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0589e90a-065b-40ad-b760-71ddd18d5f83" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:28:52.926025 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.925971 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0589e90a-065b-40ad-b760-71ddd18d5f83" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:28:52.927763 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.927735 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0589e90a-065b-40ad-b760-71ddd18d5f83" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:28:52.928184 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.928140 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0589e90a-065b-40ad-b760-71ddd18d5f83" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:28:52.928299 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.928256 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0589e90a-065b-40ad-b760-71ddd18d5f83" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:28:52.928521 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.928470 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0589e90a-065b-40ad-b760-71ddd18d5f83" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:28:52.928806 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.928782 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-kube-api-access-xzwsb" (OuterVolumeSpecName: "kube-api-access-xzwsb") pod "0589e90a-065b-40ad-b760-71ddd18d5f83" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83"). InnerVolumeSpecName "kube-api-access-xzwsb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:28:52.935776 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:52.935755 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0589e90a-065b-40ad-b760-71ddd18d5f83-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0589e90a-065b-40ad-b760-71ddd18d5f83" (UID: "0589e90a-065b-40ad-b760-71ddd18d5f83"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:28:53.026044 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:53.026013 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xzwsb\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-kube-api-access-xzwsb\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:28:53.026044 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:53.026039 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-installation-pull-secrets\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:28:53.026044 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:53.026050 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0589e90a-065b-40ad-b760-71ddd18d5f83-image-registry-private-configuration\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:28:53.026249 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:53.026061 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-bound-sa-token\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:28:53.026249 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:53.026071 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-tls\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:28:53.026249 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:53.026079 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0589e90a-065b-40ad-b760-71ddd18d5f83-ca-trust-extracted\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:28:53.026249 ip-10-0-129-199 kubenswrapper[2572]: I0417 
14:28:53.026088 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-registry-certificates\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:28:53.026249 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:53.026096 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589e90a-065b-40ad-b760-71ddd18d5f83-trusted-ca\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:28:53.873644 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:53.873612 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-666d95ffc5-vtfqx" Apr 17 14:28:53.892654 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:53.892627 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-666d95ffc5-vtfqx"] Apr 17 14:28:53.895897 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:53.895876 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-666d95ffc5-vtfqx"] Apr 17 14:28:54.393178 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.393147 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85fd57c4cb-57hjs"] Apr 17 14:28:54.393447 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.393433 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0589e90a-065b-40ad-b760-71ddd18d5f83" containerName="registry" Apr 17 14:28:54.393505 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.393451 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0589e90a-065b-40ad-b760-71ddd18d5f83" containerName="registry" Apr 17 14:28:54.393541 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.393518 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0589e90a-065b-40ad-b760-71ddd18d5f83" containerName="registry" Apr 17 14:28:54.398216 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.398196 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.398777 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.398678 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:28:54.398945 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.398924 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:28:54.400901 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.400882 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 14:28:54.401029 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.400936 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 14:28:54.401962 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.401939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 14:28:54.402241 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.402219 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 14:28:54.402328 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.402248 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 14:28:54.403083 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.402626 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qz9tk\"" Apr 17 14:28:54.403083 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.402729 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gg9xx\"" Apr 17 14:28:54.403238 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.403110 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0589e90a-065b-40ad-b760-71ddd18d5f83" path="/var/lib/kubelet/pods/0589e90a-065b-40ad-b760-71ddd18d5f83/volumes" Apr 17 14:28:54.404267 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.404183 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85fd57c4cb-57hjs"] Apr 17 14:28:54.409539 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.409518 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cwbck" Apr 17 14:28:54.533563 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.533529 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cwbck"] Apr 17 14:28:54.536610 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:28:54.536577 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57f5fb50_c788_4c19_a249_06bd9abaa213.slice/crio-00d6920f7e0c4bccc5d40c456525741cccf01feb1fd69585ed799297d52f6ed4 WatchSource:0}: Error finding container 00d6920f7e0c4bccc5d40c456525741cccf01feb1fd69585ed799297d52f6ed4: Status 404 returned error can't find the container with id 00d6920f7e0c4bccc5d40c456525741cccf01feb1fd69585ed799297d52f6ed4 Apr 17 14:28:54.540112 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.538848 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-serving-cert\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.540112 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.538913 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-service-ca\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.540112 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.538945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-console-config\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.540112 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.539038 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-oauth-config\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.540112 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.539076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjkhb\" (UniqueName: \"kubernetes.io/projected/94b4a050-d044-4bef-8067-f5bc840aa50d-kube-api-access-zjkhb\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.540112 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.539144 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-oauth-serving-cert\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.639629 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.639590 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-oauth-serving-cert\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.639833 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.639641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-serving-cert\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.639833 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.639673 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-service-ca\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.639833 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.639704 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-console-config\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.639833 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.639765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-oauth-config\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.639833 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.639802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjkhb\" (UniqueName: \"kubernetes.io/projected/94b4a050-d044-4bef-8067-f5bc840aa50d-kube-api-access-zjkhb\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.640413 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.640384 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-oauth-serving-cert\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.640817 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.640786 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-service-ca\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.641147 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.641124 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-console-config\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " 
pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.642653 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.642557 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-oauth-config\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.642828 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.642804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-serving-cert\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.647622 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.647572 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjkhb\" (UniqueName: \"kubernetes.io/projected/94b4a050-d044-4bef-8067-f5bc840aa50d-kube-api-access-zjkhb\") pod \"console-85fd57c4cb-57hjs\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.708725 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.708697 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:28:54.825532 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.825397 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85fd57c4cb-57hjs"] Apr 17 14:28:54.828496 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:28:54.828469 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94b4a050_d044_4bef_8067_f5bc840aa50d.slice/crio-2731f0bc3a8d57018c1d6bd6d2295a5b5d8a60c5bdf772f2d4fc65b55af120f7 WatchSource:0}: Error finding container 2731f0bc3a8d57018c1d6bd6d2295a5b5d8a60c5bdf772f2d4fc65b55af120f7: Status 404 returned error can't find the container with id 2731f0bc3a8d57018c1d6bd6d2295a5b5d8a60c5bdf772f2d4fc65b55af120f7 Apr 17 14:28:54.877263 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.877229 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85fd57c4cb-57hjs" event={"ID":"94b4a050-d044-4bef-8067-f5bc840aa50d","Type":"ContainerStarted","Data":"2731f0bc3a8d57018c1d6bd6d2295a5b5d8a60c5bdf772f2d4fc65b55af120f7"} Apr 17 14:28:54.878352 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:54.878322 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cwbck" event={"ID":"57f5fb50-c788-4c19-a249-06bd9abaa213","Type":"ContainerStarted","Data":"00d6920f7e0c4bccc5d40c456525741cccf01feb1fd69585ed799297d52f6ed4"} Apr 17 14:28:56.886804 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:56.886762 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cwbck" event={"ID":"57f5fb50-c788-4c19-a249-06bd9abaa213","Type":"ContainerStarted","Data":"8007976c8c3004438fb18c0447b8331ea13235476c7e2a2acf3cb4b2864dbdf6"} Apr 17 14:28:56.904156 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:56.904086 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cwbck" podStartSLOduration=137.727449738 podStartE2EDuration="2m19.904068082s" 
podCreationTimestamp="2026-04-17 14:26:37 +0000 UTC" firstStartedPulling="2026-04-17 14:28:54.541775703 +0000 UTC m=+170.733374806" lastFinishedPulling="2026-04-17 14:28:56.718394034 +0000 UTC m=+172.909993150" observedRunningTime="2026-04-17 14:28:56.904015416 +0000 UTC m=+173.095614538" watchObservedRunningTime="2026-04-17 14:28:56.904068082 +0000 UTC m=+173.095667204" Apr 17 14:28:57.860830 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:57.860793 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qcjlc" Apr 17 14:28:58.894900 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:58.894859 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85fd57c4cb-57hjs" event={"ID":"94b4a050-d044-4bef-8067-f5bc840aa50d","Type":"ContainerStarted","Data":"d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734"} Apr 17 14:28:58.910552 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:28:58.910447 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85fd57c4cb-57hjs" podStartSLOduration=1.6938423999999999 podStartE2EDuration="4.910431209s" podCreationTimestamp="2026-04-17 14:28:54 +0000 UTC" firstStartedPulling="2026-04-17 14:28:54.830363469 +0000 UTC m=+171.021962568" lastFinishedPulling="2026-04-17 14:28:58.046952275 +0000 UTC m=+174.238551377" observedRunningTime="2026-04-17 14:28:58.909849202 +0000 UTC m=+175.101448344" watchObservedRunningTime="2026-04-17 14:28:58.910431209 +0000 UTC m=+175.102030330" Apr 17 14:29:01.979861 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:01.979830 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-858898b5fb-dmkn2"] Apr 17 14:29:01.985304 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:01.985277 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:01.993386 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:01.993338 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 14:29:01.995109 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:01.995077 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-858898b5fb-dmkn2"] Apr 17 14:29:02.108699 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.108659 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-oauth-serving-cert\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.108881 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.108711 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-oauth-config\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.108881 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.108814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-trusted-ca-bundle\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.108881 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.108865 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-service-ca\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.109043 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.108921 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm6x2\" (UniqueName: \"kubernetes.io/projected/6c192f88-10f6-41de-bc4f-e95b5877f94c-kube-api-access-bm6x2\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.109043 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.109005 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-config\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.109043 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.109039 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-serving-cert\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.209835 ip-10-0-129-199 
kubenswrapper[2572]: I0417 14:29:02.209802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-oauth-config\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.210021 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.209850 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-trusted-ca-bundle\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.210021 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.209871 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-service-ca\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.210021 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.209925 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bm6x2\" (UniqueName: \"kubernetes.io/projected/6c192f88-10f6-41de-bc4f-e95b5877f94c-kube-api-access-bm6x2\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.210220 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.210062 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-config\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.210220 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.210101 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-serving-cert\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.210220 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.210158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-oauth-serving-cert\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.210604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.210578 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-service-ca\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.210709 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.210687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-config\") pod \"console-858898b5fb-dmkn2\" (UID: 
\"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.210782 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.210759 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-oauth-serving-cert\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.210921 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.210896 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-trusted-ca-bundle\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.212364 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.212343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-oauth-config\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.212479 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.212462 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-serving-cert\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.218576 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.218553 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm6x2\" (UniqueName: \"kubernetes.io/projected/6c192f88-10f6-41de-bc4f-e95b5877f94c-kube-api-access-bm6x2\") pod \"console-858898b5fb-dmkn2\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:02.296403 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:02.296316 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:04.709110 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:04.709073 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:29:04.709110 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:04.709116 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:29:04.714456 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:04.714434 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:29:04.916318 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:04.916289 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:29:09.106669 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:09.106647 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-858898b5fb-dmkn2"] Apr 17 14:29:09.108660 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:29:09.108638 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c192f88_10f6_41de_bc4f_e95b5877f94c.slice/crio-b2d2caf9d7a2ad0ac759a21a21edfe03dd7e9c52f9948dbe18a39b5085c6e8f2 WatchSource:0}: Error finding container b2d2caf9d7a2ad0ac759a21a21edfe03dd7e9c52f9948dbe18a39b5085c6e8f2: Status 404 returned error can't find the container with id b2d2caf9d7a2ad0ac759a21a21edfe03dd7e9c52f9948dbe18a39b5085c6e8f2 Apr 17 14:29:09.929064 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:09.929021 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858898b5fb-dmkn2" event={"ID":"6c192f88-10f6-41de-bc4f-e95b5877f94c","Type":"ContainerStarted","Data":"97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f"} Apr 17 14:29:09.929270 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:09.929071 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858898b5fb-dmkn2" event={"ID":"6c192f88-10f6-41de-bc4f-e95b5877f94c","Type":"ContainerStarted","Data":"b2d2caf9d7a2ad0ac759a21a21edfe03dd7e9c52f9948dbe18a39b5085c6e8f2"} Apr 17 14:29:09.930622 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:09.930590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-2zg74" event={"ID":"ed4f570a-52ab-4296-95f4-b904558a1925","Type":"ContainerStarted","Data":"dbdd0e2763e4aadbd4329edad7426b440415abe416aea9f71d712d48a35bf75b"} Apr 17 14:29:09.930827 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:09.930804 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-2zg74" Apr 17 14:29:09.941121 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:09.941096 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-2zg74" Apr 17 14:29:09.946214 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:09.946158 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-858898b5fb-dmkn2" podStartSLOduration=8.946146533 podStartE2EDuration="8.946146533s" podCreationTimestamp="2026-04-17 14:29:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:29:09.944799713 
+0000 UTC m=+186.136398835" watchObservedRunningTime="2026-04-17 14:29:09.946146533 +0000 UTC m=+186.137745652" Apr 17 14:29:09.960402 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:09.960313 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-2zg74" podStartSLOduration=2.139302642 podStartE2EDuration="23.960296451s" podCreationTimestamp="2026-04-17 14:28:46 +0000 UTC" firstStartedPulling="2026-04-17 14:28:47.280804363 +0000 UTC m=+163.472403478" lastFinishedPulling="2026-04-17 14:29:09.101798175 +0000 UTC m=+185.293397287" observedRunningTime="2026-04-17 14:29:09.959509454 +0000 UTC m=+186.151108576" watchObservedRunningTime="2026-04-17 14:29:09.960296451 +0000 UTC m=+186.151895572" Apr 17 14:29:12.296700 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:12.296657 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:12.296700 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:12.296701 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:12.301670 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:12.301649 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:12.943834 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:12.943790 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:29:12.993340 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:12.993300 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85fd57c4cb-57hjs"] Apr 17 14:29:22.968530 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:22.968496 2572 generic.go:358] "Generic (PLEG): container finished" podID="7e8de3fc-6fef-4f2f-983d-9b545b473bf5" containerID="544af9d819921936defa1c0443de27f6ecc666b59066aaa9c52d6719e4bb14ab" exitCode=0 Apr 17 14:29:22.969002 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:22.968544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" event={"ID":"7e8de3fc-6fef-4f2f-983d-9b545b473bf5","Type":"ContainerDied","Data":"544af9d819921936defa1c0443de27f6ecc666b59066aaa9c52d6719e4bb14ab"} Apr 17 14:29:22.969002 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:22.968850 2572 scope.go:117] "RemoveContainer" containerID="544af9d819921936defa1c0443de27f6ecc666b59066aaa9c52d6719e4bb14ab" Apr 17 14:29:23.973754 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:23.973719 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-297rk" event={"ID":"7e8de3fc-6fef-4f2f-983d-9b545b473bf5","Type":"ContainerStarted","Data":"50b85fb71d02bac55252a4d0fa0501383b24c614bacdf17d1b3e3be3530d1c03"} Apr 17 14:29:38.017274 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.017220 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-85fd57c4cb-57hjs" podUID="94b4a050-d044-4bef-8067-f5bc840aa50d" containerName="console" containerID="cri-o://d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734" gracePeriod=15 Apr 17 14:29:38.281378 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.281357 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-85fd57c4cb-57hjs_94b4a050-d044-4bef-8067-f5bc840aa50d/console/0.log" Apr 17 14:29:38.281496 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.281416 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:29:38.311173 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.311144 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-oauth-config\") pod \"94b4a050-d044-4bef-8067-f5bc840aa50d\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " Apr 17 14:29:38.311320 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.311208 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-console-config\") pod \"94b4a050-d044-4bef-8067-f5bc840aa50d\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " Apr 17 14:29:38.311320 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.311261 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-serving-cert\") pod \"94b4a050-d044-4bef-8067-f5bc840aa50d\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " Apr 17 14:29:38.311320 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.311292 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-service-ca\") pod \"94b4a050-d044-4bef-8067-f5bc840aa50d\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " Apr 17 14:29:38.311320 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.311317 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjkhb\" (UniqueName: \"kubernetes.io/projected/94b4a050-d044-4bef-8067-f5bc840aa50d-kube-api-access-zjkhb\") pod \"94b4a050-d044-4bef-8067-f5bc840aa50d\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " Apr 17 14:29:38.311523 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.311340 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-oauth-serving-cert\") pod \"94b4a050-d044-4bef-8067-f5bc840aa50d\" (UID: \"94b4a050-d044-4bef-8067-f5bc840aa50d\") " Apr 17 14:29:38.311677 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.311653 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-service-ca" (OuterVolumeSpecName: "service-ca") pod "94b4a050-d044-4bef-8067-f5bc840aa50d" (UID: "94b4a050-d044-4bef-8067-f5bc840aa50d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:29:38.311763 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.311646 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-console-config" (OuterVolumeSpecName: "console-config") pod "94b4a050-d044-4bef-8067-f5bc840aa50d" (UID: "94b4a050-d044-4bef-8067-f5bc840aa50d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:29:38.311851 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.311824 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "94b4a050-d044-4bef-8067-f5bc840aa50d" (UID: "94b4a050-d044-4bef-8067-f5bc840aa50d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:29:38.313603 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.313574 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "94b4a050-d044-4bef-8067-f5bc840aa50d" (UID: "94b4a050-d044-4bef-8067-f5bc840aa50d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:29:38.313603 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.313585 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b4a050-d044-4bef-8067-f5bc840aa50d-kube-api-access-zjkhb" (OuterVolumeSpecName: "kube-api-access-zjkhb") pod "94b4a050-d044-4bef-8067-f5bc840aa50d" (UID: "94b4a050-d044-4bef-8067-f5bc840aa50d"). InnerVolumeSpecName "kube-api-access-zjkhb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:29:38.313762 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.313668 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "94b4a050-d044-4bef-8067-f5bc840aa50d" (UID: "94b4a050-d044-4bef-8067-f5bc840aa50d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:29:38.412604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.412578 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zjkhb\" (UniqueName: \"kubernetes.io/projected/94b4a050-d044-4bef-8067-f5bc840aa50d-kube-api-access-zjkhb\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:29:38.412604 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.412604 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-oauth-serving-cert\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:29:38.412744 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.412614 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-oauth-config\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:29:38.412744 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.412623 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-console-config\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:29:38.412744 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.412632 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94b4a050-d044-4bef-8067-f5bc840aa50d-console-serving-cert\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:29:38.412744 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:38.412641 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94b4a050-d044-4bef-8067-f5bc840aa50d-service-ca\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:29:39.014159 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:39.014132 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85fd57c4cb-57hjs_94b4a050-d044-4bef-8067-f5bc840aa50d/console/0.log" Apr 17 14:29:39.014331 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:39.014167 2572 generic.go:358] "Generic (PLEG): container finished" podID="94b4a050-d044-4bef-8067-f5bc840aa50d" containerID="d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734" exitCode=2 Apr 17 14:29:39.014331 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:39.014228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85fd57c4cb-57hjs" event={"ID":"94b4a050-d044-4bef-8067-f5bc840aa50d","Type":"ContainerDied","Data":"d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734"} Apr 17 14:29:39.014331 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:39.014251 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85fd57c4cb-57hjs" event={"ID":"94b4a050-d044-4bef-8067-f5bc840aa50d","Type":"ContainerDied","Data":"2731f0bc3a8d57018c1d6bd6d2295a5b5d8a60c5bdf772f2d4fc65b55af120f7"} Apr 17 14:29:39.014331 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:39.014266 2572 scope.go:117] "RemoveContainer" containerID="d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734" Apr 17 14:29:39.014331 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:39.014232 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85fd57c4cb-57hjs" Apr 17 14:29:39.027961 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:39.027946 2572 scope.go:117] "RemoveContainer" containerID="d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734" Apr 17 14:29:39.028308 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:29:39.028274 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734\": container with ID starting with d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734 not found: ID does not exist" containerID="d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734" Apr 17 14:29:39.028356 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:39.028320 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734"} err="failed to get container status \"d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734\": rpc error: code = NotFound desc = could not find container \"d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734\": container with ID starting with d19a217af50fad3d57afc2be9d972a1557d2ffc835d12f2e1c980b88f833b734 not found: ID does not exist" Apr 17 14:29:39.029478 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:39.029460 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85fd57c4cb-57hjs"] Apr 17 14:29:39.032783 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:39.032762 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85fd57c4cb-57hjs"] Apr 17 14:29:40.402708 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:29:40.402678 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b4a050-d044-4bef-8067-f5bc840aa50d" path="/var/lib/kubelet/pods/94b4a050-d044-4bef-8067-f5bc840aa50d/volumes" Apr 17 14:30:08.333549 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.333514 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7484dcb6dc-hbxnz"] Apr 17 14:30:08.333963 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.333794 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94b4a050-d044-4bef-8067-f5bc840aa50d" containerName="console" Apr 17 14:30:08.333963 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.333805 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b4a050-d044-4bef-8067-f5bc840aa50d" containerName="console" Apr 17 14:30:08.333963 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.333850 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="94b4a050-d044-4bef-8067-f5bc840aa50d" containerName="console" Apr 17 14:30:08.338271 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.338243 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.347574 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.347551 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7484dcb6dc-hbxnz"] Apr 17 14:30:08.464552 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.464498 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-oauth-serving-cert\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.464739 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.464594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-serving-cert\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.464739 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.464618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-service-ca\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.464739 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.464666 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7xq8\" (UniqueName: \"kubernetes.io/projected/443a3cc0-f629-418c-a415-debb7bbbe193-kube-api-access-x7xq8\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.464739 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.464709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-trusted-ca-bundle\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.464890 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.464739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-console-config\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.464890 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.464763 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-oauth-config\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.565193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.565138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-service-ca\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.565193 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.565207 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7xq8\" (UniqueName: \"kubernetes.io/projected/443a3cc0-f629-418c-a415-debb7bbbe193-kube-api-access-x7xq8\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.565454 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.565235 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-trusted-ca-bundle\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.565454 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.565254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-console-config\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.565454 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.565269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-oauth-config\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.565454 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.565291 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-oauth-serving-cert\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.565654 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.565529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-serving-cert\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.566084 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.566044 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-service-ca\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.566084 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.566077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-oauth-serving-cert\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.566261 ip-10-0-129-199 
kubenswrapper[2572]: I0417 14:30:08.566119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-console-config\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.566261 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.566211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-trusted-ca-bundle\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.567789 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.567760 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-oauth-config\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.567941 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.567922 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-serving-cert\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.573347 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.573324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7xq8\" (UniqueName: \"kubernetes.io/projected/443a3cc0-f629-418c-a415-debb7bbbe193-kube-api-access-x7xq8\") pod \"console-7484dcb6dc-hbxnz\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.648207 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.648119 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:08.765827 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:08.765799 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7484dcb6dc-hbxnz"] Apr 17 14:30:08.768808 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:30:08.768777 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443a3cc0_f629_418c_a415_debb7bbbe193.slice/crio-8ce5d482e603861bfcba7d52601854ef9bcbb012e98fb4157b4e216df12b9b8d WatchSource:0}: Error finding container 8ce5d482e603861bfcba7d52601854ef9bcbb012e98fb4157b4e216df12b9b8d: Status 404 returned error can't find the container with id 8ce5d482e603861bfcba7d52601854ef9bcbb012e98fb4157b4e216df12b9b8d Apr 17 14:30:09.099445 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:09.099409 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7484dcb6dc-hbxnz" event={"ID":"443a3cc0-f629-418c-a415-debb7bbbe193","Type":"ContainerStarted","Data":"8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6"} Apr 17 14:30:09.099445 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:09.099442 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7484dcb6dc-hbxnz" event={"ID":"443a3cc0-f629-418c-a415-debb7bbbe193","Type":"ContainerStarted","Data":"8ce5d482e603861bfcba7d52601854ef9bcbb012e98fb4157b4e216df12b9b8d"} Apr 17 14:30:09.115761 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:09.115718 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7484dcb6dc-hbxnz" podStartSLOduration=1.115706947 podStartE2EDuration="1.115706947s" podCreationTimestamp="2026-04-17 14:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:30:09.114573902 +0000 UTC m=+245.306173044" watchObservedRunningTime="2026-04-17 14:30:09.115706947 +0000 UTC m=+245.307306067" Apr 17 14:30:16.226286 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:16.226248 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:30:16.228472 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:16.228452 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85225238-59be-4245-9db5-4b8fa923bf28-metrics-certs\") pod \"network-metrics-daemon-tr5b8\" (UID: \"85225238-59be-4245-9db5-4b8fa923bf28\") " pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:30:16.301474 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:16.301442 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w222h\"" Apr 17 14:30:16.309423 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:16.309402 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tr5b8" Apr 17 14:30:16.422972 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:16.422874 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tr5b8"] Apr 17 14:30:16.425476 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:30:16.425448 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85225238_59be_4245_9db5_4b8fa923bf28.slice/crio-4242a8cef82958e7595edb38ec937fb541b99a2b43343ff8d41b09b6e8ac96fb WatchSource:0}: Error finding container 4242a8cef82958e7595edb38ec937fb541b99a2b43343ff8d41b09b6e8ac96fb: Status 404 returned error can't find the container with id 4242a8cef82958e7595edb38ec937fb541b99a2b43343ff8d41b09b6e8ac96fb Apr 17 14:30:17.120697 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:17.120653 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tr5b8" event={"ID":"85225238-59be-4245-9db5-4b8fa923bf28","Type":"ContainerStarted","Data":"4242a8cef82958e7595edb38ec937fb541b99a2b43343ff8d41b09b6e8ac96fb"} Apr 17 14:30:18.124747 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:18.124710 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tr5b8" event={"ID":"85225238-59be-4245-9db5-4b8fa923bf28","Type":"ContainerStarted","Data":"8d7b6309f9a6a4954f0aeed549b6417a9890616406a12f5fcc0d1fb5bd122a62"} Apr 17 14:30:18.124747 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:18.124748 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tr5b8" event={"ID":"85225238-59be-4245-9db5-4b8fa923bf28","Type":"ContainerStarted","Data":"75df42ebb4b9b2c6d4ebbec59ef1b50ce0a15bc4a1020398c82c9fa035508bfd"} Apr 17 14:30:18.138725 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:18.138669 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tr5b8" podStartSLOduration=252.978233817 podStartE2EDuration="4m14.138653204s" podCreationTimestamp="2026-04-17 14:26:04 +0000 UTC" firstStartedPulling="2026-04-17 14:30:16.427291189 +0000 UTC m=+252.618890287" lastFinishedPulling="2026-04-17 14:30:17.587710569 +0000 UTC m=+253.779309674" observedRunningTime="2026-04-17 14:30:18.138571746 +0000 UTC m=+254.330170865" watchObservedRunningTime="2026-04-17 14:30:18.138653204 +0000 UTC m=+254.330252323" Apr 17 14:30:18.648889 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:18.648840 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:18.648889 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:18.648893 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:18.654376 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:18.654354 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:19.131921 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:19.131892 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:30:19.174621 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:19.174577 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-858898b5fb-dmkn2"] Apr 17 14:30:44.194329 ip-10-0-129-199 
kubenswrapper[2572]: I0417 14:30:44.194261 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-858898b5fb-dmkn2" podUID="6c192f88-10f6-41de-bc4f-e95b5877f94c" containerName="console" containerID="cri-o://97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f" gracePeriod=15 Apr 17 14:30:44.434378 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.434353 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-858898b5fb-dmkn2_6c192f88-10f6-41de-bc4f-e95b5877f94c/console/0.log" Apr 17 14:30:44.434513 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.434414 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:30:44.532542 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.532455 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-trusted-ca-bundle\") pod \"6c192f88-10f6-41de-bc4f-e95b5877f94c\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " Apr 17 14:30:44.532542 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.532501 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-service-ca\") pod \"6c192f88-10f6-41de-bc4f-e95b5877f94c\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " Apr 17 14:30:44.532542 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.532536 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-config\") pod \"6c192f88-10f6-41de-bc4f-e95b5877f94c\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " Apr 17 14:30:44.532793 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.532563 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-serving-cert\") pod \"6c192f88-10f6-41de-bc4f-e95b5877f94c\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " Apr 17 14:30:44.532793 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.532616 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-oauth-config\") pod \"6c192f88-10f6-41de-bc4f-e95b5877f94c\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " Apr 17 14:30:44.532793 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.532658 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm6x2\" (UniqueName: \"kubernetes.io/projected/6c192f88-10f6-41de-bc4f-e95b5877f94c-kube-api-access-bm6x2\") pod \"6c192f88-10f6-41de-bc4f-e95b5877f94c\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " Apr 17 14:30:44.532945 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.532825 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-oauth-serving-cert\") pod \"6c192f88-10f6-41de-bc4f-e95b5877f94c\" (UID: \"6c192f88-10f6-41de-bc4f-e95b5877f94c\") " Apr 17 14:30:44.533024 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.532948 2572 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-service-ca" (OuterVolumeSpecName: "service-ca") pod "6c192f88-10f6-41de-bc4f-e95b5877f94c" (UID: "6c192f88-10f6-41de-bc4f-e95b5877f94c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:30:44.533024 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.533002 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6c192f88-10f6-41de-bc4f-e95b5877f94c" (UID: "6c192f88-10f6-41de-bc4f-e95b5877f94c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:30:44.533134 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.533048 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-config" (OuterVolumeSpecName: "console-config") pod "6c192f88-10f6-41de-bc4f-e95b5877f94c" (UID: "6c192f88-10f6-41de-bc4f-e95b5877f94c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:30:44.533190 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.533162 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-config\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:30:44.533190 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.533178 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-trusted-ca-bundle\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:30:44.533190 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.533187 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-service-ca\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:30:44.533346 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.533233 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6c192f88-10f6-41de-bc4f-e95b5877f94c" (UID: "6c192f88-10f6-41de-bc4f-e95b5877f94c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:30:44.534879 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.534849 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6c192f88-10f6-41de-bc4f-e95b5877f94c" (UID: "6c192f88-10f6-41de-bc4f-e95b5877f94c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:30:44.535004 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.534873 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6c192f88-10f6-41de-bc4f-e95b5877f94c" (UID: "6c192f88-10f6-41de-bc4f-e95b5877f94c"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:30:44.535054 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.535005 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c192f88-10f6-41de-bc4f-e95b5877f94c-kube-api-access-bm6x2" (OuterVolumeSpecName: "kube-api-access-bm6x2") pod "6c192f88-10f6-41de-bc4f-e95b5877f94c" (UID: "6c192f88-10f6-41de-bc4f-e95b5877f94c"). InnerVolumeSpecName "kube-api-access-bm6x2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:30:44.634550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.634512 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-oauth-config\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:30:44.634550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.634542 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bm6x2\" (UniqueName: \"kubernetes.io/projected/6c192f88-10f6-41de-bc4f-e95b5877f94c-kube-api-access-bm6x2\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:30:44.634550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.634554 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c192f88-10f6-41de-bc4f-e95b5877f94c-oauth-serving-cert\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:30:44.634550 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:44.634563 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c192f88-10f6-41de-bc4f-e95b5877f94c-console-serving-cert\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:30:45.200109 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:45.200082 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-858898b5fb-dmkn2_6c192f88-10f6-41de-bc4f-e95b5877f94c/console/0.log" Apr 17 14:30:45.200524 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:45.200119 2572 generic.go:358] "Generic (PLEG): container finished" podID="6c192f88-10f6-41de-bc4f-e95b5877f94c" containerID="97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f" exitCode=2 Apr 17 14:30:45.200524 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:45.200178 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858898b5fb-dmkn2" event={"ID":"6c192f88-10f6-41de-bc4f-e95b5877f94c","Type":"ContainerDied","Data":"97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f"} Apr 17 14:30:45.200524 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:45.200200 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858898b5fb-dmkn2" event={"ID":"6c192f88-10f6-41de-bc4f-e95b5877f94c","Type":"ContainerDied","Data":"b2d2caf9d7a2ad0ac759a21a21edfe03dd7e9c52f9948dbe18a39b5085c6e8f2"} Apr 17 14:30:45.200524 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:45.200204 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-858898b5fb-dmkn2" Apr 17 14:30:45.200524 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:45.200214 2572 scope.go:117] "RemoveContainer" containerID="97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f" Apr 17 14:30:45.208417 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:45.208400 2572 scope.go:117] "RemoveContainer" containerID="97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f" Apr 17 14:30:45.208656 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:30:45.208638 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f\": container with ID starting with 97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f not found: ID does not exist" containerID="97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f" Apr 17 14:30:45.208717 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:45.208669 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f"} err="failed to get container status \"97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f\": rpc error: code = NotFound desc = could not find container \"97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f\": container with ID starting with 97500123f6674cbf54a84c88c86f0fd20775991146101ae9f4eff63409e3189f not found: ID does not exist" Apr 17 14:30:45.220926 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:45.220898 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-858898b5fb-dmkn2"] Apr 17 14:30:45.223718 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:45.223693 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-858898b5fb-dmkn2"] Apr 17 14:30:46.403441 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:30:46.403407 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c192f88-10f6-41de-bc4f-e95b5877f94c" path="/var/lib/kubelet/pods/6c192f88-10f6-41de-bc4f-e95b5877f94c/volumes" Apr 17 14:31:04.274674 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:04.274089 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 14:31:21.701756 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.699800 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-748fb4d8c9-sx52p"] Apr 17 14:31:21.701756 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.700399 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c192f88-10f6-41de-bc4f-e95b5877f94c" containerName="console" Apr 17 14:31:21.701756 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.700426 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c192f88-10f6-41de-bc4f-e95b5877f94c" containerName="console" Apr 17 14:31:21.701756 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.700548 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c192f88-10f6-41de-bc4f-e95b5877f94c" containerName="console" Apr 17 14:31:21.704175 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.703947 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.714665 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.714640 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-748fb4d8c9-sx52p"] Apr 17 14:31:21.818576 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.818533 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-config\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.818576 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.818577 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-trusted-ca-bundle\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.818781 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.818609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-oauth-config\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.818781 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.818666 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-service-ca\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.818859 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.818777 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-serving-cert\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.818859 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.818803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j89jc\" (UniqueName: \"kubernetes.io/projected/8ae194f9-896b-4c8e-af2a-31c1d1247127-kube-api-access-j89jc\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.818859 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.818820 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-oauth-serving-cert\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.920164 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.920111 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-serving-cert\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.920164 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.920161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j89jc\" (UniqueName: \"kubernetes.io/projected/8ae194f9-896b-4c8e-af2a-31c1d1247127-kube-api-access-j89jc\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.920164 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.920181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-oauth-serving-cert\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.920499 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.920213 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-config\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.920499 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.920229 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-trusted-ca-bundle\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.920499 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.920252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-oauth-config\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.920499 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.920267 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-service-ca\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.921498 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.921469 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-service-ca\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.921693 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.921667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-oauth-serving-cert\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.921735 ip-10-0-129-199 kubenswrapper[2572]: 
I0417 14:31:21.921687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-config\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.921919 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.921900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-trusted-ca-bundle\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.922822 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.922802 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-oauth-config\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.922913 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.922896 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-serving-cert\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:21.928630 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:21.928604 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j89jc\" (UniqueName: \"kubernetes.io/projected/8ae194f9-896b-4c8e-af2a-31c1d1247127-kube-api-access-j89jc\") pod \"console-748fb4d8c9-sx52p\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:22.016628 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:22.016520 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:22.134833 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:22.134806 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-748fb4d8c9-sx52p"] Apr 17 14:31:22.137443 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:31:22.137415 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ae194f9_896b_4c8e_af2a_31c1d1247127.slice/crio-752b5c7e01b13d67bcd674038051a1b9b04f117ffc820556b17231ac1ca069c3 WatchSource:0}: Error finding container 752b5c7e01b13d67bcd674038051a1b9b04f117ffc820556b17231ac1ca069c3: Status 404 returned error can't find the container with id 752b5c7e01b13d67bcd674038051a1b9b04f117ffc820556b17231ac1ca069c3 Apr 17 14:31:22.139303 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:22.139284 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:31:22.308573 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:22.308476 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748fb4d8c9-sx52p" event={"ID":"8ae194f9-896b-4c8e-af2a-31c1d1247127","Type":"ContainerStarted","Data":"010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568"} Apr 17 14:31:22.308573 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:22.308521 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748fb4d8c9-sx52p" event={"ID":"8ae194f9-896b-4c8e-af2a-31c1d1247127","Type":"ContainerStarted","Data":"752b5c7e01b13d67bcd674038051a1b9b04f117ffc820556b17231ac1ca069c3"} Apr 17 14:31:22.324767 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:22.324708 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-748fb4d8c9-sx52p" podStartSLOduration=1.324693541 podStartE2EDuration="1.324693541s" podCreationTimestamp="2026-04-17 14:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:31:22.323059472 +0000 UTC m=+318.514658593" watchObservedRunningTime="2026-04-17 14:31:22.324693541 +0000 UTC m=+318.516292661" Apr 17 14:31:32.016652 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:32.016594 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:32.016652 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:32.016655 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:32.021362 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:32.021341 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:32.339410 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:32.339381 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:31:32.382763 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:32.382722 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7484dcb6dc-hbxnz"] Apr 17 14:31:57.405646 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.405553 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7484dcb6dc-hbxnz" podUID="443a3cc0-f629-418c-a415-debb7bbbe193" 
containerName="console" containerID="cri-o://8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6" gracePeriod=15 Apr 17 14:31:57.641657 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.641635 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7484dcb6dc-hbxnz_443a3cc0-f629-418c-a415-debb7bbbe193/console/0.log" Apr 17 14:31:57.641757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.641693 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7484dcb6dc-hbxnz" Apr 17 14:31:57.693709 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.693638 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7xq8\" (UniqueName: \"kubernetes.io/projected/443a3cc0-f629-418c-a415-debb7bbbe193-kube-api-access-x7xq8\") pod \"443a3cc0-f629-418c-a415-debb7bbbe193\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " Apr 17 14:31:57.693709 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.693670 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-oauth-serving-cert\") pod \"443a3cc0-f629-418c-a415-debb7bbbe193\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " Apr 17 14:31:57.693709 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.693688 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-serving-cert\") pod \"443a3cc0-f629-418c-a415-debb7bbbe193\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " Apr 17 14:31:57.693928 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.693729 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-trusted-ca-bundle\") pod \"443a3cc0-f629-418c-a415-debb7bbbe193\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " Apr 17 14:31:57.693928 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.693762 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-console-config\") pod \"443a3cc0-f629-418c-a415-debb7bbbe193\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " Apr 17 14:31:57.693928 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.693783 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-service-ca\") pod \"443a3cc0-f629-418c-a415-debb7bbbe193\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " Apr 17 14:31:57.693928 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.693802 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-oauth-config\") pod \"443a3cc0-f629-418c-a415-debb7bbbe193\" (UID: \"443a3cc0-f629-418c-a415-debb7bbbe193\") " Apr 17 14:31:57.694152 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.694033 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod 
"443a3cc0-f629-418c-a415-debb7bbbe193" (UID: "443a3cc0-f629-418c-a415-debb7bbbe193"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:31:57.694271 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.694248 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-console-config" (OuterVolumeSpecName: "console-config") pod "443a3cc0-f629-418c-a415-debb7bbbe193" (UID: "443a3cc0-f629-418c-a415-debb7bbbe193"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:31:57.694332 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.694310 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-service-ca" (OuterVolumeSpecName: "service-ca") pod "443a3cc0-f629-418c-a415-debb7bbbe193" (UID: "443a3cc0-f629-418c-a415-debb7bbbe193"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:31:57.694384 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.694342 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "443a3cc0-f629-418c-a415-debb7bbbe193" (UID: "443a3cc0-f629-418c-a415-debb7bbbe193"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:31:57.696507 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.696450 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "443a3cc0-f629-418c-a415-debb7bbbe193" (UID: "443a3cc0-f629-418c-a415-debb7bbbe193"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:31:57.696507 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.696499 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443a3cc0-f629-418c-a415-debb7bbbe193-kube-api-access-x7xq8" (OuterVolumeSpecName: "kube-api-access-x7xq8") pod "443a3cc0-f629-418c-a415-debb7bbbe193" (UID: "443a3cc0-f629-418c-a415-debb7bbbe193"). InnerVolumeSpecName "kube-api-access-x7xq8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:31:57.696671 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.696537 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "443a3cc0-f629-418c-a415-debb7bbbe193" (UID: "443a3cc0-f629-418c-a415-debb7bbbe193"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:31:57.795227 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.795177 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-trusted-ca-bundle\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:31:57.795227 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.795223 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-console-config\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:31:57.795227 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.795233 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-service-ca\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:31:57.795227 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.795242 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-oauth-config\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:31:57.795494 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.795252 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x7xq8\" (UniqueName: \"kubernetes.io/projected/443a3cc0-f629-418c-a415-debb7bbbe193-kube-api-access-x7xq8\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:31:57.795494 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.795262 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/443a3cc0-f629-418c-a415-debb7bbbe193-oauth-serving-cert\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:31:57.795494 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:57.795271 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/443a3cc0-f629-418c-a415-debb7bbbe193-console-serving-cert\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:31:58.407827 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:58.407796 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7484dcb6dc-hbxnz_443a3cc0-f629-418c-a415-debb7bbbe193/console/0.log"
Apr 17 14:31:58.408317 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:58.407834 2572 generic.go:358] "Generic (PLEG): container finished" podID="443a3cc0-f629-418c-a415-debb7bbbe193" containerID="8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6" exitCode=2
Apr 17 14:31:58.408317 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:58.407878 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7484dcb6dc-hbxnz" event={"ID":"443a3cc0-f629-418c-a415-debb7bbbe193","Type":"ContainerDied","Data":"8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6"}
Apr 17 14:31:58.408317 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:58.407897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7484dcb6dc-hbxnz" event={"ID":"443a3cc0-f629-418c-a415-debb7bbbe193","Type":"ContainerDied","Data":"8ce5d482e603861bfcba7d52601854ef9bcbb012e98fb4157b4e216df12b9b8d"}
Apr 17 14:31:58.408317 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:58.407911 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7484dcb6dc-hbxnz"
Apr 17 14:31:58.408317 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:58.407917 2572 scope.go:117] "RemoveContainer" containerID="8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6"
Apr 17 14:31:58.416205 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:58.416189 2572 scope.go:117] "RemoveContainer" containerID="8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6"
Apr 17 14:31:58.416454 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:31:58.416435 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6\": container with ID starting with 8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6 not found: ID does not exist" containerID="8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6"
Apr 17 14:31:58.416524 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:58.416465 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6"} err="failed to get container status \"8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6\": rpc error: code = NotFound desc = could not find container \"8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6\": container with ID starting with 8270a1952f0f3f46c6a277d7765abb6ea7a8a7cbec84b4425d4b99c74010c7e6 not found: ID does not exist"
Apr 17 14:31:58.427046 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:58.427024 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7484dcb6dc-hbxnz"]
Apr 17 14:31:58.430752 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:31:58.430732 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7484dcb6dc-hbxnz"]
Apr 17 14:32:00.402645 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:00.402606 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="443a3cc0-f629-418c-a415-debb7bbbe193" path="/var/lib/kubelet/pods/443a3cc0-f629-418c-a415-debb7bbbe193/volumes"
Apr 17 14:32:51.626713 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.626679 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"]
Apr 17 14:32:51.627157 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.626947 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="443a3cc0-f629-418c-a415-debb7bbbe193" containerName="console"
Apr 17 14:32:51.627157 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.626957 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="443a3cc0-f629-418c-a415-debb7bbbe193" containerName="console"
Apr 17 14:32:51.627157 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.627034 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="443a3cc0-f629-418c-a415-debb7bbbe193" containerName="console"
Apr 17 14:32:51.630008 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.629968 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:32:51.632489 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.632466 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 14:32:51.633560 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.633544 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7vhgb\""
Apr 17 14:32:51.633628 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.633544 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 14:32:51.641559 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.641534 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"]
Apr 17 14:32:51.796912 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.796872 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:32:51.796912 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.796930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc6zr\" (UniqueName: \"kubernetes.io/projected/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-kube-api-access-tc6zr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:32:51.797165 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.796957 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:32:51.898093 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.898004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:32:51.898093 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.898042 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc6zr\" (UniqueName: \"kubernetes.io/projected/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-kube-api-access-tc6zr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:32:51.898093 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.898070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:32:51.898427 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.898411 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:32:51.898469 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.898409 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:32:51.906583 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.906550 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc6zr\" (UniqueName: \"kubernetes.io/projected/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-kube-api-access-tc6zr\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:32:51.939576 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:51.939530 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:32:52.054745 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:52.054709 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"]
Apr 17 14:32:52.057500 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:32:52.057470 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59a6b216_db75_4c8c_a0a0_aff2447fcb3d.slice/crio-ad5b5bbb7099a3df968904f188150f63c4e6ede12ac49249707540ab58760236 WatchSource:0}: Error finding container ad5b5bbb7099a3df968904f188150f63c4e6ede12ac49249707540ab58760236: Status 404 returned error can't find the container with id ad5b5bbb7099a3df968904f188150f63c4e6ede12ac49249707540ab58760236
Apr 17 14:32:52.547271 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:52.547234 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w" event={"ID":"59a6b216-db75-4c8c-a0a0-aff2447fcb3d","Type":"ContainerStarted","Data":"ad5b5bbb7099a3df968904f188150f63c4e6ede12ac49249707540ab58760236"}
Apr 17 14:32:57.563582 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:57.563551 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w" event={"ID":"59a6b216-db75-4c8c-a0a0-aff2447fcb3d","Type":"ContainerStarted","Data":"3ca358c6c09110aee650f1c6190a9ff653db4ed703ffbbb3d22ef498166f56d8"}
Apr 17 14:32:58.566907 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:58.566872 2572 generic.go:358] "Generic (PLEG): container finished" podID="59a6b216-db75-4c8c-a0a0-aff2447fcb3d" containerID="3ca358c6c09110aee650f1c6190a9ff653db4ed703ffbbb3d22ef498166f56d8" exitCode=0
Apr 17 14:32:58.567345 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:32:58.566954 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w" event={"ID":"59a6b216-db75-4c8c-a0a0-aff2447fcb3d","Type":"ContainerDied","Data":"3ca358c6c09110aee650f1c6190a9ff653db4ed703ffbbb3d22ef498166f56d8"}
Apr 17 14:33:01.576661 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:01.576624 2572 generic.go:358] "Generic (PLEG): container finished" podID="59a6b216-db75-4c8c-a0a0-aff2447fcb3d" containerID="0ee730754104fc80e9804e08f9e9e000489bb797d8072d2d25c2fceb98bcbdb6" exitCode=0
Apr 17 14:33:01.577062 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:01.576700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w" event={"ID":"59a6b216-db75-4c8c-a0a0-aff2447fcb3d","Type":"ContainerDied","Data":"0ee730754104fc80e9804e08f9e9e000489bb797d8072d2d25c2fceb98bcbdb6"}
Apr 17 14:33:08.597689 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:08.597654 2572 generic.go:358] "Generic (PLEG): container finished" podID="59a6b216-db75-4c8c-a0a0-aff2447fcb3d" containerID="c737d89851ae89fbebda72eeb94f284ed5c29de3f3787419e5841ef3b56befbc" exitCode=0
Apr 17 14:33:08.598044 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:08.597703 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w" event={"ID":"59a6b216-db75-4c8c-a0a0-aff2447fcb3d","Type":"ContainerDied","Data":"c737d89851ae89fbebda72eeb94f284ed5c29de3f3787419e5841ef3b56befbc"}
Apr 17 14:33:09.716716 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:09.716695 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:33:09.741919 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:09.741890 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-util\") pod \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") "
Apr 17 14:33:09.742122 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:09.741931 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-bundle\") pod \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") "
Apr 17 14:33:09.742122 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:09.741999 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc6zr\" (UniqueName: \"kubernetes.io/projected/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-kube-api-access-tc6zr\") pod \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\" (UID: \"59a6b216-db75-4c8c-a0a0-aff2447fcb3d\") "
Apr 17 14:33:09.742532 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:09.742500 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-bundle" (OuterVolumeSpecName: "bundle") pod "59a6b216-db75-4c8c-a0a0-aff2447fcb3d" (UID: "59a6b216-db75-4c8c-a0a0-aff2447fcb3d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:33:09.744172 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:09.744146 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-kube-api-access-tc6zr" (OuterVolumeSpecName: "kube-api-access-tc6zr") pod "59a6b216-db75-4c8c-a0a0-aff2447fcb3d" (UID: "59a6b216-db75-4c8c-a0a0-aff2447fcb3d"). InnerVolumeSpecName "kube-api-access-tc6zr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:33:09.746453 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:09.746428 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-util" (OuterVolumeSpecName: "util") pod "59a6b216-db75-4c8c-a0a0-aff2447fcb3d" (UID: "59a6b216-db75-4c8c-a0a0-aff2447fcb3d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:33:09.843103 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:09.843068 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-util\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:33:09.843103 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:09.843097 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-bundle\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:33:09.843103 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:09.843108 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tc6zr\" (UniqueName: \"kubernetes.io/projected/59a6b216-db75-4c8c-a0a0-aff2447fcb3d-kube-api-access-tc6zr\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:33:10.605270 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:10.605236 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w" event={"ID":"59a6b216-db75-4c8c-a0a0-aff2447fcb3d","Type":"ContainerDied","Data":"ad5b5bbb7099a3df968904f188150f63c4e6ede12ac49249707540ab58760236"}
Apr 17 14:33:10.605270 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:10.605273 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad5b5bbb7099a3df968904f188150f63c4e6ede12ac49249707540ab58760236"
Apr 17 14:33:10.605470 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:10.605280 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5lcd4w"
Apr 17 14:33:13.749013 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.748964 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z"]
Apr 17 14:33:13.749473 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.749243 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59a6b216-db75-4c8c-a0a0-aff2447fcb3d" containerName="util"
Apr 17 14:33:13.749473 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.749255 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a6b216-db75-4c8c-a0a0-aff2447fcb3d" containerName="util"
Apr 17 14:33:13.749473 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.749263 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59a6b216-db75-4c8c-a0a0-aff2447fcb3d" containerName="extract"
Apr 17 14:33:13.749473 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.749269 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a6b216-db75-4c8c-a0a0-aff2447fcb3d" containerName="extract"
Apr 17 14:33:13.749473 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.749289 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59a6b216-db75-4c8c-a0a0-aff2447fcb3d" containerName="pull"
Apr 17 14:33:13.749473 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.749295 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a6b216-db75-4c8c-a0a0-aff2447fcb3d" containerName="pull"
Apr 17 14:33:13.749473 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.749335 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="59a6b216-db75-4c8c-a0a0-aff2447fcb3d" containerName="extract"
Apr 17 14:33:13.782818 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.782792 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z"]
Apr 17 14:33:13.782967 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.782898 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z"
Apr 17 14:33:13.785609 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.785587 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:33:13.785745 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.785668 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-2q96m\""
Apr 17 14:33:13.785745 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.785735 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 17 14:33:13.872206 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.872174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b22fd5a-1a46-446e-9b26-8dcf33e078e2-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-8vf4z\" (UID: \"9b22fd5a-1a46-446e-9b26-8dcf33e078e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z"
Apr 17 14:33:13.872387 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.872253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gh8f\" (UniqueName: \"kubernetes.io/projected/9b22fd5a-1a46-446e-9b26-8dcf33e078e2-kube-api-access-7gh8f\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-8vf4z\" (UID: \"9b22fd5a-1a46-446e-9b26-8dcf33e078e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z"
Apr 17 14:33:13.973931 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.973881 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b22fd5a-1a46-446e-9b26-8dcf33e078e2-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-8vf4z\" (UID: \"9b22fd5a-1a46-446e-9b26-8dcf33e078e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z"
Apr 17 14:33:13.974292 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.974268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gh8f\" (UniqueName: \"kubernetes.io/projected/9b22fd5a-1a46-446e-9b26-8dcf33e078e2-kube-api-access-7gh8f\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-8vf4z\" (UID: \"9b22fd5a-1a46-446e-9b26-8dcf33e078e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z"
Apr 17 14:33:13.974448 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.974305 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b22fd5a-1a46-446e-9b26-8dcf33e078e2-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-8vf4z\" (UID: \"9b22fd5a-1a46-446e-9b26-8dcf33e078e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z"
Apr 17 14:33:13.982866 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:13.982842 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gh8f\" (UniqueName: \"kubernetes.io/projected/9b22fd5a-1a46-446e-9b26-8dcf33e078e2-kube-api-access-7gh8f\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-8vf4z\" (UID: \"9b22fd5a-1a46-446e-9b26-8dcf33e078e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z"
Apr 17 14:33:14.092119 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:14.092070 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z"
Apr 17 14:33:14.210034 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:14.210002 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z"]
Apr 17 14:33:14.213963 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:33:14.213935 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b22fd5a_1a46_446e_9b26_8dcf33e078e2.slice/crio-3be12fe23f3ad942156965523c36746ff4b417675b01ff976e4b0b36619ec5dc WatchSource:0}: Error finding container 3be12fe23f3ad942156965523c36746ff4b417675b01ff976e4b0b36619ec5dc: Status 404 returned error can't find the container with id 3be12fe23f3ad942156965523c36746ff4b417675b01ff976e4b0b36619ec5dc
Apr 17 14:33:14.617149 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:14.617110 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z" event={"ID":"9b22fd5a-1a46-446e-9b26-8dcf33e078e2","Type":"ContainerStarted","Data":"3be12fe23f3ad942156965523c36746ff4b417675b01ff976e4b0b36619ec5dc"}
Apr 17 14:33:17.628999 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:17.628933 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z" event={"ID":"9b22fd5a-1a46-446e-9b26-8dcf33e078e2","Type":"ContainerStarted","Data":"ea621b5b1901b0c08561fc3080e0f3913ea5db0121e637fc33d3cc4f5544932d"}
Apr 17 14:33:17.647689 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:17.647630 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-8vf4z" podStartSLOduration=2.045587192 podStartE2EDuration="4.647612662s" podCreationTimestamp="2026-04-17 14:33:13 +0000 UTC" firstStartedPulling="2026-04-17 14:33:14.216509769 +0000 UTC m=+430.408108870" lastFinishedPulling="2026-04-17 14:33:16.818535241 +0000 UTC m=+433.010134340" observedRunningTime="2026-04-17 14:33:17.645434805 +0000 UTC m=+433.837033926" watchObservedRunningTime="2026-04-17 14:33:17.647612662 +0000 UTC m=+433.839211784"
Apr 17 14:33:20.998853 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:20.998814 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-td8sv"]
Apr 17 14:33:21.001290 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.001270 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv"
Apr 17 14:33:21.003841 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.003809 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 14:33:21.003964 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.003847 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 14:33:21.004750 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.004736 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-jmvcp\""
Apr 17 14:33:21.012572 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.012550 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-td8sv"]
Apr 17 14:33:21.031030 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.030993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpqsr\" (UniqueName: \"kubernetes.io/projected/9371365a-b0a3-41cb-92f2-4c54ac194289-kube-api-access-xpqsr\") pod \"cert-manager-webhook-597b96b99b-td8sv\" (UID: \"9371365a-b0a3-41cb-92f2-4c54ac194289\") " pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv"
Apr 17 14:33:21.031188 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.031037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9371365a-b0a3-41cb-92f2-4c54ac194289-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-td8sv\" (UID: \"9371365a-b0a3-41cb-92f2-4c54ac194289\") " pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv"
Apr 17 14:33:21.131999 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.131942 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpqsr\" (UniqueName: \"kubernetes.io/projected/9371365a-b0a3-41cb-92f2-4c54ac194289-kube-api-access-xpqsr\") pod \"cert-manager-webhook-597b96b99b-td8sv\" (UID: \"9371365a-b0a3-41cb-92f2-4c54ac194289\") " pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv"
Apr 17 14:33:21.132158 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.132018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9371365a-b0a3-41cb-92f2-4c54ac194289-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-td8sv\" (UID: \"9371365a-b0a3-41cb-92f2-4c54ac194289\") " pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv"
Apr 17 14:33:21.139554 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.139521 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9371365a-b0a3-41cb-92f2-4c54ac194289-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-td8sv\" (UID: \"9371365a-b0a3-41cb-92f2-4c54ac194289\") " pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv"
Apr 17 14:33:21.139661 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.139635 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpqsr\" (UniqueName: \"kubernetes.io/projected/9371365a-b0a3-41cb-92f2-4c54ac194289-kube-api-access-xpqsr\") pod \"cert-manager-webhook-597b96b99b-td8sv\" (UID: \"9371365a-b0a3-41cb-92f2-4c54ac194289\") " pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv"
Apr 17 14:33:21.325954 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.325861 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv"
Apr 17 14:33:21.447097 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.447032 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-td8sv"]
Apr 17 14:33:21.449792 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:33:21.449761 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9371365a_b0a3_41cb_92f2_4c54ac194289.slice/crio-f9d89e1045441923ab0014d57dea05ed566604f2f093b3e6464dc229c782ce4c WatchSource:0}: Error finding container f9d89e1045441923ab0014d57dea05ed566604f2f093b3e6464dc229c782ce4c: Status 404 returned error can't find the container with id f9d89e1045441923ab0014d57dea05ed566604f2f093b3e6464dc229c782ce4c
Apr 17 14:33:21.644052 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:21.644015 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv" event={"ID":"9371365a-b0a3-41cb-92f2-4c54ac194289","Type":"ContainerStarted","Data":"f9d89e1045441923ab0014d57dea05ed566604f2f093b3e6464dc229c782ce4c"}
Apr 17 14:33:24.654335 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:24.654300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv" event={"ID":"9371365a-b0a3-41cb-92f2-4c54ac194289","Type":"ContainerStarted","Data":"f2a52b92fe056a3bbd1def2d44dc6a326ad726a33cc4575efb23f03ea50dfdbc"}
Apr 17 14:33:24.654748 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:24.654351 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv"
Apr 17 14:33:24.668885 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:24.668837 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv" podStartSLOduration=2.10787435 podStartE2EDuration="4.668818556s" podCreationTimestamp="2026-04-17 14:33:20 +0000 UTC" firstStartedPulling="2026-04-17 14:33:21.452075099 +0000 UTC m=+437.643674197" lastFinishedPulling="2026-04-17 14:33:24.013019302 +0000 UTC m=+440.204618403" observedRunningTime="2026-04-17 14:33:24.667837389 +0000 UTC m=+440.859436545" watchObservedRunningTime="2026-04-17 14:33:24.668818556 +0000 UTC m=+440.860417676"
Apr 17 14:33:30.659642 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:30.659606 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-td8sv"
Apr 17 14:33:30.691994 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:30.691947 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-wcrsk"]
Apr 17 14:33:30.694312 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:30.694291 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-wcrsk"
Apr 17 14:33:30.696522 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:30.696503 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-wmxc2\""
Apr 17 14:33:30.704481 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:30.704454 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-wcrsk"]
Apr 17 14:33:30.809692 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:30.809652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqnk4\" (UniqueName: \"kubernetes.io/projected/d7e6fd00-4546-4834-9d78-fc7b1314434d-kube-api-access-dqnk4\") pod \"cert-manager-759f64656b-wcrsk\" (UID: \"d7e6fd00-4546-4834-9d78-fc7b1314434d\") " pod="cert-manager/cert-manager-759f64656b-wcrsk"
Apr 17 14:33:30.809869 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:30.809709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7e6fd00-4546-4834-9d78-fc7b1314434d-bound-sa-token\") pod \"cert-manager-759f64656b-wcrsk\" (UID: \"d7e6fd00-4546-4834-9d78-fc7b1314434d\") " pod="cert-manager/cert-manager-759f64656b-wcrsk"
Apr 17 14:33:30.910951 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:30.910868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqnk4\" (UniqueName: \"kubernetes.io/projected/d7e6fd00-4546-4834-9d78-fc7b1314434d-kube-api-access-dqnk4\") pod \"cert-manager-759f64656b-wcrsk\" (UID: \"d7e6fd00-4546-4834-9d78-fc7b1314434d\") " pod="cert-manager/cert-manager-759f64656b-wcrsk"
Apr 17 14:33:30.910951 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:30.910921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7e6fd00-4546-4834-9d78-fc7b1314434d-bound-sa-token\") pod \"cert-manager-759f64656b-wcrsk\" (UID: \"d7e6fd00-4546-4834-9d78-fc7b1314434d\") " pod="cert-manager/cert-manager-759f64656b-wcrsk"
Apr 17 14:33:30.918870 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:30.918833 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7e6fd00-4546-4834-9d78-fc7b1314434d-bound-sa-token\") pod \"cert-manager-759f64656b-wcrsk\" (UID: \"d7e6fd00-4546-4834-9d78-fc7b1314434d\") " pod="cert-manager/cert-manager-759f64656b-wcrsk"
Apr 17 14:33:30.919020 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:30.918882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqnk4\" (UniqueName: \"kubernetes.io/projected/d7e6fd00-4546-4834-9d78-fc7b1314434d-kube-api-access-dqnk4\") pod \"cert-manager-759f64656b-wcrsk\" (UID: \"d7e6fd00-4546-4834-9d78-fc7b1314434d\") " pod="cert-manager/cert-manager-759f64656b-wcrsk"
Apr 17 14:33:31.003603 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:31.003569 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-wcrsk"
Apr 17 14:33:31.115000 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:31.114932 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-wcrsk"]
Apr 17 14:33:31.117599 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:33:31.117571 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7e6fd00_4546_4834_9d78_fc7b1314434d.slice/crio-81ba23aded88dbbc3d5adff6676913bbbb6bb2ee803e43d50b41bbf400083cd6 WatchSource:0}: Error finding container 81ba23aded88dbbc3d5adff6676913bbbb6bb2ee803e43d50b41bbf400083cd6: Status 404 returned error can't find the container with id 81ba23aded88dbbc3d5adff6676913bbbb6bb2ee803e43d50b41bbf400083cd6
Apr 17 14:33:31.675988 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:31.675951 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-wcrsk" event={"ID":"d7e6fd00-4546-4834-9d78-fc7b1314434d","Type":"ContainerStarted","Data":"f43dea59912425df16d228ca1a9389b7966ad0dbd385a8697e5096bfbbad7759"}
Apr 17 14:33:31.676530 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:31.675999 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-wcrsk" event={"ID":"d7e6fd00-4546-4834-9d78-fc7b1314434d","Type":"ContainerStarted","Data":"81ba23aded88dbbc3d5adff6676913bbbb6bb2ee803e43d50b41bbf400083cd6"}
Apr 17 14:33:31.690453 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:31.690405 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-wcrsk" podStartSLOduration=1.690388321 podStartE2EDuration="1.690388321s" podCreationTimestamp="2026-04-17 14:33:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:33:31.690054897 +0000 UTC m=+447.881654016" watchObservedRunningTime="2026-04-17 14:33:31.690388321 +0000 UTC m=+447.881987443"
Apr 17 14:33:32.016627 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.016538 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"]
Apr 17 14:33:32.019139 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.019116 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:32.021373 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.021353 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7vhgb\""
Apr 17 14:33:32.021480 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.021378 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 14:33:32.021480 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.021400 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 14:33:32.029089 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.029060 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"]
Apr 17 14:33:32.120117 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.120081 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfgrl\" (UniqueName: \"kubernetes.io/projected/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-kube-api-access-vfgrl\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:32.120319 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.120132 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:32.120319 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.120193 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:32.221155 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.221115 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfgrl\" (UniqueName: \"kubernetes.io/projected/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-kube-api-access-vfgrl\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:32.221319 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.221182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:32.221319 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.221229 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:32.221560 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.221537 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:32.221598 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.221572 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:32.229717 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.229689 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfgrl\" (UniqueName: \"kubernetes.io/projected/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-kube-api-access-vfgrl\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:32.329050 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.329021 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:32.446698 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.446663 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"]
Apr 17 14:33:32.449792 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:33:32.449766 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42072b0d_4987_4f4d_bae5_b2c529e6f6e5.slice/crio-00359ef4f98bc7789b347c35136c27b01e4c128cd5310de7d59c332c2225892b WatchSource:0}: Error finding container 00359ef4f98bc7789b347c35136c27b01e4c128cd5310de7d59c332c2225892b: Status 404 returned error can't find the container with id 00359ef4f98bc7789b347c35136c27b01e4c128cd5310de7d59c332c2225892b
Apr 17 14:33:32.680591 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.680501 2572 generic.go:358] "Generic (PLEG): container finished" podID="42072b0d-4987-4f4d-bae5-b2c529e6f6e5" containerID="59f36584e963e5e4d2121dc1c6d678dc1a9fedc78cc7650e6c8f8be75860400d" exitCode=0
Apr 17 14:33:32.680948 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.680588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p" event={"ID":"42072b0d-4987-4f4d-bae5-b2c529e6f6e5","Type":"ContainerDied","Data":"59f36584e963e5e4d2121dc1c6d678dc1a9fedc78cc7650e6c8f8be75860400d"}
Apr 17 14:33:32.680948 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:32.680622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p" event={"ID":"42072b0d-4987-4f4d-bae5-b2c529e6f6e5","Type":"ContainerStarted","Data":"00359ef4f98bc7789b347c35136c27b01e4c128cd5310de7d59c332c2225892b"}
Apr 17 14:33:35.693281 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:35.693242 2572 generic.go:358] "Generic (PLEG): container finished" podID="42072b0d-4987-4f4d-bae5-b2c529e6f6e5" containerID="49a1db69f2612eb1fd2d02162be93c8aef49e0691f61c878c0c792c2153bdc65" exitCode=0
Apr 17 14:33:35.693644 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:35.693320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p" event={"ID":"42072b0d-4987-4f4d-bae5-b2c529e6f6e5","Type":"ContainerDied","Data":"49a1db69f2612eb1fd2d02162be93c8aef49e0691f61c878c0c792c2153bdc65"}
Apr 17 14:33:36.698119 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:36.698082 2572 generic.go:358] "Generic (PLEG): container finished" podID="42072b0d-4987-4f4d-bae5-b2c529e6f6e5" containerID="aacd6e633687956ebe0a46746390d8a381f533e36f041eab671f096bfce6632f" exitCode=0
Apr 17 14:33:36.698499 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:36.698131 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p" event={"ID":"42072b0d-4987-4f4d-bae5-b2c529e6f6e5","Type":"ContainerDied","Data":"aacd6e633687956ebe0a46746390d8a381f533e36f041eab671f096bfce6632f"}
Apr 17 14:33:37.817601 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:37.817576 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:37.864789 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:37.864761 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-bundle\") pod \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") "
Apr 17 14:33:37.864966 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:37.864810 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfgrl\" (UniqueName: \"kubernetes.io/projected/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-kube-api-access-vfgrl\") pod \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") "
Apr 17 14:33:37.864966 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:37.864847 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-util\") pod \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\" (UID: \"42072b0d-4987-4f4d-bae5-b2c529e6f6e5\") "
Apr 17 14:33:37.865205 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:37.865183 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-bundle" (OuterVolumeSpecName: "bundle") pod "42072b0d-4987-4f4d-bae5-b2c529e6f6e5" (UID: "42072b0d-4987-4f4d-bae5-b2c529e6f6e5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:33:37.866899 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:37.866877 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-kube-api-access-vfgrl" (OuterVolumeSpecName: "kube-api-access-vfgrl") pod "42072b0d-4987-4f4d-bae5-b2c529e6f6e5" (UID: "42072b0d-4987-4f4d-bae5-b2c529e6f6e5"). InnerVolumeSpecName "kube-api-access-vfgrl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:33:37.966369 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:37.966289 2572 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-bundle\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:33:37.966369 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:37.966318 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vfgrl\" (UniqueName: \"kubernetes.io/projected/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-kube-api-access-vfgrl\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:33:38.404288 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:38.404232 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-util" (OuterVolumeSpecName: "util") pod "42072b0d-4987-4f4d-bae5-b2c529e6f6e5" (UID: "42072b0d-4987-4f4d-bae5-b2c529e6f6e5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:33:38.472379 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:38.472346 2572 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42072b0d-4987-4f4d-bae5-b2c529e6f6e5-util\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\""
Apr 17 14:33:38.705938 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:38.705862 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p"
Apr 17 14:33:38.706110 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:38.705860 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78ezmq7p" event={"ID":"42072b0d-4987-4f4d-bae5-b2c529e6f6e5","Type":"ContainerDied","Data":"00359ef4f98bc7789b347c35136c27b01e4c128cd5310de7d59c332c2225892b"}
Apr 17 14:33:38.706110 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:33:38.705962 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00359ef4f98bc7789b347c35136c27b01e4c128cd5310de7d59c332c2225892b"
Apr 17 14:34:16.970429 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.970391 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"]
Apr 17 14:34:16.970897 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.970678 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42072b0d-4987-4f4d-bae5-b2c529e6f6e5" containerName="util"
Apr 17 14:34:16.970897 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.970688 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="42072b0d-4987-4f4d-bae5-b2c529e6f6e5" containerName="util"
Apr 17 14:34:16.970897 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.970698 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42072b0d-4987-4f4d-bae5-b2c529e6f6e5" containerName="pull"
Apr 17 14:34:16.970897 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.970703 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="42072b0d-4987-4f4d-bae5-b2c529e6f6e5" containerName="pull"
Apr 17 14:34:16.970897 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.970714 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42072b0d-4987-4f4d-bae5-b2c529e6f6e5" containerName="extract"
Apr 17 14:34:16.970897 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.970719 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="42072b0d-4987-4f4d-bae5-b2c529e6f6e5" containerName="extract"
Apr 17 14:34:16.970897 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.970772 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="42072b0d-4987-4f4d-bae5-b2c529e6f6e5" containerName="extract"
Apr 17 14:34:16.972942 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.972914 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:16.973250 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.973221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/5293946a-fc61-420d-927f-333ff1e97ba3-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-88gdk\" (UID: \"5293946a-fc61-420d-927f-333ff1e97ba3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:16.973339 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.973297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m52kn\" (UniqueName: \"kubernetes.io/projected/5293946a-fc61-420d-927f-333ff1e97ba3-kube-api-access-m52kn\") pod \"kubeflow-trainer-controller-manager-55f5694779-88gdk\" (UID: \"5293946a-fc61-420d-927f-333ff1e97ba3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:16.973389 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.973349 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5293946a-fc61-420d-927f-333ff1e97ba3-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-88gdk\" (UID: \"5293946a-fc61-420d-927f-333ff1e97ba3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:16.975304 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.975283 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 14:34:16.975404 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.975285 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\""
Apr 17 14:34:16.975404 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.975382 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\""
Apr 17 14:34:16.976337 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.976319 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 14:34:16.976471 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.976453 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-52552\""
Apr 17 14:34:16.985462 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:16.985419 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"]
Apr 17 14:34:17.074565 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:17.074528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m52kn\" (UniqueName: \"kubernetes.io/projected/5293946a-fc61-420d-927f-333ff1e97ba3-kube-api-access-m52kn\") pod \"kubeflow-trainer-controller-manager-55f5694779-88gdk\" (UID: \"5293946a-fc61-420d-927f-333ff1e97ba3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:17.074746 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:17.074573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5293946a-fc61-420d-927f-333ff1e97ba3-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-88gdk\" (UID: \"5293946a-fc61-420d-927f-333ff1e97ba3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:17.074746 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:17.074609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/5293946a-fc61-420d-927f-333ff1e97ba3-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-88gdk\" (UID: \"5293946a-fc61-420d-927f-333ff1e97ba3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:17.075171 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:17.075151 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/5293946a-fc61-420d-927f-333ff1e97ba3-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-88gdk\" (UID: \"5293946a-fc61-420d-927f-333ff1e97ba3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:17.076866 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:17.076845 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5293946a-fc61-420d-927f-333ff1e97ba3-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-88gdk\" (UID: \"5293946a-fc61-420d-927f-333ff1e97ba3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:17.082668 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:17.082646 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m52kn\" (UniqueName: \"kubernetes.io/projected/5293946a-fc61-420d-927f-333ff1e97ba3-kube-api-access-m52kn\") pod \"kubeflow-trainer-controller-manager-55f5694779-88gdk\" (UID: \"5293946a-fc61-420d-927f-333ff1e97ba3\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:17.285279 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:17.285191 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:17.405660 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:17.405630 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"]
Apr 17 14:34:17.408835 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:34:17.408806 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5293946a_fc61_420d_927f_333ff1e97ba3.slice/crio-c4621cfa4f09e1a3b5191df9fd13b1d3c653636fb1650eb73241a2aa36828d9d WatchSource:0}: Error finding container c4621cfa4f09e1a3b5191df9fd13b1d3c653636fb1650eb73241a2aa36828d9d: Status 404 returned error can't find the container with id c4621cfa4f09e1a3b5191df9fd13b1d3c653636fb1650eb73241a2aa36828d9d
Apr 17 14:34:17.831729 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:17.831689 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk" event={"ID":"5293946a-fc61-420d-927f-333ff1e97ba3","Type":"ContainerStarted","Data":"c4621cfa4f09e1a3b5191df9fd13b1d3c653636fb1650eb73241a2aa36828d9d"}
Apr 17 14:34:19.840336 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:19.840287 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk" event={"ID":"5293946a-fc61-420d-927f-333ff1e97ba3","Type":"ContainerStarted","Data":"f8b58fd6bd76039dff2a93140c5f77bd53b25399bbe768c00227b86b6dc496c8"}
Apr 17 14:34:19.840779 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:19.840455 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:34:19.855431 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:19.855377 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk" podStartSLOduration=1.629952233 podStartE2EDuration="3.855363721s" podCreationTimestamp="2026-04-17 14:34:16 +0000 UTC" firstStartedPulling="2026-04-17 14:34:17.410460657 +0000 UTC m=+493.602059754" lastFinishedPulling="2026-04-17 14:34:19.635872144 +0000 UTC m=+495.827471242" observedRunningTime="2026-04-17 14:34:19.853427818 +0000 UTC m=+496.045026939" watchObservedRunningTime="2026-04-17 14:34:19.855363721 +0000 UTC m=+496.046963161"
Apr 17 14:34:35.849390 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:34:35.849355 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-88gdk"
Apr 17 14:35:04.298001 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:04.297910 2572 scope.go:117] "RemoveContainer" containerID="116e34a911231f3fe88af22c42cef216a06c0cd9137ff6e4cadf2e69d8546f0e"
Apr 17 14:35:17.593642 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.593608 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c58886c8c-r75dc"]
Apr 17 14:35:17.596469 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.596446 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c58886c8c-r75dc"
Apr 17 14:35:17.607597 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.607574 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c58886c8c-r75dc"]
Apr 17 14:35:17.744204 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.744167 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-trusted-ca-bundle\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc"
Apr 17 14:35:17.744204 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.744204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-console-serving-cert\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc"
Apr 17 14:35:17.744411 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.744228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-service-ca\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc"
Apr 17 14:35:17.744411 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.744348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-oauth-serving-cert\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc"
Apr 17 14:35:17.744411 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.744387 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-console-oauth-config\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc"
Apr 17 14:35:17.744515 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.744421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbqjw\" (UniqueName: \"kubernetes.io/projected/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-kube-api-access-lbqjw\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc"
Apr 17 14:35:17.744515 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.744449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-console-config\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc"
Apr 17 14:35:17.845165 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.845077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-service-ca\") pod
\"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.845165 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.845144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-oauth-serving-cert\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.845366 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.845182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-console-oauth-config\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.845366 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.845219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbqjw\" (UniqueName: \"kubernetes.io/projected/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-kube-api-access-lbqjw\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.845366 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.845255 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-console-config\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.845366 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.845290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-trusted-ca-bundle\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.845366 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.845316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-console-serving-cert\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.845846 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.845823 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-service-ca\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.845994 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.845912 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-oauth-serving-cert\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.846058 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.846039 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-console-config\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.846261 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.846236 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-trusted-ca-bundle\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.847742 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.847716 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-console-oauth-config\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.847882 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.847866 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-console-serving-cert\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.854229 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.854211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbqjw\" (UniqueName: \"kubernetes.io/projected/b7b27802-a2a3-4521-ae52-cd1e1ddc60d7-kube-api-access-lbqjw\") pod \"console-c58886c8c-r75dc\" (UID: \"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7\") " pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:17.905720 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:17.905688 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:18.025897 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:18.025865 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c58886c8c-r75dc"] Apr 17 14:35:18.029015 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:35:18.028969 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7b27802_a2a3_4521_ae52_cd1e1ddc60d7.slice/crio-da5fa452c22f99d80ee816adb7b005a8667ce62e2b1d7323bcb640692c537f89 WatchSource:0}: Error finding container da5fa452c22f99d80ee816adb7b005a8667ce62e2b1d7323bcb640692c537f89: Status 404 returned error can't find the container with id da5fa452c22f99d80ee816adb7b005a8667ce62e2b1d7323bcb640692c537f89 Apr 17 14:35:19.032868 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:19.032830 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c58886c8c-r75dc" event={"ID":"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7","Type":"ContainerStarted","Data":"dad783bef9eae60cbc332ace75d7bd8145aa9806086bbcf46d4a0e84eefa02f1"} Apr 17 14:35:19.032868 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:19.032869 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c58886c8c-r75dc" event={"ID":"b7b27802-a2a3-4521-ae52-cd1e1ddc60d7","Type":"ContainerStarted","Data":"da5fa452c22f99d80ee816adb7b005a8667ce62e2b1d7323bcb640692c537f89"} Apr 17 14:35:19.051110 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:19.051058 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c58886c8c-r75dc" podStartSLOduration=2.051044939 podStartE2EDuration="2.051044939s" podCreationTimestamp="2026-04-17 14:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:35:19.048445601 +0000 UTC m=+555.240044718" watchObservedRunningTime="2026-04-17 14:35:19.051044939 +0000 UTC m=+555.242644060" Apr 17 14:35:27.906322 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:27.906287 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:27.906772 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:27.906565 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:27.911357 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:27.911334 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:28.065160 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:28.065132 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c58886c8c-r75dc" Apr 17 14:35:28.108880 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:28.108665 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-748fb4d8c9-sx52p"] Apr 17 14:35:53.136523 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.136479 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-748fb4d8c9-sx52p" podUID="8ae194f9-896b-4c8e-af2a-31c1d1247127" containerName="console" containerID="cri-o://010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568" gracePeriod=15 Apr 17 14:35:53.380964 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.380944 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-748fb4d8c9-sx52p_8ae194f9-896b-4c8e-af2a-31c1d1247127/console/0.log" Apr 17 14:35:53.381086 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.381016 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:35:53.434201 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.434130 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-serving-cert\") pod \"8ae194f9-896b-4c8e-af2a-31c1d1247127\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " Apr 17 14:35:53.434201 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.434166 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-oauth-serving-cert\") pod \"8ae194f9-896b-4c8e-af2a-31c1d1247127\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " Apr 17 14:35:53.434201 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.434196 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-config\") pod \"8ae194f9-896b-4c8e-af2a-31c1d1247127\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " Apr 17 14:35:53.434449 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.434234 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j89jc\" (UniqueName: \"kubernetes.io/projected/8ae194f9-896b-4c8e-af2a-31c1d1247127-kube-api-access-j89jc\") pod \"8ae194f9-896b-4c8e-af2a-31c1d1247127\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " Apr 17 14:35:53.434449 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.434299 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-oauth-config\") pod \"8ae194f9-896b-4c8e-af2a-31c1d1247127\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " Apr 17 14:35:53.434449 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.434322 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-service-ca\") pod \"8ae194f9-896b-4c8e-af2a-31c1d1247127\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " Apr 17 14:35:53.434449 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.434346 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-trusted-ca-bundle\") pod \"8ae194f9-896b-4c8e-af2a-31c1d1247127\" (UID: \"8ae194f9-896b-4c8e-af2a-31c1d1247127\") " Apr 17 14:35:53.434811 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.434722 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8ae194f9-896b-4c8e-af2a-31c1d1247127" (UID: "8ae194f9-896b-4c8e-af2a-31c1d1247127"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:35:53.434811 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.434725 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-config" (OuterVolumeSpecName: "console-config") pod "8ae194f9-896b-4c8e-af2a-31c1d1247127" (UID: "8ae194f9-896b-4c8e-af2a-31c1d1247127"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:35:53.434890 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.434796 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-service-ca" (OuterVolumeSpecName: "service-ca") pod "8ae194f9-896b-4c8e-af2a-31c1d1247127" (UID: "8ae194f9-896b-4c8e-af2a-31c1d1247127"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:35:53.434890 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.434838 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8ae194f9-896b-4c8e-af2a-31c1d1247127" (UID: "8ae194f9-896b-4c8e-af2a-31c1d1247127"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:35:53.436529 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.436503 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8ae194f9-896b-4c8e-af2a-31c1d1247127" (UID: "8ae194f9-896b-4c8e-af2a-31c1d1247127"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:35:53.436742 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.436716 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8ae194f9-896b-4c8e-af2a-31c1d1247127" (UID: "8ae194f9-896b-4c8e-af2a-31c1d1247127"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:35:53.436742 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.436727 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae194f9-896b-4c8e-af2a-31c1d1247127-kube-api-access-j89jc" (OuterVolumeSpecName: "kube-api-access-j89jc") pod "8ae194f9-896b-4c8e-af2a-31c1d1247127" (UID: "8ae194f9-896b-4c8e-af2a-31c1d1247127"). InnerVolumeSpecName "kube-api-access-j89jc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:35:53.535405 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.535377 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-serving-cert\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:35:53.535405 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.535401 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-oauth-serving-cert\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:35:53.535405 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.535411 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-config\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:35:53.535593 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.535421 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j89jc\" (UniqueName: \"kubernetes.io/projected/8ae194f9-896b-4c8e-af2a-31c1d1247127-kube-api-access-j89jc\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:35:53.535593 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.535431 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8ae194f9-896b-4c8e-af2a-31c1d1247127-console-oauth-config\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:35:53.535593 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.535439 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-service-ca\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:35:53.535593 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:53.535448 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae194f9-896b-4c8e-af2a-31c1d1247127-trusted-ca-bundle\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:35:54.142048 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:54.142016 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-748fb4d8c9-sx52p_8ae194f9-896b-4c8e-af2a-31c1d1247127/console/0.log" Apr 17 14:35:54.142445 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:54.142061 2572 generic.go:358] "Generic (PLEG): container finished" podID="8ae194f9-896b-4c8e-af2a-31c1d1247127" containerID="010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568" exitCode=2 Apr 17 14:35:54.142445 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:54.142117 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748fb4d8c9-sx52p" event={"ID":"8ae194f9-896b-4c8e-af2a-31c1d1247127","Type":"ContainerDied","Data":"010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568"} Apr 17 14:35:54.142445 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:54.142135 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-748fb4d8c9-sx52p" Apr 17 14:35:54.142445 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:54.142148 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748fb4d8c9-sx52p" event={"ID":"8ae194f9-896b-4c8e-af2a-31c1d1247127","Type":"ContainerDied","Data":"752b5c7e01b13d67bcd674038051a1b9b04f117ffc820556b17231ac1ca069c3"} Apr 17 14:35:54.142445 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:54.142166 2572 scope.go:117] "RemoveContainer" containerID="010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568" Apr 17 14:35:54.151148 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:54.151025 2572 scope.go:117] "RemoveContainer" containerID="010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568" Apr 17 14:35:54.151292 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:35:54.151273 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568\": container with ID starting with 010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568 not found: ID does not exist" containerID="010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568" Apr 17 14:35:54.151348 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:54.151302 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568"} err="failed to get container status \"010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568\": rpc error: code = NotFound desc = could not find container \"010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568\": container with ID starting with 010fab1ce564dbc89273ccb7d0bb43fd219d07fe962af8d9bd4497991c70c568 not found: ID does not exist" Apr 17 14:35:54.162593 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:54.162571 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-748fb4d8c9-sx52p"] Apr 17 14:35:54.165377 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:54.165356 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-748fb4d8c9-sx52p"] Apr 17 14:35:54.405232 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:35:54.405159 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ae194f9-896b-4c8e-af2a-31c1d1247127" path="/var/lib/kubelet/pods/8ae194f9-896b-4c8e-af2a-31c1d1247127/volumes" Apr 17 14:47:01.578622 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.578543 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd"] Apr 17 14:47:01.579094 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.578830 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ae194f9-896b-4c8e-af2a-31c1d1247127" containerName="console" Apr 17 14:47:01.579094 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.578839 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae194f9-896b-4c8e-af2a-31c1d1247127" containerName="console" Apr 17 14:47:01.579094 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.578896 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ae194f9-896b-4c8e-af2a-31c1d1247127" containerName="console" Apr 17 14:47:01.580757 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.580742 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" Apr 17 14:47:01.583325 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.583299 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-hxhcp\"/\"kube-root-ca.crt\"" Apr 17 14:47:01.583325 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.583318 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-hxhcp\"/\"default-dockercfg-dkrln\"" Apr 17 14:47:01.583499 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.583299 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-hxhcp\"/\"openshift-service-ca.crt\"" Apr 17 14:47:01.589624 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.589602 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd"] Apr 17 14:47:01.689368 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.689328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjjgt\" (UniqueName: \"kubernetes.io/projected/ee980a35-78fb-4bcb-9b58-e56c3a5aac13-kube-api-access-bjjgt\") pod \"test-trainjob-2d482-node-0-0-j4tjd\" (UID: \"ee980a35-78fb-4bcb-9b58-e56c3a5aac13\") " pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" Apr 17 14:47:01.790390 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.790355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjjgt\" (UniqueName: \"kubernetes.io/projected/ee980a35-78fb-4bcb-9b58-e56c3a5aac13-kube-api-access-bjjgt\") pod \"test-trainjob-2d482-node-0-0-j4tjd\" (UID: \"ee980a35-78fb-4bcb-9b58-e56c3a5aac13\") " pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" Apr 17 14:47:01.798382 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.798353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjjgt\" (UniqueName: \"kubernetes.io/projected/ee980a35-78fb-4bcb-9b58-e56c3a5aac13-kube-api-access-bjjgt\") pod \"test-trainjob-2d482-node-0-0-j4tjd\" (UID: \"ee980a35-78fb-4bcb-9b58-e56c3a5aac13\") " pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" Apr 17 14:47:01.891310 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:01.891228 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" Apr 17 14:47:02.013237 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:02.013145 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd"] Apr 17 14:47:02.015997 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:47:02.015952 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee980a35_78fb_4bcb_9b58_e56c3a5aac13.slice/crio-a15672b6c1f1a3d5a2181073fd3d9ef02688685862c1717b0efcb0be8ad5e3ca WatchSource:0}: Error finding container a15672b6c1f1a3d5a2181073fd3d9ef02688685862c1717b0efcb0be8ad5e3ca: Status 404 returned error can't find the container with id a15672b6c1f1a3d5a2181073fd3d9ef02688685862c1717b0efcb0be8ad5e3ca Apr 17 14:47:02.018422 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:02.018403 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:47:02.267637 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:47:02.267550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" event={"ID":"ee980a35-78fb-4bcb-9b58-e56c3a5aac13","Type":"ContainerStarted","Data":"a15672b6c1f1a3d5a2181073fd3d9ef02688685862c1717b0efcb0be8ad5e3ca"} Apr 17 14:54:00.668849 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:00.668814 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" event={"ID":"ee980a35-78fb-4bcb-9b58-e56c3a5aac13","Type":"ContainerStarted","Data":"d0e2ce5e915c8a1b75b2eff08d428a66c149d9f939cc9d217783799c4f49631b"} Apr 17 14:54:00.671699 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:00.671677 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-hxhcp\"/\"default-dockercfg-dkrln\"" Apr 17 14:54:00.693782 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:00.693727 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" podStartSLOduration=2.118521772 podStartE2EDuration="6m59.693711358s" podCreationTimestamp="2026-04-17 14:47:01 +0000 UTC" firstStartedPulling="2026-04-17 14:47:02.018589764 +0000 UTC m=+1258.210188866" lastFinishedPulling="2026-04-17 14:53:59.593779349 +0000 UTC m=+1675.785378452" observedRunningTime="2026-04-17 14:54:00.692440241 +0000 UTC m=+1676.884039361" watchObservedRunningTime="2026-04-17 14:54:00.693711358 +0000 UTC m=+1676.885310477" Apr 17 14:54:00.829469 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:00.829438 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-hxhcp\"/\"kube-root-ca.crt\"" Apr 17 14:54:00.839679 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:00.839655 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-hxhcp\"/\"openshift-service-ca.crt\"" Apr 17 14:54:03.681275 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:03.681242 2572 generic.go:358] "Generic (PLEG): container finished" podID="ee980a35-78fb-4bcb-9b58-e56c3a5aac13" containerID="d0e2ce5e915c8a1b75b2eff08d428a66c149d9f939cc9d217783799c4f49631b" exitCode=0 Apr 17 14:54:03.681683 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:03.681295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" 
event={"ID":"ee980a35-78fb-4bcb-9b58-e56c3a5aac13","Type":"ContainerDied","Data":"d0e2ce5e915c8a1b75b2eff08d428a66c149d9f939cc9d217783799c4f49631b"} Apr 17 14:54:04.869570 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:04.869548 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" Apr 17 14:54:04.962900 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:04.962819 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjjgt\" (UniqueName: \"kubernetes.io/projected/ee980a35-78fb-4bcb-9b58-e56c3a5aac13-kube-api-access-bjjgt\") pod \"ee980a35-78fb-4bcb-9b58-e56c3a5aac13\" (UID: \"ee980a35-78fb-4bcb-9b58-e56c3a5aac13\") " Apr 17 14:54:04.965267 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:04.965237 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee980a35-78fb-4bcb-9b58-e56c3a5aac13-kube-api-access-bjjgt" (OuterVolumeSpecName: "kube-api-access-bjjgt") pod "ee980a35-78fb-4bcb-9b58-e56c3a5aac13" (UID: "ee980a35-78fb-4bcb-9b58-e56c3a5aac13"). InnerVolumeSpecName "kube-api-access-bjjgt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:54:05.063485 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:05.063440 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bjjgt\" (UniqueName: \"kubernetes.io/projected/ee980a35-78fb-4bcb-9b58-e56c3a5aac13-kube-api-access-bjjgt\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 14:54:05.688906 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:05.688873 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" event={"ID":"ee980a35-78fb-4bcb-9b58-e56c3a5aac13","Type":"ContainerDied","Data":"a15672b6c1f1a3d5a2181073fd3d9ef02688685862c1717b0efcb0be8ad5e3ca"} Apr 17 14:54:05.688906 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:05.688906 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a15672b6c1f1a3d5a2181073fd3d9ef02688685862c1717b0efcb0be8ad5e3ca" Apr 17 14:54:05.689135 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:05.688915 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd" Apr 17 14:54:06.292399 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.292335 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc"] Apr 17 14:54:06.292857 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.292793 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee980a35-78fb-4bcb-9b58-e56c3a5aac13" containerName="node" Apr 17 14:54:06.292857 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.292812 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee980a35-78fb-4bcb-9b58-e56c3a5aac13" containerName="node" Apr 17 14:54:06.292971 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.292894 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee980a35-78fb-4bcb-9b58-e56c3a5aac13" containerName="node" Apr 17 14:54:06.314934 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.314905 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc"] Apr 17 14:54:06.315112 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.315043 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" Apr 17 14:54:06.317357 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.317329 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-r8q7b\"/\"kube-root-ca.crt\"" Apr 17 14:54:06.317489 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.317368 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-r8q7b\"/\"openshift-service-ca.crt\"" Apr 17 14:54:06.318283 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.318263 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-r8q7b\"/\"default-dockercfg-gzkpt\"" Apr 17 14:54:06.476891 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.476859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4phh\" (UniqueName: \"kubernetes.io/projected/f6d1ec23-9114-4290-9abb-6dd651f73dd2-kube-api-access-z4phh\") pod \"test-trainjob-8rtp6-node-0-0-xxjkc\" (UID: \"f6d1ec23-9114-4290-9abb-6dd651f73dd2\") " pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" Apr 17 14:54:06.578009 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.577893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4phh\" (UniqueName: \"kubernetes.io/projected/f6d1ec23-9114-4290-9abb-6dd651f73dd2-kube-api-access-z4phh\") pod \"test-trainjob-8rtp6-node-0-0-xxjkc\" (UID: \"f6d1ec23-9114-4290-9abb-6dd651f73dd2\") " pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" Apr 17 14:54:06.586102 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.586069 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4phh\" (UniqueName: \"kubernetes.io/projected/f6d1ec23-9114-4290-9abb-6dd651f73dd2-kube-api-access-z4phh\") pod \"test-trainjob-8rtp6-node-0-0-xxjkc\" (UID: \"f6d1ec23-9114-4290-9abb-6dd651f73dd2\") " pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" Apr 17 14:54:06.624071 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.624033 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" Apr 17 14:54:06.759092 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.759062 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc"] Apr 17 14:54:06.761300 ip-10-0-129-199 kubenswrapper[2572]: W0417 14:54:06.761272 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6d1ec23_9114_4290_9abb_6dd651f73dd2.slice/crio-65d4464d2cfa02cdb29474168878a33f70cba531df466bf7c8d2ed55a80bbb90 WatchSource:0}: Error finding container 65d4464d2cfa02cdb29474168878a33f70cba531df466bf7c8d2ed55a80bbb90: Status 404 returned error can't find the container with id 65d4464d2cfa02cdb29474168878a33f70cba531df466bf7c8d2ed55a80bbb90 Apr 17 14:54:06.763356 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:06.763341 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:54:07.696866 ip-10-0-129-199 kubenswrapper[2572]: I0417 14:54:07.696832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" event={"ID":"f6d1ec23-9114-4290-9abb-6dd651f73dd2","Type":"ContainerStarted","Data":"65d4464d2cfa02cdb29474168878a33f70cba531df466bf7c8d2ed55a80bbb90"} Apr 17 14:54:17.110100 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:54:17.110045 2572 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6: reading manifest sha256:7be7ac57f27fb8d164f9a6f5dd0f432f0c4ea3781a9f47d7e1013dd1ae66ac95 in quay.io/opendatahub/odh-training-rocm64-torch29-py312: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6" Apr 17 14:54:17.110517 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:54:17.110225 2572 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:node,Image:quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6,Command:[python -c import torch; print(f'PyTorch version: {torch.__version__}'); print('Training completed 
successfully')],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:29500,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:PET_NNODES,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NPROC_PER_NODE,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NODE_RANK,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PET_MASTER_ADDR,Value:test-trainjob-8rtp6-node-0-0.test-trainjob-8rtp6,ValueFrom:nil,},EnvVar{Name:PET_MASTER_PORT,Value:29500,ValueFrom:nil,},EnvVar{Name:JOB_COMPLETION_INDEX,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z4phh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-trainjob-8rtp6-node-0-0-xxjkc_test-ns-r8q7b(f6d1ec23-9114-4290-9abb-6dd651f73dd2): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6: reading manifest sha256:7be7ac57f27fb8d164f9a6f5dd0f432f0c4ea3781a9f47d7e1013dd1ae66ac95 in quay.io/opendatahub/odh-training-rocm64-torch29-py312: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 14:54:17.111412 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:54:17.111379 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6: reading manifest sha256:7be7ac57f27fb8d164f9a6f5dd0f432f0c4ea3781a9f47d7e1013dd1ae66ac95 in quay.io/opendatahub/odh-training-rocm64-torch29-py312: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" podUID="f6d1ec23-9114-4290-9abb-6dd651f73dd2" Apr 17 14:54:17.728299 ip-10-0-129-199 kubenswrapper[2572]: E0417 14:54:17.728263 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6: reading manifest sha256:7be7ac57f27fb8d164f9a6f5dd0f432f0c4ea3781a9f47d7e1013dd1ae66ac95 in quay.io/opendatahub/odh-training-rocm64-torch29-py312: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" podUID="f6d1ec23-9114-4290-9abb-6dd651f73dd2" Apr 17 15:02:28.237743 ip-10-0-129-199 kubenswrapper[2572]: E0417 15:02:28.237628 2572 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil" Apr 17 15:02:28.237743 ip-10-0-129-199 kubenswrapper[2572]: E0417 15:02:28.237699 2572 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 17 15:02:28.237743 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:02:28.237715 2572 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 17 15:03:04.382284 ip-10-0-129-199 kubenswrapper[2572]: E0417 15:03:04.382236 2572 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil" Apr 17 15:03:04.382284 ip-10-0-129-199 kubenswrapper[2572]: E0417 15:03:04.382286 2572 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 17 15:03:04.382784 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:03:04.382300 2572 image_gc_manager.go:222] "Failed to monitor images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 17 15:03:04.387425 ip-10-0-129-199 kubenswrapper[2572]: E0417 15:03:04.387399 2572 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 17 15:03:04.387515 ip-10-0-129-199 kubenswrapper[2572]: E0417 15:03:04.387431 2572 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 17 15:03:04.387515 ip-10-0-129-199 kubenswrapper[2572]: E0417 15:03:04.387445 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 17 15:04:58.861682 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:04:58.861646 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" event={"ID":"f6d1ec23-9114-4290-9abb-6dd651f73dd2","Type":"ContainerStarted","Data":"a5c098f75e215b261c4d8e8e89c0327124e7b2c7b25b3e33bc1d2e5f0a4a4bec"} Apr 17 15:04:58.864222 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:04:58.864202 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-r8q7b\"/\"default-dockercfg-gzkpt\"" Apr 17 15:04:58.886590 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:04:58.886538 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" podStartSLOduration=1.671601988 podStartE2EDuration="10m52.88652534s" podCreationTimestamp="2026-04-17 14:54:06 +0000 UTC" firstStartedPulling="2026-04-17 14:54:06.763464765 +0000 UTC m=+1682.955063864" lastFinishedPulling="2026-04-17 15:04:57.978387871 +0000 UTC m=+2334.169987216" observedRunningTime="2026-04-17 15:04:58.884801734 +0000 UTC m=+2335.076400865" watchObservedRunningTime="2026-04-17 15:04:58.88652534 +0000 UTC m=+2335.078124460" Apr 17 15:04:58.895254 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:04:58.895226 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-r8q7b\"/\"kube-root-ca.crt\"" Apr 17 15:04:58.905834 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:04:58.905812 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-r8q7b\"/\"openshift-service-ca.crt\"" Apr 17 15:05:14.917634 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:14.917554 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6d1ec23-9114-4290-9abb-6dd651f73dd2" containerID="a5c098f75e215b261c4d8e8e89c0327124e7b2c7b25b3e33bc1d2e5f0a4a4bec" exitCode=0 Apr 17 15:05:14.917634 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:14.917601 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" event={"ID":"f6d1ec23-9114-4290-9abb-6dd651f73dd2","Type":"ContainerDied","Data":"a5c098f75e215b261c4d8e8e89c0327124e7b2c7b25b3e33bc1d2e5f0a4a4bec"} Apr 17 15:05:16.076719 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:16.076694 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" Apr 17 15:05:16.129061 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:16.129024 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4phh\" (UniqueName: \"kubernetes.io/projected/f6d1ec23-9114-4290-9abb-6dd651f73dd2-kube-api-access-z4phh\") pod \"f6d1ec23-9114-4290-9abb-6dd651f73dd2\" (UID: \"f6d1ec23-9114-4290-9abb-6dd651f73dd2\") " Apr 17 15:05:16.131159 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:16.131135 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d1ec23-9114-4290-9abb-6dd651f73dd2-kube-api-access-z4phh" (OuterVolumeSpecName: "kube-api-access-z4phh") pod "f6d1ec23-9114-4290-9abb-6dd651f73dd2" (UID: "f6d1ec23-9114-4290-9abb-6dd651f73dd2"). InnerVolumeSpecName "kube-api-access-z4phh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:05:16.230011 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:16.229913 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4phh\" (UniqueName: \"kubernetes.io/projected/f6d1ec23-9114-4290-9abb-6dd651f73dd2-kube-api-access-z4phh\") on node \"ip-10-0-129-199.ec2.internal\" DevicePath \"\"" Apr 17 15:05:16.927745 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:16.927718 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" Apr 17 15:05:16.927745 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:16.927722 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc" event={"ID":"f6d1ec23-9114-4290-9abb-6dd651f73dd2","Type":"ContainerDied","Data":"65d4464d2cfa02cdb29474168878a33f70cba531df466bf7c8d2ed55a80bbb90"} Apr 17 15:05:16.927940 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:16.927755 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65d4464d2cfa02cdb29474168878a33f70cba531df466bf7c8d2ed55a80bbb90" Apr 17 15:05:17.853927 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:17.853896 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-r8q7b_test-trainjob-8rtp6-node-0-0-xxjkc_f6d1ec23-9114-4290-9abb-6dd651f73dd2/node/0.log" Apr 17 15:05:17.973447 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:17.973413 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-hxhcp_test-trainjob-2d482-node-0-0-j4tjd_ee980a35-78fb-4bcb-9b58-e56c3a5aac13/node/0.log" Apr 17 15:05:22.882071 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:22.882031 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc"] Apr 17 15:05:22.888972 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:22.888948 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-r8q7b/test-trainjob-8rtp6-node-0-0-xxjkc"] Apr 17 15:05:23.005487 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:23.005450 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd"] Apr 17 15:05:23.009845 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:23.009814 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-hxhcp/test-trainjob-2d482-node-0-0-j4tjd"] Apr 17 15:05:24.402947 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:24.402913 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee980a35-78fb-4bcb-9b58-e56c3a5aac13" path="/var/lib/kubelet/pods/ee980a35-78fb-4bcb-9b58-e56c3a5aac13/volumes" Apr 17 15:05:24.403344 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:24.403254 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d1ec23-9114-4290-9abb-6dd651f73dd2" path="/var/lib/kubelet/pods/f6d1ec23-9114-4290-9abb-6dd651f73dd2/volumes" Apr 17 15:05:35.279735 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:35.279700 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-88gdk_5293946a-fc61-420d-927f-333ff1e97ba3/manager/0.log" Apr 17 15:05:35.690431 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:05:35.690400 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-88gdk_5293946a-fc61-420d-927f-333ff1e97ba3/manager/0.log" Apr 17 15:05:36.133398 ip-10-0-129-199 
kubenswrapper[2572]: I0417 15:05:36.133373 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-88gdk_5293946a-fc61-420d-927f-333ff1e97ba3/manager/0.log" Apr 17 15:06:04.462088 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:04.462054 2572 scope.go:117] "RemoveContainer" containerID="d0e2ce5e915c8a1b75b2eff08d428a66c149d9f939cc9d217783799c4f49631b" Apr 17 15:06:11.198812 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.198772 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dfmtr/must-gather-fvttf"] Apr 17 15:06:11.199307 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.199111 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6d1ec23-9114-4290-9abb-6dd651f73dd2" containerName="node" Apr 17 15:06:11.199307 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.199125 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d1ec23-9114-4290-9abb-6dd651f73dd2" containerName="node" Apr 17 15:06:11.199307 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.199183 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6d1ec23-9114-4290-9abb-6dd651f73dd2" containerName="node" Apr 17 15:06:11.202109 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.202084 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dfmtr/must-gather-fvttf" Apr 17 15:06:11.204680 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.204416 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dfmtr\"/\"default-dockercfg-4dlrr\"" Apr 17 15:06:11.204680 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.204484 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dfmtr\"/\"openshift-service-ca.crt\"" Apr 17 15:06:11.204680 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.204507 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dfmtr\"/\"kube-root-ca.crt\"" Apr 17 15:06:11.209910 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.209883 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dfmtr/must-gather-fvttf"] Apr 17 15:06:11.279288 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.279245 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpwnd\" (UniqueName: \"kubernetes.io/projected/3d7e7974-15b1-415c-ae38-416da5dae09e-kube-api-access-jpwnd\") pod \"must-gather-fvttf\" (UID: \"3d7e7974-15b1-415c-ae38-416da5dae09e\") " pod="openshift-must-gather-dfmtr/must-gather-fvttf" Apr 17 15:06:11.279463 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.279296 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d7e7974-15b1-415c-ae38-416da5dae09e-must-gather-output\") pod \"must-gather-fvttf\" (UID: \"3d7e7974-15b1-415c-ae38-416da5dae09e\") " pod="openshift-must-gather-dfmtr/must-gather-fvttf" Apr 17 15:06:11.380190 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.380151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d7e7974-15b1-415c-ae38-416da5dae09e-must-gather-output\") pod \"must-gather-fvttf\" (UID: \"3d7e7974-15b1-415c-ae38-416da5dae09e\") " 
pod="openshift-must-gather-dfmtr/must-gather-fvttf" Apr 17 15:06:11.380386 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.380233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpwnd\" (UniqueName: \"kubernetes.io/projected/3d7e7974-15b1-415c-ae38-416da5dae09e-kube-api-access-jpwnd\") pod \"must-gather-fvttf\" (UID: \"3d7e7974-15b1-415c-ae38-416da5dae09e\") " pod="openshift-must-gather-dfmtr/must-gather-fvttf" Apr 17 15:06:11.380528 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.380508 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3d7e7974-15b1-415c-ae38-416da5dae09e-must-gather-output\") pod \"must-gather-fvttf\" (UID: \"3d7e7974-15b1-415c-ae38-416da5dae09e\") " pod="openshift-must-gather-dfmtr/must-gather-fvttf" Apr 17 15:06:11.388283 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.388252 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpwnd\" (UniqueName: \"kubernetes.io/projected/3d7e7974-15b1-415c-ae38-416da5dae09e-kube-api-access-jpwnd\") pod \"must-gather-fvttf\" (UID: \"3d7e7974-15b1-415c-ae38-416da5dae09e\") " pod="openshift-must-gather-dfmtr/must-gather-fvttf" Apr 17 15:06:11.512386 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.512285 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dfmtr/must-gather-fvttf" Apr 17 15:06:11.634268 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.634237 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dfmtr/must-gather-fvttf"] Apr 17 15:06:11.637255 ip-10-0-129-199 kubenswrapper[2572]: W0417 15:06:11.637225 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d7e7974_15b1_415c_ae38_416da5dae09e.slice/crio-84dbe802c529c60a36433ab60b9f014f5890c06ec07a6d961e2c16d59c20438e WatchSource:0}: Error finding container 84dbe802c529c60a36433ab60b9f014f5890c06ec07a6d961e2c16d59c20438e: Status 404 returned error can't find the container with id 84dbe802c529c60a36433ab60b9f014f5890c06ec07a6d961e2c16d59c20438e Apr 17 15:06:11.638824 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:11.638807 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 15:06:12.105590 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:12.105552 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dfmtr/must-gather-fvttf" event={"ID":"3d7e7974-15b1-415c-ae38-416da5dae09e","Type":"ContainerStarted","Data":"84dbe802c529c60a36433ab60b9f014f5890c06ec07a6d961e2c16d59c20438e"} Apr 17 15:06:14.115148 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:14.115107 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dfmtr/must-gather-fvttf" event={"ID":"3d7e7974-15b1-415c-ae38-416da5dae09e","Type":"ContainerStarted","Data":"08bd2d1279a5364afcd603160cfe7ddf5c4c711e86cca15ad9bbb80a58675e71"} Apr 17 15:06:14.115522 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:14.115154 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dfmtr/must-gather-fvttf" event={"ID":"3d7e7974-15b1-415c-ae38-416da5dae09e","Type":"ContainerStarted","Data":"ce209dfef946c545ad53484819ee3ac9f49d3f2569afa814b431ae042bad0658"} Apr 17 15:06:14.129736 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:14.129666 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dfmtr/must-gather-fvttf" podStartSLOduration=1.266040955 podStartE2EDuration="3.12964833s" podCreationTimestamp="2026-04-17 15:06:11 +0000 UTC" firstStartedPulling="2026-04-17 15:06:11.638957007 +0000 UTC m=+2407.830556105" lastFinishedPulling="2026-04-17 15:06:13.502564374 +0000 UTC m=+2409.694163480" observedRunningTime="2026-04-17 15:06:14.128429023 +0000 UTC m=+2410.320028140" watchObservedRunningTime="2026-04-17 15:06:14.12964833 +0000 UTC m=+2410.321247451" Apr 17 15:06:14.931528 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:14.931498 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dxl24_118312db-c4a8-48d8-a9c7-85eeef8a0b7b/global-pull-secret-syncer/0.log" Apr 17 15:06:15.047490 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:15.047453 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7n9xk_1391e92c-990f-482b-b724-0f08633e4abf/konnectivity-agent/0.log" Apr 17 15:06:15.126174 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:15.126146 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-199.ec2.internal_87d4ca475b53fa90f2c794fc65d796bc/haproxy/0.log" Apr 17 15:06:18.731212 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:18.731173 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-htmfm_d9c2287a-52d3-4ad1-91d5-cb34daf99223/kube-state-metrics/0.log" Apr 17 15:06:18.747709 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:18.747685 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-htmfm_d9c2287a-52d3-4ad1-91d5-cb34daf99223/kube-rbac-proxy-main/0.log" Apr 17 15:06:18.769677 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:18.769640 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-htmfm_d9c2287a-52d3-4ad1-91d5-cb34daf99223/kube-rbac-proxy-self/0.log" Apr 17 15:06:18.964480 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:18.964435 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qm6nf_f6dad2fc-6268-43d6-8662-8a6c266e601f/node-exporter/0.log" Apr 17 15:06:18.980945 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:18.980917 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qm6nf_f6dad2fc-6268-43d6-8662-8a6c266e601f/kube-rbac-proxy/0.log" Apr 17 15:06:18.999898 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:18.999830 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qm6nf_f6dad2fc-6268-43d6-8662-8a6c266e601f/init-textfile/0.log" Apr 17 15:06:21.239568 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.239531 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c58886c8c-r75dc_b7b27802-a2a3-4521-ae52-cd1e1ddc60d7/console/0.log" Apr 17 15:06:21.272367 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.272330 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-2zg74_ed4f570a-52ab-4296-95f4-b904558a1925/download-server/0.log" Apr 17 15:06:21.588899 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.588862 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm"] Apr 17 15:06:21.592822 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.592794 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.604268 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.604239 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm"] Apr 17 15:06:21.635291 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.635262 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-hzkc8_5bf7af28-4310-4da2-a1f2-e10c3b867cc6/volume-data-source-validator/0.log" Apr 17 15:06:21.670156 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.670115 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-lib-modules\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.670156 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.670156 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-sys\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.670412 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.670197 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cprzc\" (UniqueName: \"kubernetes.io/projected/e6f4a019-68b5-479f-85e7-a0d247a3c271-kube-api-access-cprzc\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.670412 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.670319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-podres\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.670412 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.670366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-proc\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.771135 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.771102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-lib-modules\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.771135 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.771134 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-sys\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.771364 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.771158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cprzc\" (UniqueName: \"kubernetes.io/projected/e6f4a019-68b5-479f-85e7-a0d247a3c271-kube-api-access-cprzc\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.771364 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.771195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-podres\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.771364 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.771215 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-proc\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.771364 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.771253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-sys\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.771364 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.771282 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-lib-modules\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.771364 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.771286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-proc\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.771364 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.771355 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e6f4a019-68b5-479f-85e7-a0d247a3c271-podres\") pod \"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.779193 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.779170 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cprzc\" (UniqueName: \"kubernetes.io/projected/e6f4a019-68b5-479f-85e7-a0d247a3c271-kube-api-access-cprzc\") pod 
\"perf-node-gather-daemonset-k99gm\" (UID: \"e6f4a019-68b5-479f-85e7-a0d247a3c271\") " pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:21.907211 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:21.907126 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:22.063209 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:22.063050 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm"] Apr 17 15:06:22.065609 ip-10-0-129-199 kubenswrapper[2572]: W0417 15:06:22.065582 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode6f4a019_68b5_479f_85e7_a0d247a3c271.slice/crio-b746b9f34073dd996fba2e0462d0a8969eba8753bc99dcdf05d94deae90990e5 WatchSource:0}: Error finding container b746b9f34073dd996fba2e0462d0a8969eba8753bc99dcdf05d94deae90990e5: Status 404 returned error can't find the container with id b746b9f34073dd996fba2e0462d0a8969eba8753bc99dcdf05d94deae90990e5 Apr 17 15:06:22.147832 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:22.147794 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" event={"ID":"e6f4a019-68b5-479f-85e7-a0d247a3c271","Type":"ContainerStarted","Data":"aa13a570774b3e7e679fd8256aa554f535bed8c3b2a5a23a66be7a4a12e286a8"} Apr 17 15:06:22.147985 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:22.147839 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" event={"ID":"e6f4a019-68b5-479f-85e7-a0d247a3c271","Type":"ContainerStarted","Data":"b746b9f34073dd996fba2e0462d0a8969eba8753bc99dcdf05d94deae90990e5"} Apr 17 15:06:22.147985 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:22.147928 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:22.164015 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:22.163898 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" podStartSLOduration=1.163885113 podStartE2EDuration="1.163885113s" podCreationTimestamp="2026-04-17 15:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:06:22.162534434 +0000 UTC m=+2418.354133568" watchObservedRunningTime="2026-04-17 15:06:22.163885113 +0000 UTC m=+2418.355484232" Apr 17 15:06:22.255002 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:22.254954 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qcjlc_a7611fae-10f2-4c93-a3ad-2f23c7e94788/dns/0.log" Apr 17 15:06:22.271908 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:22.271881 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qcjlc_a7611fae-10f2-4c93-a3ad-2f23c7e94788/kube-rbac-proxy/0.log" Apr 17 15:06:22.384723 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:22.384695 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l87rd_4fe5daab-b27c-4fc1-88c0-72d89cdccb01/dns-node-resolver/0.log" Apr 17 15:06:22.756776 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:22.756742 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_image-registry-d97df5bf7-xlprx_bff4a2a5-e4e7-440f-8284-b551ea59635d/registry/0.log" Apr 17 15:06:22.777218 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:22.777193 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h8bbt_6cb28525-d291-4976-8923-d29d0747a7dd/node-ca/0.log" Apr 17 15:06:23.433624 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:23.433584 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5449587f4d-66vdl_40d17327-7330-4bba-8349-541f00eae77e/router/0.log" Apr 17 15:06:23.713712 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:23.713629 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cwbck_57f5fb50-c788-4c19-a249-06bd9abaa213/serve-healthcheck-canary/0.log" Apr 17 15:06:24.202499 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:24.202471 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r9rw5_7ba19e6f-9ed4-48ef-b5e5-6417d0306a13/kube-rbac-proxy/0.log" Apr 17 15:06:24.218750 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:24.218721 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r9rw5_7ba19e6f-9ed4-48ef-b5e5-6417d0306a13/exporter/0.log" Apr 17 15:06:24.235330 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:24.235304 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r9rw5_7ba19e6f-9ed4-48ef-b5e5-6417d0306a13/extractor/0.log" Apr 17 15:06:28.164934 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:28.164325 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dfmtr/perf-node-gather-daemonset-k99gm" Apr 17 15:06:28.788479 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:28.788450 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-297rk_7e8de3fc-6fef-4f2f-983d-9b545b473bf5/kube-storage-version-migrator-operator/1.log" Apr 17 15:06:28.790043 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:28.790021 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-297rk_7e8de3fc-6fef-4f2f-983d-9b545b473bf5/kube-storage-version-migrator-operator/0.log" Apr 17 15:06:29.584700 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:29.584667 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7rgwd_f1db9491-104f-4aed-8ac3-5c92d3bf1803/kube-multus/0.log" Apr 17 15:06:29.952442 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:29.952412 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lw47q_7cbba96f-e882-4326-b2b6-5a647245170e/kube-multus-additional-cni-plugins/0.log" Apr 17 15:06:29.970447 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:29.970416 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lw47q_7cbba96f-e882-4326-b2b6-5a647245170e/egress-router-binary-copy/0.log" Apr 17 15:06:29.988795 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:29.988769 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lw47q_7cbba96f-e882-4326-b2b6-5a647245170e/cni-plugins/0.log" Apr 17 15:06:30.006558 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:30.006526 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lw47q_7cbba96f-e882-4326-b2b6-5a647245170e/bond-cni-plugin/0.log" Apr 17 15:06:30.024649 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:30.024622 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lw47q_7cbba96f-e882-4326-b2b6-5a647245170e/routeoverride-cni/0.log" Apr 17 15:06:30.042399 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:30.042372 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lw47q_7cbba96f-e882-4326-b2b6-5a647245170e/whereabouts-cni-bincopy/0.log" Apr 17 15:06:30.060116 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:30.060089 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lw47q_7cbba96f-e882-4326-b2b6-5a647245170e/whereabouts-cni/0.log" Apr 17 15:06:30.211070 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:30.210964 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tr5b8_85225238-59be-4245-9db5-4b8fa923bf28/network-metrics-daemon/0.log" Apr 17 15:06:30.229332 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:30.229303 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tr5b8_85225238-59be-4245-9db5-4b8fa923bf28/kube-rbac-proxy/0.log" Apr 17 15:06:31.008614 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:31.008579 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7sxz5_049f13c9-ae95-43ce-9dac-c99d4c07f258/ovn-controller/0.log" Apr 17 15:06:31.039129 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:31.039093 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7sxz5_049f13c9-ae95-43ce-9dac-c99d4c07f258/ovn-acl-logging/0.log" Apr 17 15:06:31.060663 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:31.060620 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7sxz5_049f13c9-ae95-43ce-9dac-c99d4c07f258/kube-rbac-proxy-node/0.log" Apr 17 15:06:31.081176 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:31.081129 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7sxz5_049f13c9-ae95-43ce-9dac-c99d4c07f258/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 15:06:31.098533 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:31.098488 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7sxz5_049f13c9-ae95-43ce-9dac-c99d4c07f258/northd/0.log" Apr 17 15:06:31.123756 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:31.123719 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7sxz5_049f13c9-ae95-43ce-9dac-c99d4c07f258/nbdb/0.log" Apr 17 15:06:31.145869 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:31.145839 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7sxz5_049f13c9-ae95-43ce-9dac-c99d4c07f258/sbdb/0.log" Apr 17 15:06:31.278143 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:31.278065 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7sxz5_049f13c9-ae95-43ce-9dac-c99d4c07f258/ovnkube-controller/0.log" Apr 17 15:06:32.711233 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:32.711157 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-ppng7_8064c346-7dfa-4cd6-b958-bda83c725a8f/check-endpoints/0.log" Apr 17 15:06:32.754223 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:32.754183 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-fg9xk_8ce8568a-309e-41ba-9f5d-bd4386d9dcea/network-check-target-container/0.log" Apr 17 15:06:33.592830 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:33.592800 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-k5l86_a660c0da-e639-42a0-9f51-34051bd62e42/iptables-alerter/0.log" Apr 17 15:06:34.139398 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:34.139368 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-hgwmv_6cef70aa-7a75-4dad-8f4c-b08582ff5b85/tuned/0.log" Apr 17 15:06:35.784576 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:35.784545 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-tfwqn_54b6db51-e6d2-4085-892d-0926bc9c6cea/cluster-samples-operator/0.log" Apr 17 15:06:35.798686 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:35.798663 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-tfwqn_54b6db51-e6d2-4085-892d-0926bc9c6cea/cluster-samples-operator-watch/0.log" Apr 17 15:06:37.306832 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:37.306795 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-dg7bw_9244d507-858e-449a-af7a-1e8df83a35ae/csi-driver/0.log" Apr 17 15:06:37.324257 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:37.324227 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-dg7bw_9244d507-858e-449a-af7a-1e8df83a35ae/csi-node-driver-registrar/0.log" Apr 17 15:06:37.341025 ip-10-0-129-199 kubenswrapper[2572]: I0417 15:06:37.341005 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-dg7bw_9244d507-858e-449a-af7a-1e8df83a35ae/csi-liveness-probe/0.log"