Apr 17 18:10:06.996601 ip-10-0-133-142 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 18:10:07.442468 ip-10-0-133-142 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:10:07.442468 ip-10-0-133-142 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 18:10:07.442468 ip-10-0-133-142 kubenswrapper[2583]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:10:07.442468 ip-10-0-133-142 kubenswrapper[2583]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 18:10:07.442468 ip-10-0-133-142 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:10:07.446066 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.445802 2583 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 18:10:07.449260 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449244 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:10:07.449260 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449260 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449263 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449268 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449292 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449297 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449300 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449302 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449305 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449307 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449310 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449313 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449316 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449318 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449321 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449323 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449326 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449328 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449331 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449333 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449335 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:10:07.449350 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449338 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449341 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449343 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449348 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449352 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449355 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449358 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449361 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449364 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449370 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449373 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449376 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449378 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449381 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449383 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449386 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449388 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449390 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449393 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449396 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:10:07.449847 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449398 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449401 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449403 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449406 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449408 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449412 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449414 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449417 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449419 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449422 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449425 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449427 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449430 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449433 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449436 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449439 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449441 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449444 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449447 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449449 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:10:07.450342 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449452 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449454 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449457 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449459 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449462 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449464 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449467 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449470 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449472 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449476 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449480 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449483 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449487 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449491 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449494 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449497 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449499 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449502 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449505 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:10:07.450819 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449507 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449510 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449512 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449514 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449517 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449519 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449935 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449942 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449945 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449949 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449953 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449956 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449959 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449962 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449965 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449968 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449970 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449973 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449975 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:10:07.451261 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449978 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449980 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449983 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449986 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449988 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449991 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449993 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449995 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.449998 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450001 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450004 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450006 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450009 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450012 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450015 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450017 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450020 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450022 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450025 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:10:07.451820 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450028 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450030 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450034 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450036 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450038 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450041 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450043 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450046 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450048 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450051 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450053 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450056 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450058 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450060 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450063 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450065 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450067 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450070 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450073 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450075 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:10:07.452295 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450078 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450080 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450083 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450085 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450088 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450090 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450093 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450095 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450097 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450100 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450102 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450105 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450107 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450112 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450115 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450117 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450120 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450122 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450124 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450127 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:10:07.452781 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450129 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450132 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450134 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450137 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450141 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450143 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450146 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450148 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450151 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450154 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450156 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450160 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450164 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.450167 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451761 2583 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451777 2583 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451784 2583 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451788 2583 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451793 2583 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451796 2583 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451800 2583 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 18:10:07.453250 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451805 2583 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451808 2583 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451811 2583 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451815 2583 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451819 2583 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451822 2583 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451825 2583 flags.go:64] FLAG: --cgroup-root=""
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451828 2583 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451831 2583 flags.go:64] FLAG: --client-ca-file=""
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451834 2583 flags.go:64] FLAG: --cloud-config=""
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451837 2583 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451840 2583 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451843 2583 flags.go:64] FLAG: --cluster-domain=""
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451846 2583 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451849 2583 flags.go:64] FLAG: --config-dir=""
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451852 2583 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451855 2583 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451859 2583 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451862 2583 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451866 2583 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451869 2583 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451872 2583 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451875 2583 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451878 2583 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451882 2583 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 18:10:07.453802 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451884 2583 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451889 2583 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451892 2583 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451895 2583 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451898 2583 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451901 2583 flags.go:64] FLAG: --enable-server="true"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451904 2583 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451908 2583 flags.go:64] FLAG: --event-burst="100"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451911 2583 flags.go:64] FLAG: --event-qps="50"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451914 2583 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451917 2583 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451920 2583 flags.go:64] FLAG: --eviction-hard=""
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451924 2583 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451927 2583 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451930 2583 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451933 2583 flags.go:64] FLAG: --eviction-soft=""
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451936 2583 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451939 2583 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451942 2583 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451945 2583 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451947 2583 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451950 2583 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451953 2583 flags.go:64] FLAG: --feature-gates=""
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451957 2583 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451960 2583 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 18:10:07.454413 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451963 2583 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451966 2583 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451969 2583 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451972 2583 flags.go:64] FLAG: --help="false"
Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451975 2583 flags.go:64] FLAG: --hostname-override="ip-10-0-133-142.ec2.internal"
Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451978 2583 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451982 2583 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 18:10:07.455035 ip-10-0-133-142
kubenswrapper[2583]: I0417 18:10:07.451984 2583 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451988 2583 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451992 2583 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451996 2583 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.451999 2583 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452002 2583 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452005 2583 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452008 2583 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452011 2583 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452014 2583 flags.go:64] FLAG: --kube-reserved="" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452017 2583 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452020 2583 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452023 2583 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452026 2583 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452029 2583 flags.go:64] FLAG: --lock-file="" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452032 2583 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452035 2583 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 18:10:07.455035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452038 2583 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452043 2583 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452046 2583 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452050 2583 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452052 2583 flags.go:64] FLAG: --logging-format="text" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452055 2583 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452059 2583 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452062 2583 flags.go:64] FLAG: --manifest-url="" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452066 2583 flags.go:64] FLAG: --manifest-url-header="" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452070 2583 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452073 2583 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 18:10:07.455620 ip-10-0-133-142 
kubenswrapper[2583]: I0417 18:10:07.452077 2583 flags.go:64] FLAG: --max-pods="110" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452080 2583 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452083 2583 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452086 2583 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452090 2583 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452093 2583 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452096 2583 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452099 2583 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452107 2583 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452110 2583 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452113 2583 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452116 2583 flags.go:64] FLAG: --pod-cidr="" Apr 17 18:10:07.455620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452119 2583 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452125 2583 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452128 2583 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452131 2583 flags.go:64] FLAG: --pods-per-core="0" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452134 2583 flags.go:64] FLAG: --port="10250" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452138 2583 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452141 2583 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00f4d3d7b88f0d8e5" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452144 2583 flags.go:64] FLAG: --qos-reserved="" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452147 2583 flags.go:64] FLAG: --read-only-port="10255" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452150 2583 flags.go:64] FLAG: --register-node="true" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452153 2583 flags.go:64] FLAG: --register-schedulable="true" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452156 2583 flags.go:64] FLAG: --register-with-taints="" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452159 2583 flags.go:64] FLAG: --registry-burst="10" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452162 2583 flags.go:64] FLAG: --registry-qps="5" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452165 2583 flags.go:64] FLAG: --reserved-cpus="" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452168 2583 flags.go:64] FLAG: --reserved-memory="" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452171 2583 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 
18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452174 2583 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452178 2583 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452181 2583 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452183 2583 flags.go:64] FLAG: --runonce="false" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452186 2583 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452189 2583 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452192 2583 flags.go:64] FLAG: --seccomp-default="false" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452195 2583 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452198 2583 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 18:10:07.456160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452201 2583 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452204 2583 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452207 2583 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452211 2583 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452213 2583 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: 
I0417 18:10:07.452216 2583 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452219 2583 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452222 2583 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452225 2583 flags.go:64] FLAG: --system-cgroups="" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452228 2583 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452236 2583 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452239 2583 flags.go:64] FLAG: --tls-cert-file="" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452242 2583 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452246 2583 flags.go:64] FLAG: --tls-min-version="" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452249 2583 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452252 2583 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452255 2583 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452257 2583 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452260 2583 flags.go:64] FLAG: --v="2" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452264 2583 flags.go:64] FLAG: --version="false" Apr 17 18:10:07.456818 
ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452268 2583 flags.go:64] FLAG: --vmodule="" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452289 2583 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.452292 2583 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452395 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 18:10:07.456818 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452399 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452403 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452406 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452408 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452411 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452414 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452417 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452420 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452423 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 18:10:07.457407 
ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452426 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452429 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452432 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452435 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452437 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452440 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452442 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452445 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452447 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452453 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 18:10:07.457407 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452457 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452460 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452462 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452465 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452468 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452471 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452473 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452476 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452479 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452481 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452484 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452487 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452489 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452492 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452496 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452499 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452502 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452505 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452507 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452510 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:10:07.457870 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452513 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452515 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452517 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452520 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452523 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452525 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452528 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452531 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452533 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452535 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452538 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452542 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452544 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452547 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452549 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452552 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452554 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452557 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452559 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:10:07.458382 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452562 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452564 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452567 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452569 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452572 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452574 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452577 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452579 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452582 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452584 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452586 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452589 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452591 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452594 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452596 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452599 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452601 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452604 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452606 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452608 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:10:07.458844 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452611 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 18:10:07.459349 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452614 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:10:07.459349 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452616 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:10:07.459349 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452618 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:10:07.459349 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452622 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:10:07.459349 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452625 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:10:07.459349 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.452627 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:10:07.459349 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.453301 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 18:10:07.460304 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.460268 2583 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 18:10:07.460339 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.460305 2583 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 18:10:07.460372 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460353 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:10:07.460372 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460359 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:10:07.460372 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460363 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:10:07.460372 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460366 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:10:07.460372 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460369 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:10:07.460372 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460372 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:10:07.460372 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460375 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460378 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460381 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460384 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460387 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460389 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460392 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460394 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460397 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460399 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460402 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460405 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460407 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460411 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460413 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460416 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460419 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460422 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460424 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:10:07.460583 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460427 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460429 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460431 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460434 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460437 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460439 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460442 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460445 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460449 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460454 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460456 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460460 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460462 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460465 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460467 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460470 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17
18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460473 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460476 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460478 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460481 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 18:10:07.461042 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460484 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460486 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460489 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460491 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460494 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460497 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460499 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460502 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460504 2583 
feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460507 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460509 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460512 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460514 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460517 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460519 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460522 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460524 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460527 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460529 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460532 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 18:10:07.461554 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460535 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: 
W0417 18:10:07.460537 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460540 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460544 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460547 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460550 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460552 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460555 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460558 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460560 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460563 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460565 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460567 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460570 2583 feature_gate.go:328] unrecognized feature 
gate: MachineAPIMigration Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460573 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460575 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460578 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460580 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460583 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 18:10:07.462047 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460586 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460590 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.460596 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460698 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460704 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460708 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460712 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460715 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460718 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460721 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460723 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460726 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460728 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460731 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460734 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 18:10:07.462665 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460737 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460740 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460743 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460745 
2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460748 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460750 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460753 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460755 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460758 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460761 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460763 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460766 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460769 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460772 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460774 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460776 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: 
W0417 18:10:07.460779 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460782 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460786 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 18:10:07.463031 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460789 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460791 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460794 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460797 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460799 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460802 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460804 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460806 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460809 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460811 2583 
feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460814 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460816 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460819 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460822 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460826 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460828 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460831 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460833 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460835 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460838 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 18:10:07.463508 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460840 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460843 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 18:10:07.463981 
ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460845 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460848 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460850 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460853 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460855 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460857 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460860 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460862 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460865 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460867 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460869 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460872 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460874 2583 feature_gate.go:328] unrecognized 
feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460876 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460879 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460881 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460884 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 18:10:07.463981 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460886 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460888 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460891 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460893 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460896 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460898 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460901 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460904 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 18:10:07.464441 
ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460906 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460909 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460912 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460914 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460917 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460919 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460921 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:07.460924 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 18:10:07.464441 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.460929 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 18:10:07.464828 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.461613 2583 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 18:10:07.465641 ip-10-0-133-142 
kubenswrapper[2583]: I0417 18:10:07.465621 2583 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 18:10:07.466679 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.466667 2583 server.go:1019] "Starting client certificate rotation" Apr 17 18:10:07.466775 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.466761 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 18:10:07.467703 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.467691 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 18:10:07.492717 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.492694 2583 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 18:10:07.499064 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.499039 2583 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 18:10:07.516685 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.516657 2583 log.go:25] "Validated CRI v1 runtime API" Apr 17 18:10:07.524613 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.524588 2583 log.go:25] "Validated CRI v1 image API" Apr 17 18:10:07.526059 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.526042 2583 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 18:10:07.530198 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.530167 2583 fs.go:135] Filesystem UUIDs: map[320abec6-d11b-4964-af45-13884739358e:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 823be212-1867-47c0-b16e-674e24a57970:/dev/nvme0n1p4] Apr 17 18:10:07.530304 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.530192 2583 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 
blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 18:10:07.534417 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.534400 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 18:10:07.536022 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.535911 2583 manager.go:217] Machine: {Timestamp:2026-04-17 18:10:07.533930632 +0000 UTC m=+0.409323333 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3091852 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec233f5108232058ca94b0d5de7ca7b0 SystemUUID:ec233f51-0823-2058-ca94-b0d5de7ca7b0 BootID:0466bf16-ae4f-415c-8d5c-aba2cc540aad Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 
Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:97:19:57:d9:e5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:97:19:57:d9:e5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4e:d3:2f:20:39:80 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 18:10:07.536022 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.536019 2583 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 18:10:07.536173 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.536149 2583 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 18:10:07.537237 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.537217 2583 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 18:10:07.537417 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.537240 2583 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-142.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 18:10:07.537465 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.537427 2583 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 18:10:07.537465 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.537436 2583 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 18:10:07.537465 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.537453 2583 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 18:10:07.538385 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.538374 2583 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 18:10:07.540313 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.540302 2583 state_mem.go:36] "Initialized new in-memory state store" Apr 17 18:10:07.540605 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.540596 2583 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 18:10:07.542946 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.542936 2583 kubelet.go:491] "Attempting to sync node with API server" Apr 17 18:10:07.542986 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.542952 2583 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 18:10:07.542986 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.542966 2583 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 18:10:07.542986 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.542975 2583 kubelet.go:397] "Adding apiserver pod source" Apr 17 18:10:07.542986 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.542984 2583 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 18:10:07.544135 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.544123 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 18:10:07.544201 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.544140 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 18:10:07.548078 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.548059 2583 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 18:10:07.550415 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.550401 2583 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 18:10:07.551662 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551650 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 18:10:07.551710 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551667 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 18:10:07.551710 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551674 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 18:10:07.551710 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551679 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 18:10:07.551710 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551693 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 18:10:07.551710 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551699 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 18:10:07.551710 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551705 2583 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 17 18:10:07.551710 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551711 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 18:10:07.551941 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551718 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 18:10:07.551941 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551724 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 18:10:07.551941 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551740 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 18:10:07.551941 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.551749 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 18:10:07.552619 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.552610 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 18:10:07.552619 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.552619 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 18:10:07.555060 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.555032 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 18:10:07.555239 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.555211 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 18:10:07.557384 ip-10-0-133-142 kubenswrapper[2583]: 
I0417 18:10:07.557366 2583 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 18:10:07.557459 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.557454 2583 server.go:1295] "Started kubelet" Apr 17 18:10:07.557563 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.557533 2583 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 18:10:07.557671 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.557614 2583 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 18:10:07.557715 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.557704 2583 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 18:10:07.558425 ip-10-0-133-142 systemd[1]: Started Kubernetes Kubelet. Apr 17 18:10:07.558907 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.558848 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sblrc" Apr 17 18:10:07.560205 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.560186 2583 server.go:317] "Adding debug handlers to kubelet server" Apr 17 18:10:07.562426 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.562395 2583 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 18:10:07.564050 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.564029 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sblrc" Apr 17 18:10:07.565333 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.565315 2583 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 18:10:07.566393 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.565446 2583 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-142.ec2.internal.18a73752fae92bf9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-142.ec2.internal,UID:ip-10-0-133-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-142.ec2.internal,},FirstTimestamp:2026-04-17 18:10:07.557381113 +0000 UTC m=+0.432773814,LastTimestamp:2026-04-17 18:10:07.557381113 +0000 UTC m=+0.432773814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-142.ec2.internal,}" Apr 17 18:10:07.566988 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.566966 2583 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 18:10:07.570470 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.570451 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 18:10:07.571132 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.571115 2583 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 18:10:07.572045 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.572028 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-142.ec2.internal\" not found" Apr 17 18:10:07.572132 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572117 2583 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 18:10:07.572132 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572117 2583 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 18:10:07.572196 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572140 2583 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 18:10:07.572252 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572243 2583 reconstruct.go:97] "Volume reconstruction finished" Apr 17 18:10:07.572315 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572253 2583 reconciler.go:26] "Reconciler: start to sync state" Apr 17 18:10:07.572353 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572313 2583 factory.go:55] Registering systemd factory Apr 17 18:10:07.572353 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572333 2583 factory.go:223] Registration of the systemd container factory successfully Apr 17 18:10:07.572555 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572539 2583 factory.go:153] Registering CRI-O factory Apr 17 18:10:07.572555 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572555 2583 factory.go:223] Registration of the crio container factory successfully Apr 17 
18:10:07.572700 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572607 2583 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 18:10:07.572700 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572627 2583 factory.go:103] Registering Raw factory Apr 17 18:10:07.572700 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.572639 2583 manager.go:1196] Started watching for new ooms in manager Apr 17 18:10:07.573531 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.573508 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:10:07.573686 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.573675 2583 manager.go:319] Starting recovery of all containers Apr 17 18:10:07.576204 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.576177 2583 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-142.ec2.internal\" not found" node="ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.583934 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.583901 2583 manager.go:324] Recovery completed Apr 17 18:10:07.589064 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.589045 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:10:07.591715 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.591698 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:10:07.591774 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.591731 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:10:07.591774 ip-10-0-133-142 kubenswrapper[2583]: I0417 
18:10:07.591742 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:10:07.592240 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.592225 2583 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 18:10:07.592302 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.592240 2583 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 18:10:07.592302 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.592257 2583 state_mem.go:36] "Initialized new in-memory state store" Apr 17 18:10:07.596106 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.596092 2583 policy_none.go:49] "None policy: Start" Apr 17 18:10:07.596161 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.596109 2583 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 18:10:07.596161 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.596121 2583 state_mem.go:35] "Initializing new in-memory state store" Apr 17 18:10:07.653252 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.637918 2583 manager.go:341] "Starting Device Plugin manager" Apr 17 18:10:07.653252 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.638024 2583 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 18:10:07.653252 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.638038 2583 server.go:85] "Starting device plugin registration server" Apr 17 18:10:07.653252 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.638385 2583 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 18:10:07.653252 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.638398 2583 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 18:10:07.653252 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.638500 2583 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 17 18:10:07.653252 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.638596 2583 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 18:10:07.653252 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.638605 2583 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 18:10:07.653252 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.639127 2583 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 18:10:07.653252 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.639168 2583 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-142.ec2.internal\" not found" Apr 17 18:10:07.701636 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.701539 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 18:10:07.702902 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.702886 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 18:10:07.702953 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.702916 2583 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 18:10:07.702953 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.702936 2583 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 18:10:07.702953 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.702943 2583 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 18:10:07.703069 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.702976 2583 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 18:10:07.705256 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.705230 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:10:07.739238 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.739200 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:10:07.740854 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.740836 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:10:07.740926 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.740874 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:10:07.740926 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.740885 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:10:07.740926 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.740912 2583 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.749481 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.749463 2583 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.749537 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.749491 2583 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-142.ec2.internal\": node \"ip-10-0-133-142.ec2.internal\" not found" Apr 17 
18:10:07.768284 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.768245 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-142.ec2.internal\" not found" Apr 17 18:10:07.803098 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.803034 2583 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-142.ec2.internal"] Apr 17 18:10:07.803171 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.803144 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:10:07.804131 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.804116 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:10:07.804184 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.804146 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:10:07.804184 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.804157 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:10:07.806458 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.806447 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:10:07.806628 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.806615 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.806668 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.806645 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:10:07.807538 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.807520 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:10:07.807640 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.807551 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:10:07.807640 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.807527 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:10:07.807640 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.807562 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:10:07.807640 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.807582 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:10:07.807640 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.807593 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:10:07.809788 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.809770 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.809847 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.809810 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:10:07.810606 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.810589 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:10:07.810686 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.810616 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:10:07.810686 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.810625 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:10:07.839095 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.839071 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-142.ec2.internal\" not found" node="ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.843502 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.843485 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-142.ec2.internal\" not found" node="ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.868733 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.868704 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-142.ec2.internal\" not found" Apr 17 18:10:07.874554 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.874514 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/acff7d4a666780b2ff38ca9f5fa48a1a-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal\" (UID: \"acff7d4a666780b2ff38ca9f5fa48a1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.874554 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.874555 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/acff7d4a666780b2ff38ca9f5fa48a1a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal\" (UID: \"acff7d4a666780b2ff38ca9f5fa48a1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.874704 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.874573 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/652ed875b090027ea0cb6468dfc50153-config\") pod \"kube-apiserver-proxy-ip-10-0-133-142.ec2.internal\" (UID: \"652ed875b090027ea0cb6468dfc50153\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.969380 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:07.969254 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-142.ec2.internal\" not found" Apr 17 18:10:07.975644 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.975617 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/acff7d4a666780b2ff38ca9f5fa48a1a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal\" (UID: \"acff7d4a666780b2ff38ca9f5fa48a1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.975741 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.975657 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/acff7d4a666780b2ff38ca9f5fa48a1a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal\" (UID: \"acff7d4a666780b2ff38ca9f5fa48a1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.975741 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.975712 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/acff7d4a666780b2ff38ca9f5fa48a1a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal\" (UID: \"acff7d4a666780b2ff38ca9f5fa48a1a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.975741 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.975731 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/652ed875b090027ea0cb6468dfc50153-config\") pod \"kube-apiserver-proxy-ip-10-0-133-142.ec2.internal\" (UID: \"652ed875b090027ea0cb6468dfc50153\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.975922 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.975779 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/652ed875b090027ea0cb6468dfc50153-config\") pod \"kube-apiserver-proxy-ip-10-0-133-142.ec2.internal\" (UID: \"652ed875b090027ea0cb6468dfc50153\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-142.ec2.internal" Apr 17 18:10:07.975922 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:07.975828 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/acff7d4a666780b2ff38ca9f5fa48a1a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal\" (UID: \"acff7d4a666780b2ff38ca9f5fa48a1a\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" Apr 17 18:10:08.070080 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:08.070030 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-142.ec2.internal\" not found" Apr 17 18:10:08.141547 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.141518 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" Apr 17 18:10:08.146647 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.146627 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-142.ec2.internal" Apr 17 18:10:08.170167 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:08.170135 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-142.ec2.internal\" not found" Apr 17 18:10:08.270700 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:08.270601 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-142.ec2.internal\" not found" Apr 17 18:10:08.371180 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:08.371148 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-142.ec2.internal\" not found" Apr 17 18:10:08.466602 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.466563 2583 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 18:10:08.467254 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.466751 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 
18:10:08.467254 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.466757 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 18:10:08.471738 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:08.471704 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-142.ec2.internal\" not found" Apr 17 18:10:08.566577 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.566483 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 18:05:07 +0000 UTC" deadline="2027-11-24 07:44:59.142607697 +0000 UTC" Apr 17 18:10:08.566577 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.566516 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14053h34m50.576095563s" Apr 17 18:10:08.570585 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.570563 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 18:10:08.571071 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.571054 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:10:08.572484 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.572471 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-142.ec2.internal" Apr 17 18:10:08.580670 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.580652 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 18:10:08.582228 ip-10-0-133-142 
kubenswrapper[2583]: I0417 18:10:08.582211 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" Apr 17 18:10:08.586225 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.586204 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 18:10:08.592048 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.592031 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 18:10:08.607151 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.607123 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qjbcd" Apr 17 18:10:08.615031 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.615007 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qjbcd" Apr 17 18:10:08.788816 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:08.788779 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacff7d4a666780b2ff38ca9f5fa48a1a.slice/crio-bb873af36a9834850a1c21017bbab9d9af32ba6c6e307d325348ec7f32fe9bb7 WatchSource:0}: Error finding container bb873af36a9834850a1c21017bbab9d9af32ba6c6e307d325348ec7f32fe9bb7: Status 404 returned error can't find the container with id bb873af36a9834850a1c21017bbab9d9af32ba6c6e307d325348ec7f32fe9bb7 Apr 17 18:10:08.789142 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:08.789122 2583 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod652ed875b090027ea0cb6468dfc50153.slice/crio-fb6c6f7e19721bf6fed04e673f06f5386e99f2f12c9578dd92b2d457ee14600f WatchSource:0}: Error finding container fb6c6f7e19721bf6fed04e673f06f5386e99f2f12c9578dd92b2d457ee14600f: Status 404 returned error can't find the container with id fb6c6f7e19721bf6fed04e673f06f5386e99f2f12c9578dd92b2d457ee14600f
Apr 17 18:10:08.792804 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.792790 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 18:10:08.904955 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:08.904870 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 18:10:09.461018 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.460967 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 18:10:09.544229 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.544202 2583 apiserver.go:52] "Watching apiserver"
Apr 17 18:10:09.553133 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.553104 2583 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 18:10:09.553561 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.553534 2583 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["openshift-multus/multus-9nr2v","openshift-multus/multus-additional-cni-plugins-sj2s9","openshift-multus/network-metrics-daemon-6d44x","openshift-ovn-kubernetes/ovnkube-node-ql5pl","kube-system/konnectivity-agent-st4n8","kube-system/kube-apiserver-proxy-ip-10-0-133-142.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d","openshift-cluster-node-tuning-operator/tuned-7tsjs","openshift-network-diagnostics/network-check-target-kgjhh","openshift-network-operator/iptables-alerter-45lrm","kube-system/global-pull-secret-syncer-pqbjx","openshift-dns/node-resolver-rl4d7","openshift-image-registry/node-ca-8pwg5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal"]
Apr 17 18:10:09.556893 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.556870 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.559566 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.559543 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 18:10:09.559684 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.559550 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 18:10:09.559684 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.559571 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kjkrk\""
Apr 17 18:10:09.559915 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.559899 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 18:10:09.561047 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.561029 2583 reflector.go:430] "Caches
populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 18:10:09.561394 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.561376 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:09.561486 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:09.561451 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:09.562021 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.561613 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8pwg5"
Apr 17 18:10:09.563815 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.563794 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.563988 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.563968 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 18:10:09.564139 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.564125 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qd9lw\""
Apr 17 18:10:09.564405 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.564385 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 18:10:09.564582 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.564568 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 18:10:09.566400 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.565951 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 18:10:09.566400 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.566064 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/konnectivity-agent-st4n8"
Apr 17 18:10:09.566400 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.566172 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 18:10:09.566616 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.566467 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 18:10:09.566616 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.566489 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-lnzgm\""
Apr 17 18:10:09.569730 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.568822 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 18:10:09.569730 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.568974 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 18:10:09.569730 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.569214 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 18:10:09.569730 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.569548 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 18:10:09.569932 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.569737 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zm7fs\""
Apr 17 18:10:09.569969 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.569918 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17
18:10:09.571326 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.570761 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.572727 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.572705 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 18:10:09.572913 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.572894 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7jk2w\""
Apr 17 18:10:09.573171 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.573154 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:09.573264 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:09.573214 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:09.573600 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.573580 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 18:10:09.577902 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.575773 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.577902 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.576460 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sj2s9"
Apr 17 18:10:09.579717 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.578875 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 18:10:09.579717 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.579303 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 18:10:09.579717 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.579444 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d2h25\""
Apr 17 18:10:09.579717 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.579579 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 18:10:09.579982 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.579859 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-operator/iptables-alerter-45lrm"
Apr 17 18:10:09.580193 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.580165 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 18:10:09.580472 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.580447 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-k5lpw\""
Apr 17 18:10:09.580573 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.580521 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 18:10:09.581243 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.581221 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 18:10:09.582029 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.582006 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 18:10:09.582429 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.582412 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-j7rnz\""
Apr 17 18:10:09.582676 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.582660 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 18:10:09.582771 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.582753 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 18:10:09.584215 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584192 2583 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-run-ovn-kubernetes\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.584373 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584230 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-device-dir\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.584373 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584255 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-run-netns\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.584373 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584295 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ce6edf92-e02a-474f-864a-5bd1153d24d6-agent-certs\") pod \"konnectivity-agent-st4n8\" (UID: \"ce6edf92-e02a-474f-864a-5bd1153d24d6\") " pod="kube-system/konnectivity-agent-st4n8"
Apr 17 18:10:09.584373 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584327 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-conf-dir\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") "
pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.584373 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584353 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-run-systemd\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.584631 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584376 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b758c69-286f-4851-8bab-2922e791af32-ovn-node-metrics-cert\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.584631 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584400 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-os-release\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.584631 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584423 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-run-openvswitch\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.584631 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584446 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-run-ovn\") pod
\"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.584631 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584470 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.584631 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584495 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.584631 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584537 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-sysctl-d\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.584631 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584569 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-var-lib-kubelet\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.584631 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584595 2583 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-tuned\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.584631 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584622 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fcdh\" (UniqueName: \"kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh\") pod \"network-check-target-kgjhh\" (UID: \"d2df8bcd-2956-4041-abb8-966ec57fce1d\") " pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584647 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75b35535-d264-462c-a620-1b59e57c1eef-host\") pod \"node-ca-8pwg5\" (UID: \"75b35535-d264-462c-a620-1b59e57c1eef\") " pod="openshift-image-registry/node-ca-8pwg5"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584673 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce7310b9-648a-4042-86fe-ef118fc7af4e-cni-binary-copy\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584704 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-socket-dir-parent\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.585073
ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584734 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-var-lib-cni-bin\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584758 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-daemon-config\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584780 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/75b35535-d264-462c-a620-1b59e57c1eef-serviceca\") pod \"node-ca-8pwg5\" (UID: \"75b35535-d264-462c-a620-1b59e57c1eef\") " pod="openshift-image-registry/node-ca-8pwg5"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584806 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b758c69-286f-4851-8bab-2922e791af32-ovnkube-config\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584827 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-sys-fs\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") "
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584850 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzmjv\" (UniqueName: \"kubernetes.io/projected/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-kube-api-access-pzmjv\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584881 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-cnibin\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584903 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-slash\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584924 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b758c69-286f-4851-8bab-2922e791af32-env-overrides\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584957 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName:
\"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-modprobe-d\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.584984 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-run\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585003 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-hostroot\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585021 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:09.585073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585064 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-registration-dir\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585093 2583 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-tmp\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585119 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-run-netns\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585143 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-etc-kubernetes\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585193 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs8vm\" (UniqueName: \"kubernetes.io/projected/75b35535-d264-462c-a620-1b59e57c1eef-kube-api-access-fs8vm\") pod \"node-ca-8pwg5\" (UID: \"75b35535-d264-462c-a620-1b59e57c1eef\") " pod="openshift-image-registry/node-ca-8pwg5"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585228 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vphqj\" (UniqueName: \"kubernetes.io/projected/2ea7fcad-19ae-42ab-8026-113afe4c2f23-kube-api-access-vphqj\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:09.585795
ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585256 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-var-lib-openvswitch\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585323 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-systemd\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585349 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-systemd-units\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585373 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ce6edf92-e02a-474f-864a-5bd1153d24d6-konnectivity-ca\") pod \"konnectivity-agent-st4n8\" (UID: \"ce6edf92-e02a-474f-864a-5bd1153d24d6\") " pod="kube-system/konnectivity-agent-st4n8"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585397 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-sysctl-conf\") pod \"tuned-7tsjs\" (UID:
\"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585419 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-lib-modules\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585443 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-node-log\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585465 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-log-socket\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585488 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b758c69-286f-4851-8bab-2922e791af32-ovnkube-script-lib\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585517 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbvws\" (UniqueName:
\"kubernetes.io/projected/3b758c69-286f-4851-8bab-2922e791af32-kube-api-access-dbvws\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.585795 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585539 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585561 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-host\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585587 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-run-k8s-cni-cncf-io\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585623 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-cni-netd\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585649 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6b7z\" (UniqueName: \"kubernetes.io/projected/9d5e6927-1286-418f-9057-4d85494090c2-kube-api-access-r6b7z\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585666 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-kubernetes\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585683 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-sys\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585706 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-cni-dir\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585723 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-var-lib-cni-multus\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " 
pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585736 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-run-multus-certs\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585749 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-kubelet\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585765 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-cni-bin\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585786 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-var-lib-kubelet\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585809 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-socket-dir\") pod 
\"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585827 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-sysconfig\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585847 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-system-cni-dir\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585876 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt7hc\" (UniqueName: \"kubernetes.io/projected/ce7310b9-648a-4042-86fe-ef118fc7af4e-kube-api-access-pt7hc\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.586396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.585904 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-etc-openvswitch\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.588332 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.588294 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rl4d7" Apr 17 18:10:09.588525 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.588300 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:09.588592 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:09.588569 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9" Apr 17 18:10:09.590232 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.590214 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 18:10:09.590580 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.590564 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 18:10:09.590671 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.590628 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sk7kv\"" Apr 17 18:10:09.617730 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.617682 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 18:05:08 +0000 UTC" deadline="2027-12-17 21:00:43.623017389 +0000 UTC" Apr 17 18:10:09.617730 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.617714 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14618h50m34.005306468s" Apr 17 18:10:09.673491 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.673451 2583 
desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 18:10:09.686669 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686638 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-etc-openvswitch\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.686811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686675 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-run-ovn-kubernetes\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.686811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686694 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-device-dir\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" Apr 17 18:10:09.686811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686712 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-os-release\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.686811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686761 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6vtd\" 
(UniqueName: \"kubernetes.io/projected/854b67d6-4dbb-4558-9444-235eb53b9278-kube-api-access-l6vtd\") pod \"node-resolver-rl4d7\" (UID: \"854b67d6-4dbb-4558-9444-235eb53b9278\") " pod="openshift-dns/node-resolver-rl4d7" Apr 17 18:10:09.686811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686769 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-etc-openvswitch\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.686811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686783 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-run-ovn-kubernetes\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.686811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686788 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-run-netns\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686831 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-run-netns\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686809 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" 
(UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-device-dir\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686848 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ce6edf92-e02a-474f-864a-5bd1153d24d6-agent-certs\") pod \"konnectivity-agent-st4n8\" (UID: \"ce6edf92-e02a-474f-864a-5bd1153d24d6\") " pod="kube-system/konnectivity-agent-st4n8" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686898 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-conf-dir\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686930 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1ab1f86f-6203-4036-b656-aeacb3b958ba-iptables-alerter-script\") pod \"iptables-alerter-45lrm\" (UID: \"1ab1f86f-6203-4036-b656-aeacb3b958ba\") " pod="openshift-network-operator/iptables-alerter-45lrm" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686958 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-conf-dir\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686963 2583 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-run-systemd\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.686998 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-run-systemd\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687004 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b758c69-286f-4851-8bab-2922e791af32-ovn-node-metrics-cert\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687030 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-os-release\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687056 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrzqn\" (UniqueName: \"kubernetes.io/projected/1ab1f86f-6203-4036-b656-aeacb3b958ba-kube-api-access-lrzqn\") pod \"iptables-alerter-45lrm\" (UID: \"1ab1f86f-6203-4036-b656-aeacb3b958ba\") " pod="openshift-network-operator/iptables-alerter-45lrm" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687073 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687094 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-run-openvswitch\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.687140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687128 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-run-openvswitch\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.687742 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687231 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-run-ovn\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.687742 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687580 2583 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 18:10:09.687742 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687687 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.687742 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687737 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" Apr 17 18:10:09.687911 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687771 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-sysctl-d\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" Apr 17 18:10:09.687911 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687804 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-var-lib-kubelet\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" Apr 17 18:10:09.687911 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687835 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-tuned\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" Apr 17 18:10:09.687911 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687862 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fcdh\" (UniqueName: \"kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh\") pod \"network-check-target-kgjhh\" (UID: \"d2df8bcd-2956-4041-abb8-966ec57fce1d\") " pod="openshift-network-diagnostics/network-check-target-kgjhh" Apr 17 18:10:09.687911 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687891 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75b35535-d264-462c-a620-1b59e57c1eef-host\") pod \"node-ca-8pwg5\" (UID: \"75b35535-d264-462c-a620-1b59e57c1eef\") " pod="openshift-image-registry/node-ca-8pwg5" Apr 17 18:10:09.687911 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687500 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-os-release\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.688176 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687921 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce7310b9-648a-4042-86fe-ef118fc7af4e-cni-binary-copy\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.688176 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687954 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-socket-dir-parent\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.688176 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687966 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.688176 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.687985 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-var-lib-cni-bin\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.688176 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.688013 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-daemon-config\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.688176 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.688029 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" Apr 17 18:10:09.688176 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.688051 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.688176 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.688081 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flbnv\" (UniqueName: \"kubernetes.io/projected/b5e8e12b-b8af-4a75-8a47-fc6090430623-kube-api-access-flbnv\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.688176 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.688110 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/75b35535-d264-462c-a620-1b59e57c1eef-serviceca\") pod \"node-ca-8pwg5\" (UID: \"75b35535-d264-462c-a620-1b59e57c1eef\") " pod="openshift-image-registry/node-ca-8pwg5" Apr 17 18:10:09.688176 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.688137 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b758c69-286f-4851-8bab-2922e791af32-ovnkube-config\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.688176 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.688164 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-sysctl-d\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" Apr 17 18:10:09.689415 
ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.688230 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-sys-fs\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" Apr 17 18:10:09.689415 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.688240 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-var-lib-kubelet\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" Apr 17 18:10:09.689415 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.688336 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-socket-dir-parent\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.689415 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.688383 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-var-lib-cni-bin\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.689415 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.689190 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/75b35535-d264-462c-a620-1b59e57c1eef-serviceca\") pod \"node-ca-8pwg5\" (UID: \"75b35535-d264-462c-a620-1b59e57c1eef\") " pod="openshift-image-registry/node-ca-8pwg5" Apr 17 18:10:09.689415 ip-10-0-133-142 kubenswrapper[2583]: 
I0417 18:10:09.689233 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b758c69-286f-4851-8bab-2922e791af32-ovnkube-config\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.689415 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.689339 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-run-ovn\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.689415 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.689339 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-daemon-config\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.689415 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.689383 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75b35535-d264-462c-a620-1b59e57c1eef-host\") pod \"node-ca-8pwg5\" (UID: \"75b35535-d264-462c-a620-1b59e57c1eef\") " pod="openshift-image-registry/node-ca-8pwg5"
Apr 17 18:10:09.689848 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.689457 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-sys-fs\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.689848 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.689498 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzmjv\" (UniqueName: \"kubernetes.io/projected/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-kube-api-access-pzmjv\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.689848 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.689533 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b5e8e12b-b8af-4a75-8a47-fc6090430623-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690409 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ce7310b9-648a-4042-86fe-ef118fc7af4e-cni-binary-copy\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690690 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-cnibin\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690731 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-kubelet-config\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690765 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-dbus\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690800 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-slash\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690830 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b758c69-286f-4851-8bab-2922e791af32-env-overrides\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690858 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-modprobe-d\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690879 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-run\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690905 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-hostroot\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690935 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-cnibin\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690963 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.690991 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-registration-dir\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691029 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-tmp\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691052 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-run-netns\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691079 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-etc-kubernetes\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fs8vm\" (UniqueName: \"kubernetes.io/projected/75b35535-d264-462c-a620-1b59e57c1eef-kube-api-access-fs8vm\") pod \"node-ca-8pwg5\" (UID: \"75b35535-d264-462c-a620-1b59e57c1eef\") " pod="openshift-image-registry/node-ca-8pwg5"
Apr 17 18:10:09.694499 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691141 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vphqj\" (UniqueName: \"kubernetes.io/projected/2ea7fcad-19ae-42ab-8026-113afe4c2f23-kube-api-access-vphqj\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691170 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-var-lib-openvswitch\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691198 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-systemd\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691224 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-system-cni-dir\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691256 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b5e8e12b-b8af-4a75-8a47-fc6090430623-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691309 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/854b67d6-4dbb-4558-9444-235eb53b9278-hosts-file\") pod \"node-resolver-rl4d7\" (UID: \"854b67d6-4dbb-4558-9444-235eb53b9278\") " pod="openshift-dns/node-resolver-rl4d7"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691344 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-systemd-units\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691375 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ce6edf92-e02a-474f-864a-5bd1153d24d6-konnectivity-ca\") pod \"konnectivity-agent-st4n8\" (UID: \"ce6edf92-e02a-474f-864a-5bd1153d24d6\") " pod="kube-system/konnectivity-agent-st4n8"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691402 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-sysctl-conf\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691432 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-lib-modules\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691458 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-node-log\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691483 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-log-socket\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691515 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b758c69-286f-4851-8bab-2922e791af32-ovnkube-script-lib\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691553 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbvws\" (UniqueName: \"kubernetes.io/projected/3b758c69-286f-4851-8bab-2922e791af32-kube-api-access-dbvws\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691584 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691604 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-host\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691632 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-run-k8s-cni-cncf-io\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.695348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691669 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5e8e12b-b8af-4a75-8a47-fc6090430623-cni-binary-copy\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691695 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ce6edf92-e02a-474f-864a-5bd1153d24d6-agent-certs\") pod \"konnectivity-agent-st4n8\" (UID: \"ce6edf92-e02a-474f-864a-5bd1153d24d6\") " pod="kube-system/konnectivity-agent-st4n8"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691702 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-cni-netd\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691735 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6b7z\" (UniqueName: \"kubernetes.io/projected/9d5e6927-1286-418f-9057-4d85494090c2-kube-api-access-r6b7z\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691784 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-kubernetes\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691812 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-sys\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691842 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-cni-dir\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691874 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-var-lib-cni-multus\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691907 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-run-multus-certs\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691944 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1ab1f86f-6203-4036-b656-aeacb3b958ba-host-slash\") pod \"iptables-alerter-45lrm\" (UID: \"1ab1f86f-6203-4036-b656-aeacb3b958ba\") " pod="openshift-network-operator/iptables-alerter-45lrm"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.691990 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-kubelet\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692022 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-cni-bin\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692051 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-cnibin\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692108 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-kubernetes\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692226 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-var-lib-openvswitch\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692242 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-systemd\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692294 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-slash\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692415 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-systemd-units\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.696082 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692657 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b758c69-286f-4851-8bab-2922e791af32-env-overrides\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692803 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-modprobe-d\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692856 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-run\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692897 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-hostroot\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.693076 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ce6edf92-e02a-474f-864a-5bd1153d24d6-konnectivity-ca\") pod \"konnectivity-agent-st4n8\" (UID: \"ce6edf92-e02a-474f-864a-5bd1153d24d6\") " pod="kube-system/konnectivity-agent-st4n8"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.693103 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-kubelet\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.693358 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.693476 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-sysctl-conf\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.693518 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-run-netns\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.693559 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-lib-modules\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:09.693599 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.693625 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-node-log\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.693673 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-log-socket\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:09.693681 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs podName:2ea7fcad-19ae-42ab-8026-113afe4c2f23 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:10.193652631 +0000 UTC m=+3.069045339 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs") pod "network-metrics-daemon-6d44x" (UID: "2ea7fcad-19ae-42ab-8026-113afe4c2f23") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.693947 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-host\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694051 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-run-k8s-cni-cncf-io\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694193 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b758c69-286f-4851-8bab-2922e791af32-ovnkube-script-lib\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.692054 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-var-lib-kubelet\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694247 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-var-lib-kubelet\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694252 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-socket-dir\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694311 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-etc-kubernetes\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694326 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-multus-cni-dir\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694347 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-var-lib-cni-multus\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694378 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-registration-dir\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694376 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-sys\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694401 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-sysconfig\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694422 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-cni-bin\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694434 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-system-cni-dir\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694476 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9d5e6927-1286-418f-9057-4d85494090c2-socket-dir\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694611 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-system-cni-dir\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694622 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-tuned\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694678 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt7hc\" (UniqueName: \"kubernetes.io/projected/ce7310b9-648a-4042-86fe-ef118fc7af4e-kube-api-access-pt7hc\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694688 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ce7310b9-648a-4042-86fe-ef118fc7af4e-host-run-multus-certs\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694717 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/854b67d6-4dbb-4558-9444-235eb53b9278-tmp-dir\") pod \"node-resolver-rl4d7\" (UID: \"854b67d6-4dbb-4558-9444-235eb53b9278\") " pod="openshift-dns/node-resolver-rl4d7"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.694736 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b758c69-286f-4851-8bab-2922e791af32-host-cni-netd\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:09.694771 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:09.694791 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 18:10:09.697995 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:09.694807 2583 projected.go:194] Error preparing data for projected volume kube-api-access-7fcdh for pod openshift-network-diagnostics/network-check-target-kgjhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:09.698882 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:09.694881 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh podName:d2df8bcd-2956-4041-abb8-966ec57fce1d nodeName:}" failed. No retries permitted until 2026-04-17 18:10:10.194863815 +0000 UTC m=+3.070256514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7fcdh" (UniqueName: "kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh") pod "network-check-target-kgjhh" (UID: "d2df8bcd-2956-4041-abb8-966ec57fce1d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:09.698882 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.695034 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-etc-sysconfig\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.698882 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.698609 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-tmp\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.698882 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.698682 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzmjv\" (UniqueName: \"kubernetes.io/projected/37d75c3f-00cd-47b8-80a8-e9ed9af8a917-kube-api-access-pzmjv\") pod \"tuned-7tsjs\" (UID: \"37d75c3f-00cd-47b8-80a8-e9ed9af8a917\") " pod="openshift-cluster-node-tuning-operator/tuned-7tsjs"
Apr 17 18:10:09.699249 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.699184 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b758c69-286f-4851-8bab-2922e791af32-ovn-node-metrics-cert\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.700641 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.700596 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6b7z\" (UniqueName: \"kubernetes.io/projected/9d5e6927-1286-418f-9057-4d85494090c2-kube-api-access-r6b7z\") pod \"aws-ebs-csi-driver-node-kq92d\" (UID: \"9d5e6927-1286-418f-9057-4d85494090c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d"
Apr 17 18:10:09.700874 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.700837 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbvws\" (UniqueName: \"kubernetes.io/projected/3b758c69-286f-4851-8bab-2922e791af32-kube-api-access-dbvws\") pod \"ovnkube-node-ql5pl\" (UID: \"3b758c69-286f-4851-8bab-2922e791af32\") " pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:09.701748 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.701514 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vphqj\" (UniqueName: \"kubernetes.io/projected/2ea7fcad-19ae-42ab-8026-113afe4c2f23-kube-api-access-vphqj\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x" Apr 
17 18:10:09.704098 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.704080 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs8vm\" (UniqueName: \"kubernetes.io/projected/75b35535-d264-462c-a620-1b59e57c1eef-kube-api-access-fs8vm\") pod \"node-ca-8pwg5\" (UID: \"75b35535-d264-462c-a620-1b59e57c1eef\") " pod="openshift-image-registry/node-ca-8pwg5" Apr 17 18:10:09.705189 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.705171 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt7hc\" (UniqueName: \"kubernetes.io/projected/ce7310b9-648a-4042-86fe-ef118fc7af4e-kube-api-access-pt7hc\") pod \"multus-9nr2v\" (UID: \"ce7310b9-648a-4042-86fe-ef118fc7af4e\") " pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.708351 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.708260 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-142.ec2.internal" event={"ID":"652ed875b090027ea0cb6468dfc50153","Type":"ContainerStarted","Data":"fb6c6f7e19721bf6fed04e673f06f5386e99f2f12c9578dd92b2d457ee14600f"} Apr 17 18:10:09.709296 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.709242 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" event={"ID":"acff7d4a666780b2ff38ca9f5fa48a1a","Type":"ContainerStarted","Data":"bb873af36a9834850a1c21017bbab9d9af32ba6c6e307d325348ec7f32fe9bb7"} Apr 17 18:10:09.795613 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.795519 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-system-cni-dir\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.795613 ip-10-0-133-142 
kubenswrapper[2583]: I0417 18:10:09.795567 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b5e8e12b-b8af-4a75-8a47-fc6090430623-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.795825 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.795644 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-system-cni-dir\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.795825 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.795700 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/854b67d6-4dbb-4558-9444-235eb53b9278-hosts-file\") pod \"node-resolver-rl4d7\" (UID: \"854b67d6-4dbb-4558-9444-235eb53b9278\") " pod="openshift-dns/node-resolver-rl4d7" Apr 17 18:10:09.795825 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.795753 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5e8e12b-b8af-4a75-8a47-fc6090430623-cni-binary-copy\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.795825 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.795763 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/854b67d6-4dbb-4558-9444-235eb53b9278-hosts-file\") pod \"node-resolver-rl4d7\" (UID: \"854b67d6-4dbb-4558-9444-235eb53b9278\") " 
pod="openshift-dns/node-resolver-rl4d7" Apr 17 18:10:09.795825 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.795786 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1ab1f86f-6203-4036-b656-aeacb3b958ba-host-slash\") pod \"iptables-alerter-45lrm\" (UID: \"1ab1f86f-6203-4036-b656-aeacb3b958ba\") " pod="openshift-network-operator/iptables-alerter-45lrm" Apr 17 18:10:09.795825 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.795817 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/854b67d6-4dbb-4558-9444-235eb53b9278-tmp-dir\") pod \"node-resolver-rl4d7\" (UID: \"854b67d6-4dbb-4558-9444-235eb53b9278\") " pod="openshift-dns/node-resolver-rl4d7" Apr 17 18:10:09.796146 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.795842 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-os-release\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.796146 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.795847 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1ab1f86f-6203-4036-b656-aeacb3b958ba-host-slash\") pod \"iptables-alerter-45lrm\" (UID: \"1ab1f86f-6203-4036-b656-aeacb3b958ba\") " pod="openshift-network-operator/iptables-alerter-45lrm" Apr 17 18:10:09.796146 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.795865 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6vtd\" (UniqueName: \"kubernetes.io/projected/854b67d6-4dbb-4558-9444-235eb53b9278-kube-api-access-l6vtd\") pod \"node-resolver-rl4d7\" (UID: 
\"854b67d6-4dbb-4558-9444-235eb53b9278\") " pod="openshift-dns/node-resolver-rl4d7" Apr 17 18:10:09.796146 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.795944 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-os-release\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.796360 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796144 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b5e8e12b-b8af-4a75-8a47-fc6090430623-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.796360 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796155 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1ab1f86f-6203-4036-b656-aeacb3b958ba-iptables-alerter-script\") pod \"iptables-alerter-45lrm\" (UID: \"1ab1f86f-6203-4036-b656-aeacb3b958ba\") " pod="openshift-network-operator/iptables-alerter-45lrm" Apr 17 18:10:09.796360 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796199 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrzqn\" (UniqueName: \"kubernetes.io/projected/1ab1f86f-6203-4036-b656-aeacb3b958ba-kube-api-access-lrzqn\") pod \"iptables-alerter-45lrm\" (UID: \"1ab1f86f-6203-4036-b656-aeacb3b958ba\") " pod="openshift-network-operator/iptables-alerter-45lrm" Apr 17 18:10:09.796360 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796215 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/854b67d6-4dbb-4558-9444-235eb53b9278-tmp-dir\") pod \"node-resolver-rl4d7\" (UID: \"854b67d6-4dbb-4558-9444-235eb53b9278\") " pod="openshift-dns/node-resolver-rl4d7" Apr 17 18:10:09.796360 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796227 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:09.796360 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796249 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b5e8e12b-b8af-4a75-8a47-fc6090430623-cni-binary-copy\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.796360 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796295 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.796360 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796324 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flbnv\" (UniqueName: \"kubernetes.io/projected/b5e8e12b-b8af-4a75-8a47-fc6090430623-kube-api-access-flbnv\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.796360 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796350 
2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b5e8e12b-b8af-4a75-8a47-fc6090430623-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.796360 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:09.796351 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:09.796360 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796366 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-kubelet-config\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:09.796811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796383 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-dbus\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:09.796811 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:09.796405 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret podName:15a8ea75-9ba5-4f38-9236-0c6de8eae8e9 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:10.296387879 +0000 UTC m=+3.171780567 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret") pod "global-pull-secret-syncer-pqbjx" (UID: "15a8ea75-9ba5-4f38-9236-0c6de8eae8e9") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:09.796811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796440 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-cnibin\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.796811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796508 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-dbus\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:09.796811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796520 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-cnibin\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.796811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796545 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-kubelet-config\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:09.796811 ip-10-0-133-142 kubenswrapper[2583]: I0417 
18:10:09.796646 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1ab1f86f-6203-4036-b656-aeacb3b958ba-iptables-alerter-script\") pod \"iptables-alerter-45lrm\" (UID: \"1ab1f86f-6203-4036-b656-aeacb3b958ba\") " pod="openshift-network-operator/iptables-alerter-45lrm" Apr 17 18:10:09.796811 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.796655 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b5e8e12b-b8af-4a75-8a47-fc6090430623-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.797531 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.797513 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b5e8e12b-b8af-4a75-8a47-fc6090430623-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: \"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.808125 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.808100 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrzqn\" (UniqueName: \"kubernetes.io/projected/1ab1f86f-6203-4036-b656-aeacb3b958ba-kube-api-access-lrzqn\") pod \"iptables-alerter-45lrm\" (UID: \"1ab1f86f-6203-4036-b656-aeacb3b958ba\") " pod="openshift-network-operator/iptables-alerter-45lrm" Apr 17 18:10:09.808125 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.808120 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flbnv\" (UniqueName: \"kubernetes.io/projected/b5e8e12b-b8af-4a75-8a47-fc6090430623-kube-api-access-flbnv\") pod \"multus-additional-cni-plugins-sj2s9\" (UID: 
\"b5e8e12b-b8af-4a75-8a47-fc6090430623\") " pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.808452 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.808435 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6vtd\" (UniqueName: \"kubernetes.io/projected/854b67d6-4dbb-4558-9444-235eb53b9278-kube-api-access-l6vtd\") pod \"node-resolver-rl4d7\" (UID: \"854b67d6-4dbb-4558-9444-235eb53b9278\") " pod="openshift-dns/node-resolver-rl4d7" Apr 17 18:10:09.873128 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.873093 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" Apr 17 18:10:09.881986 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.881956 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8pwg5" Apr 17 18:10:09.892762 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.892727 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:10:09.898685 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.898647 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-st4n8" Apr 17 18:10:09.906888 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.906645 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" Apr 17 18:10:09.914552 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.914528 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9nr2v" Apr 17 18:10:09.927055 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.927033 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sj2s9" Apr 17 18:10:09.934746 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.934722 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-45lrm" Apr 17 18:10:09.940320 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:09.940301 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rl4d7" Apr 17 18:10:10.199455 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.199356 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x" Apr 17 18:10:10.199602 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.199452 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fcdh\" (UniqueName: \"kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh\") pod \"network-check-target-kgjhh\" (UID: \"d2df8bcd-2956-4041-abb8-966ec57fce1d\") " pod="openshift-network-diagnostics/network-check-target-kgjhh" Apr 17 18:10:10.199602 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:10.199538 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:10:10.199691 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:10.199598 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:10:10.199691 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:10.199619 2583 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:10:10.199691 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:10.199627 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs podName:2ea7fcad-19ae-42ab-8026-113afe4c2f23 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:11.199609367 +0000 UTC m=+4.075002259 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs") pod "network-metrics-daemon-6d44x" (UID: "2ea7fcad-19ae-42ab-8026-113afe4c2f23") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:10:10.199691 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:10.199629 2583 projected.go:194] Error preparing data for projected volume kube-api-access-7fcdh for pod openshift-network-diagnostics/network-check-target-kgjhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:10:10.199691 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:10.199684 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh podName:d2df8bcd-2956-4041-abb8-966ec57fce1d nodeName:}" failed. No retries permitted until 2026-04-17 18:10:11.199668248 +0000 UTC m=+4.075060939 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7fcdh" (UniqueName: "kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh") pod "network-check-target-kgjhh" (UID: "d2df8bcd-2956-4041-abb8-966ec57fce1d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:10:10.300184 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.300149 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:10.300374 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:10.300332 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:10.300434 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:10.300416 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret podName:15a8ea75-9ba5-4f38-9236-0c6de8eae8e9 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:11.300398745 +0000 UTC m=+4.175791441 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret") pod "global-pull-secret-syncer-pqbjx" (UID: "15a8ea75-9ba5-4f38-9236-0c6de8eae8e9") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:10.444993 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:10.444963 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ab1f86f_6203_4036_b656_aeacb3b958ba.slice/crio-1fa9f209f78accf720986f7ccac6be376b4e4ec615c69b83c9d9f3bf846da889 WatchSource:0}: Error finding container 1fa9f209f78accf720986f7ccac6be376b4e4ec615c69b83c9d9f3bf846da889: Status 404 returned error can't find the container with id 1fa9f209f78accf720986f7ccac6be376b4e4ec615c69b83c9d9f3bf846da889 Apr 17 18:10:10.445858 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:10.445838 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod854b67d6_4dbb_4558_9444_235eb53b9278.slice/crio-95e39a56dafdb527060b3b5801edd54fc23d579e649f5ab98ef176fc73dc9607 WatchSource:0}: Error finding container 95e39a56dafdb527060b3b5801edd54fc23d579e649f5ab98ef176fc73dc9607: Status 404 returned error can't find the container with id 95e39a56dafdb527060b3b5801edd54fc23d579e649f5ab98ef176fc73dc9607 Apr 17 18:10:10.447250 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:10.447209 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce6edf92_e02a_474f_864a_5bd1153d24d6.slice/crio-e6641796e5434e32ebe881f86fa96fd7505da5d4fe72932faf8885f52600d49c WatchSource:0}: Error finding container e6641796e5434e32ebe881f86fa96fd7505da5d4fe72932faf8885f52600d49c: Status 404 returned error can't find the container with id e6641796e5434e32ebe881f86fa96fd7505da5d4fe72932faf8885f52600d49c Apr 17 
18:10:10.448056 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:10.448030 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37d75c3f_00cd_47b8_80a8_e9ed9af8a917.slice/crio-cbdcc8ddc6b11f4b365fc39af6e8c5cb39c8a662f516f919a6921b772382d907 WatchSource:0}: Error finding container cbdcc8ddc6b11f4b365fc39af6e8c5cb39c8a662f516f919a6921b772382d907: Status 404 returned error can't find the container with id cbdcc8ddc6b11f4b365fc39af6e8c5cb39c8a662f516f919a6921b772382d907
Apr 17 18:10:10.450887 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:10.450864 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b758c69_286f_4851_8bab_2922e791af32.slice/crio-1e9c998d6ac4ab994f7ba13e2ebd29e9ea71422edb3043d29dc3dfab2dd233ef WatchSource:0}: Error finding container 1e9c998d6ac4ab994f7ba13e2ebd29e9ea71422edb3043d29dc3dfab2dd233ef: Status 404 returned error can't find the container with id 1e9c998d6ac4ab994f7ba13e2ebd29e9ea71422edb3043d29dc3dfab2dd233ef
Apr 17 18:10:10.451836 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:10.451814 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d5e6927_1286_418f_9057_4d85494090c2.slice/crio-afa66a5460b66ffd8c3c10db06d8ad4a03626265541511fbbdeca9156ae217f0 WatchSource:0}: Error finding container afa66a5460b66ffd8c3c10db06d8ad4a03626265541511fbbdeca9156ae217f0: Status 404 returned error can't find the container with id afa66a5460b66ffd8c3c10db06d8ad4a03626265541511fbbdeca9156ae217f0
Apr 17 18:10:10.453667 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:10.453574 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5e8e12b_b8af_4a75_8a47_fc6090430623.slice/crio-57ac08b47a22bdeccea0b0eb9d26f9ad6473e4c6a170c1029938dad0aa9f6aad WatchSource:0}: Error finding container 57ac08b47a22bdeccea0b0eb9d26f9ad6473e4c6a170c1029938dad0aa9f6aad: Status 404 returned error can't find the container with id 57ac08b47a22bdeccea0b0eb9d26f9ad6473e4c6a170c1029938dad0aa9f6aad
Apr 17 18:10:10.454056 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:10:10.453914 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75b35535_d264_462c_a620_1b59e57c1eef.slice/crio-e9ae99b5568e44825ef58e599132e2cc285e812482119140a95c08ee8addd354 WatchSource:0}: Error finding container e9ae99b5568e44825ef58e599132e2cc285e812482119140a95c08ee8addd354: Status 404 returned error can't find the container with id e9ae99b5568e44825ef58e599132e2cc285e812482119140a95c08ee8addd354
Apr 17 18:10:10.618401 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.618356 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 18:05:08 +0000 UTC" deadline="2027-10-12 00:31:12.765924387 +0000 UTC"
Apr 17 18:10:10.618401 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.618398 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13014h21m2.14752978s"
Apr 17 18:10:10.712029 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.711941 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8pwg5" event={"ID":"75b35535-d264-462c-a620-1b59e57c1eef","Type":"ContainerStarted","Data":"e9ae99b5568e44825ef58e599132e2cc285e812482119140a95c08ee8addd354"}
Apr 17 18:10:10.712987 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.712967 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" event={"ID":"3b758c69-286f-4851-8bab-2922e791af32","Type":"ContainerStarted","Data":"1e9c998d6ac4ab994f7ba13e2ebd29e9ea71422edb3043d29dc3dfab2dd233ef"}
Apr 17 18:10:10.713894 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.713862 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" event={"ID":"37d75c3f-00cd-47b8-80a8-e9ed9af8a917","Type":"ContainerStarted","Data":"cbdcc8ddc6b11f4b365fc39af6e8c5cb39c8a662f516f919a6921b772382d907"}
Apr 17 18:10:10.717724 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.715317 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-142.ec2.internal" event={"ID":"652ed875b090027ea0cb6468dfc50153","Type":"ContainerStarted","Data":"a82713a4217e3061b0d6408dc18ff80bd7ab2c5a9936b0e03b597b509e725fb2"}
Apr 17 18:10:10.721684 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.719854 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj2s9" event={"ID":"b5e8e12b-b8af-4a75-8a47-fc6090430623","Type":"ContainerStarted","Data":"57ac08b47a22bdeccea0b0eb9d26f9ad6473e4c6a170c1029938dad0aa9f6aad"}
Apr 17 18:10:10.721684 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.721368 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" event={"ID":"9d5e6927-1286-418f-9057-4d85494090c2","Type":"ContainerStarted","Data":"afa66a5460b66ffd8c3c10db06d8ad4a03626265541511fbbdeca9156ae217f0"}
Apr 17 18:10:10.723741 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.723706 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-st4n8" event={"ID":"ce6edf92-e02a-474f-864a-5bd1153d24d6","Type":"ContainerStarted","Data":"e6641796e5434e32ebe881f86fa96fd7505da5d4fe72932faf8885f52600d49c"}
Apr 17 18:10:10.725165 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.725134 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rl4d7" event={"ID":"854b67d6-4dbb-4558-9444-235eb53b9278","Type":"ContainerStarted","Data":"95e39a56dafdb527060b3b5801edd54fc23d579e649f5ab98ef176fc73dc9607"}
Apr 17 18:10:10.726113 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.726093 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-45lrm" event={"ID":"1ab1f86f-6203-4036-b656-aeacb3b958ba","Type":"ContainerStarted","Data":"1fa9f209f78accf720986f7ccac6be376b4e4ec615c69b83c9d9f3bf846da889"}
Apr 17 18:10:10.728965 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.728945 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9nr2v" event={"ID":"ce7310b9-648a-4042-86fe-ef118fc7af4e","Type":"ContainerStarted","Data":"d5dd00e83365c2a5c0457b21249d03beaedcbe2670fbda5386b31343ec796df2"}
Apr 17 18:10:10.729624 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:10.729592 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-142.ec2.internal" podStartSLOduration=2.729582685 podStartE2EDuration="2.729582685s" podCreationTimestamp="2026-04-17 18:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:10:10.729190403 +0000 UTC m=+3.604583113" watchObservedRunningTime="2026-04-17 18:10:10.729582685 +0000 UTC m=+3.604975394"
Apr 17 18:10:11.208904 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:11.208805 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:11.209060 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:11.208905 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fcdh\" (UniqueName: \"kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh\") pod \"network-check-target-kgjhh\" (UID: \"d2df8bcd-2956-4041-abb8-966ec57fce1d\") " pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:11.209060 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:11.209052 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 18:10:11.209171 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:11.209073 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 18:10:11.209171 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:11.209087 2583 projected.go:194] Error preparing data for projected volume kube-api-access-7fcdh for pod openshift-network-diagnostics/network-check-target-kgjhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:11.209171 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:11.209147 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh podName:d2df8bcd-2956-4041-abb8-966ec57fce1d nodeName:}" failed. No retries permitted until 2026-04-17 18:10:13.209129619 +0000 UTC m=+6.084522309 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fcdh" (UniqueName: "kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh") pod "network-check-target-kgjhh" (UID: "d2df8bcd-2956-4041-abb8-966ec57fce1d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:11.209616 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:11.209596 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:11.209698 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:11.209651 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs podName:2ea7fcad-19ae-42ab-8026-113afe4c2f23 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:13.209636555 +0000 UTC m=+6.085029247 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs") pod "network-metrics-daemon-6d44x" (UID: "2ea7fcad-19ae-42ab-8026-113afe4c2f23") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:11.309888 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:11.309839 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:11.310051 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:11.310000 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 18:10:11.310129 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:11.310060 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret podName:15a8ea75-9ba5-4f38-9236-0c6de8eae8e9 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:13.310042662 +0000 UTC m=+6.185435361 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret") pod "global-pull-secret-syncer-pqbjx" (UID: "15a8ea75-9ba5-4f38-9236-0c6de8eae8e9") : object "kube-system"/"original-pull-secret" not registered
Apr 17 18:10:11.705309 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:11.705196 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:11.705709 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:11.705349 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:11.705847 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:11.705812 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:11.705942 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:11.705918 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:11.706022 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:11.706009 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:11.706119 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:11.706094 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9"
Apr 17 18:10:12.742882 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:12.742754 2583 generic.go:358] "Generic (PLEG): container finished" podID="acff7d4a666780b2ff38ca9f5fa48a1a" containerID="3100f9991aa7a424294e4f8ef4c43f6130731ef285c2bf1c305ff39d6c1fc7bf" exitCode=0
Apr 17 18:10:12.742882 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:12.742804 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" event={"ID":"acff7d4a666780b2ff38ca9f5fa48a1a","Type":"ContainerDied","Data":"3100f9991aa7a424294e4f8ef4c43f6130731ef285c2bf1c305ff39d6c1fc7bf"}
Apr 17 18:10:13.225162 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:13.225055 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fcdh\" (UniqueName: \"kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh\") pod \"network-check-target-kgjhh\" (UID: \"d2df8bcd-2956-4041-abb8-966ec57fce1d\") " pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:13.225162 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:13.225116 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:13.225433 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:13.225249 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:13.225433 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:13.225398 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 18:10:13.225433 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:13.225418 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 18:10:13.225433 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:13.225431 2583 projected.go:194] Error preparing data for projected volume kube-api-access-7fcdh for pod openshift-network-diagnostics/network-check-target-kgjhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:13.225615 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:13.225488 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh podName:d2df8bcd-2956-4041-abb8-966ec57fce1d nodeName:}" failed. No retries permitted until 2026-04-17 18:10:17.225466862 +0000 UTC m=+10.100859565 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fcdh" (UniqueName: "kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh") pod "network-check-target-kgjhh" (UID: "d2df8bcd-2956-4041-abb8-966ec57fce1d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:13.225942 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:13.225911 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs podName:2ea7fcad-19ae-42ab-8026-113afe4c2f23 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:17.225892565 +0000 UTC m=+10.101285256 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs") pod "network-metrics-daemon-6d44x" (UID: "2ea7fcad-19ae-42ab-8026-113afe4c2f23") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:13.325760 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:13.325717 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:13.325946 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:13.325935 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 18:10:13.326010 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:13.326001 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret podName:15a8ea75-9ba5-4f38-9236-0c6de8eae8e9 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:17.325982644 +0000 UTC m=+10.201375339 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret") pod "global-pull-secret-syncer-pqbjx" (UID: "15a8ea75-9ba5-4f38-9236-0c6de8eae8e9") : object "kube-system"/"original-pull-secret" not registered
Apr 17 18:10:13.704166 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:13.704079 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:13.704359 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:13.704230 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:13.704781 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:13.704717 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:13.704903 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:13.704819 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9"
Apr 17 18:10:13.705028 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:13.704978 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:13.705136 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:13.705110 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:15.703387 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:15.703352 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:15.703764 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:15.703353 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:15.703764 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:15.703475 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:15.703764 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:15.703578 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9"
Apr 17 18:10:15.703875 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:15.703788 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:15.703875 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:15.703842 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:17.259930 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:17.259885 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fcdh\" (UniqueName: \"kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh\") pod \"network-check-target-kgjhh\" (UID: \"d2df8bcd-2956-4041-abb8-966ec57fce1d\") " pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:17.260353 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:17.259954 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:17.260353 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:17.260082 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:17.260353 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:17.260146 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs podName:2ea7fcad-19ae-42ab-8026-113afe4c2f23 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:25.260127324 +0000 UTC m=+18.135520016 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs") pod "network-metrics-daemon-6d44x" (UID: "2ea7fcad-19ae-42ab-8026-113afe4c2f23") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:17.260586 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:17.260570 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 18:10:17.260626 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:17.260592 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 18:10:17.260626 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:17.260605 2583 projected.go:194] Error preparing data for projected volume kube-api-access-7fcdh for pod openshift-network-diagnostics/network-check-target-kgjhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:17.260682 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:17.260650 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh podName:d2df8bcd-2956-4041-abb8-966ec57fce1d nodeName:}" failed. No retries permitted until 2026-04-17 18:10:25.260633678 +0000 UTC m=+18.136026372 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fcdh" (UniqueName: "kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh") pod "network-check-target-kgjhh" (UID: "d2df8bcd-2956-4041-abb8-966ec57fce1d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:17.361035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:17.360997 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:17.361229 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:17.361194 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 18:10:17.361380 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:17.361263 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret podName:15a8ea75-9ba5-4f38-9236-0c6de8eae8e9 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:25.361244137 +0000 UTC m=+18.236636828 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret") pod "global-pull-secret-syncer-pqbjx" (UID: "15a8ea75-9ba5-4f38-9236-0c6de8eae8e9") : object "kube-system"/"original-pull-secret" not registered
Apr 17 18:10:17.704547 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:17.704464 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:17.704709 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:17.704590 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:17.704993 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:17.704975 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:17.705091 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:17.705072 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9"
Apr 17 18:10:17.705234 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:17.705214 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:17.705345 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:17.705326 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:19.704126 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:19.704089 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:19.704579 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:19.704203 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9"
Apr 17 18:10:19.704579 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:19.704089 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:19.704579 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:19.704268 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:19.704579 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:19.704089 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:19.704579 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:19.704366 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:21.704238 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:21.704194 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:21.704680 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:21.704194 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:21.704680 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:21.704353 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9"
Apr 17 18:10:21.704680 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:21.704194 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:21.704680 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:21.704409 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:21.704680 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:21.704489 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:23.703929 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:23.703842 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:23.704380 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:23.703842 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:23.704380 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:23.703980 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9"
Apr 17 18:10:23.704380 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:23.703842 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:23.704380 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:23.704144 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:23.704380 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:23.704041 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:25.323350 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:25.323298 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fcdh\" (UniqueName: \"kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh\") pod \"network-check-target-kgjhh\" (UID: \"d2df8bcd-2956-4041-abb8-966ec57fce1d\") " pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:25.323928 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:25.323377 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:25.323928 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:25.323400 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 18:10:25.323928 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:25.323419 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 18:10:25.323928 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:25.323432 2583 projected.go:194] Error preparing data for projected volume kube-api-access-7fcdh for pod openshift-network-diagnostics/network-check-target-kgjhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:25.323928 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:25.323481 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:25.323928 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:25.323522 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh podName:d2df8bcd-2956-4041-abb8-966ec57fce1d nodeName:}" failed. No retries permitted until 2026-04-17 18:10:41.323507705 +0000 UTC m=+34.198900392 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fcdh" (UniqueName: "kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh") pod "network-check-target-kgjhh" (UID: "d2df8bcd-2956-4041-abb8-966ec57fce1d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:25.323928 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:25.323550 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs podName:2ea7fcad-19ae-42ab-8026-113afe4c2f23 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:41.323530784 +0000 UTC m=+34.198923481 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs") pod "network-metrics-daemon-6d44x" (UID: "2ea7fcad-19ae-42ab-8026-113afe4c2f23") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:10:25.424577 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:25.424536 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:25.424746 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:25.424676 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:25.424746 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:25.424738 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret podName:15a8ea75-9ba5-4f38-9236-0c6de8eae8e9 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:41.424722754 +0000 UTC m=+34.300115443 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret") pod "global-pull-secret-syncer-pqbjx" (UID: "15a8ea75-9ba5-4f38-9236-0c6de8eae8e9") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:25.704505 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:25.703977 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh" Apr 17 18:10:25.704505 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:25.704003 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:25.704505 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:25.704099 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x" Apr 17 18:10:25.704505 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:25.704107 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d" Apr 17 18:10:25.704505 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:25.704202 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23" Apr 17 18:10:25.704505 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:25.704268 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9" Apr 17 18:10:27.704438 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:27.704405 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:27.704808 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:27.704488 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9" Apr 17 18:10:27.704808 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:27.704585 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh" Apr 17 18:10:27.704808 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:27.704695 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d" Apr 17 18:10:27.704808 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:27.704743 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x" Apr 17 18:10:27.704951 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:27.704898 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23" Apr 17 18:10:28.773767 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.773379 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" event={"ID":"3b758c69-286f-4851-8bab-2922e791af32","Type":"ContainerStarted","Data":"29940def4461c4c3045154264ead6d19295db094debfd813dab0ceaf32bdac23"} Apr 17 18:10:28.773767 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.773645 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" event={"ID":"3b758c69-286f-4851-8bab-2922e791af32","Type":"ContainerStarted","Data":"4adb23f15a953353eb49b0989c74837b91a4775b5eb45b610aa05d4594464d67"} Apr 17 18:10:28.773767 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.773663 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" event={"ID":"3b758c69-286f-4851-8bab-2922e791af32","Type":"ContainerStarted","Data":"496e430b6de2474735efb9bf3e06be526eaef5ffc0fee56055ca3d9280cb8447"} Apr 17 18:10:28.773767 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.773678 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" event={"ID":"3b758c69-286f-4851-8bab-2922e791af32","Type":"ContainerStarted","Data":"7976b8add269d6928dd6503e529aefb3d9381755be90014aec3314893979de1c"} Apr 17 18:10:28.773767 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.773691 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" event={"ID":"3b758c69-286f-4851-8bab-2922e791af32","Type":"ContainerStarted","Data":"4ddbef605fb0f716afddddc11bb8763865174f9e4f7325611cabc14c01dc476f"} Apr 17 18:10:28.773767 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.773703 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" 
event={"ID":"3b758c69-286f-4851-8bab-2922e791af32","Type":"ContainerStarted","Data":"465882e0b30372588417277b9b835d72e2c819be443827f96df45f6d5a85babc"} Apr 17 18:10:28.774976 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.774953 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" event={"ID":"37d75c3f-00cd-47b8-80a8-e9ed9af8a917","Type":"ContainerStarted","Data":"b5706fb69a802962b3b481344748b1b0adcb6bc58f00b45182a594e643cc2bba"} Apr 17 18:10:28.776251 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.776229 2583 generic.go:358] "Generic (PLEG): container finished" podID="b5e8e12b-b8af-4a75-8a47-fc6090430623" containerID="60b8b3495b69f4c5703e41e43e828ba520bb6bc86021a7bb0ca08680e3041713" exitCode=0 Apr 17 18:10:28.776353 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.776313 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj2s9" event={"ID":"b5e8e12b-b8af-4a75-8a47-fc6090430623","Type":"ContainerDied","Data":"60b8b3495b69f4c5703e41e43e828ba520bb6bc86021a7bb0ca08680e3041713"} Apr 17 18:10:28.778047 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.778025 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" event={"ID":"9d5e6927-1286-418f-9057-4d85494090c2","Type":"ContainerStarted","Data":"2a1d20c8f312c21289f82ed7efc378848abf04f6210d65a95ca659429d4c5663"} Apr 17 18:10:28.780149 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.780117 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-st4n8" event={"ID":"ce6edf92-e02a-474f-864a-5bd1153d24d6","Type":"ContainerStarted","Data":"ad3aa5066999974fdf6feb769ec5c5fdd3ed0fc62109faa1b1d926e76f3e7c13"} Apr 17 18:10:28.781479 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.781458 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rl4d7" 
event={"ID":"854b67d6-4dbb-4558-9444-235eb53b9278","Type":"ContainerStarted","Data":"c09adf4663846fd255a3e9671a3f83b5bbf9a93ff2c87983cd3e312f38cc865a"} Apr 17 18:10:28.783161 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.783140 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" event={"ID":"acff7d4a666780b2ff38ca9f5fa48a1a","Type":"ContainerStarted","Data":"395ed936fafdf529a61702e520f54e4be62b31fc1c7a250b5bae1d53091c1e7f"} Apr 17 18:10:28.784544 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.784509 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9nr2v" event={"ID":"ce7310b9-648a-4042-86fe-ef118fc7af4e","Type":"ContainerStarted","Data":"5b342d7745513bb4fd6b215352fa6e2f8a5b3c54d6080b9d962bf0d2121b60a1"} Apr 17 18:10:28.785767 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.785745 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8pwg5" event={"ID":"75b35535-d264-462c-a620-1b59e57c1eef","Type":"ContainerStarted","Data":"0f833e461d7a808c49aba2a49e7d66a205f267834e9798e09817f53d6cbdb26b"} Apr 17 18:10:28.791966 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.791929 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7tsjs" podStartSLOduration=4.378519396 podStartE2EDuration="21.791915272s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:10:10.450967279 +0000 UTC m=+3.326359967" lastFinishedPulling="2026-04-17 18:10:27.864363149 +0000 UTC m=+20.739755843" observedRunningTime="2026-04-17 18:10:28.791401372 +0000 UTC m=+21.666794083" watchObservedRunningTime="2026-04-17 18:10:28.791915272 +0000 UTC m=+21.667307987" Apr 17 18:10:28.807806 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.807746 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-142.ec2.internal" podStartSLOduration=20.807728181 podStartE2EDuration="20.807728181s" podCreationTimestamp="2026-04-17 18:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:10:28.806572554 +0000 UTC m=+21.681965262" watchObservedRunningTime="2026-04-17 18:10:28.807728181 +0000 UTC m=+21.683120891" Apr 17 18:10:28.820529 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.820469 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8pwg5" podStartSLOduration=4.388950748 podStartE2EDuration="21.820451075s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:10:10.456078993 +0000 UTC m=+3.331471695" lastFinishedPulling="2026-04-17 18:10:27.88757933 +0000 UTC m=+20.762972022" observedRunningTime="2026-04-17 18:10:28.820030726 +0000 UTC m=+21.695423427" watchObservedRunningTime="2026-04-17 18:10:28.820451075 +0000 UTC m=+21.695843786" Apr 17 18:10:28.834918 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.834868 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-st4n8" podStartSLOduration=4.420111931 podStartE2EDuration="21.834852841s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:10:10.449619737 +0000 UTC m=+3.325012429" lastFinishedPulling="2026-04-17 18:10:27.864360648 +0000 UTC m=+20.739753339" observedRunningTime="2026-04-17 18:10:28.83430338 +0000 UTC m=+21.709696092" watchObservedRunningTime="2026-04-17 18:10:28.834852841 +0000 UTC m=+21.710245552" Apr 17 18:10:28.851526 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.851466 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9nr2v" podStartSLOduration=4.378613183 
podStartE2EDuration="21.851449682s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:10:10.456503285 +0000 UTC m=+3.331895975" lastFinishedPulling="2026-04-17 18:10:27.929339777 +0000 UTC m=+20.804732474" observedRunningTime="2026-04-17 18:10:28.850837956 +0000 UTC m=+21.726230666" watchObservedRunningTime="2026-04-17 18:10:28.851449682 +0000 UTC m=+21.726842395" Apr 17 18:10:28.889140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:28.889092 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rl4d7" podStartSLOduration=4.472144241 podStartE2EDuration="21.889075168s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:10:10.447489785 +0000 UTC m=+3.322882474" lastFinishedPulling="2026-04-17 18:10:27.864420712 +0000 UTC m=+20.739813401" observedRunningTime="2026-04-17 18:10:28.889044495 +0000 UTC m=+21.764437205" watchObservedRunningTime="2026-04-17 18:10:28.889075168 +0000 UTC m=+21.764467867" Apr 17 18:10:29.049006 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.048979 2583 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 18:10:29.263542 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.263498 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-st4n8" Apr 17 18:10:29.264136 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.264112 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-st4n8" Apr 17 18:10:29.650054 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.649963 2583 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T18:10:29.049000391Z","UUID":"a5d27089-c924-4177-a80c-8b84c1a9bf87","Handler":null,"Name":"","Endpoint":""} Apr 17 18:10:29.652018 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.651995 2583 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 18:10:29.652150 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.652027 2583 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 18:10:29.704313 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.704224 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x" Apr 17 18:10:29.704532 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:29.704360 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23" Apr 17 18:10:29.704726 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.704706 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:29.704802 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:29.704783 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9" Apr 17 18:10:29.704860 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.704833 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh" Apr 17 18:10:29.704912 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:29.704898 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d" Apr 17 18:10:29.790257 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.790219 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" event={"ID":"9d5e6927-1286-418f-9057-4d85494090c2","Type":"ContainerStarted","Data":"0b4643844b6bb4c54a8249abb28a2ef3832e786511ce97e1c95b90f73d2254ff"} Apr 17 18:10:29.792431 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.791963 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-45lrm" event={"ID":"1ab1f86f-6203-4036-b656-aeacb3b958ba","Type":"ContainerStarted","Data":"d8e0e8671b0d2e5ff3da8c453067f22648cc1c377c4f6ef04e85a73096be4186"} Apr 17 18:10:29.792717 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.792698 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-st4n8" Apr 17 18:10:29.793367 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.793241 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-st4n8" Apr 17 18:10:29.807065 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:29.807003 2583 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-45lrm" podStartSLOduration=5.365187272 podStartE2EDuration="22.806985715s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:10:10.446709435 +0000 UTC m=+3.322102138" lastFinishedPulling="2026-04-17 18:10:27.888507877 +0000 UTC m=+20.763900581" observedRunningTime="2026-04-17 18:10:29.806157779 +0000 UTC m=+22.681550490" watchObservedRunningTime="2026-04-17 18:10:29.806985715 +0000 UTC m=+22.682378424" Apr 17 18:10:30.796743 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:30.796483 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" event={"ID":"3b758c69-286f-4851-8bab-2922e791af32","Type":"ContainerStarted","Data":"a9c582783a3f35a83ebf76255de2f90997301146d0f0fe66c6c9866fae3f9916"} Apr 17 18:10:30.798529 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:30.798492 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" event={"ID":"9d5e6927-1286-418f-9057-4d85494090c2","Type":"ContainerStarted","Data":"12c85a819d90a6d30267e2efb392b66fcdc8b3bb2c17ce033c3222a76a1fde81"} Apr 17 18:10:30.824158 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:30.824106 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq92d" podStartSLOduration=4.404713053 podStartE2EDuration="23.824089174s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:10:10.454988739 +0000 UTC m=+3.330381428" lastFinishedPulling="2026-04-17 18:10:29.874364845 +0000 UTC m=+22.749757549" observedRunningTime="2026-04-17 18:10:30.823847954 +0000 UTC m=+23.699240663" watchObservedRunningTime="2026-04-17 18:10:30.824089174 +0000 UTC m=+23.699481884" Apr 17 18:10:31.703765 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:31.703727 2583 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x" Apr 17 18:10:31.703979 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:31.703766 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:31.703979 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:31.703727 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh" Apr 17 18:10:31.703979 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:31.703861 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23" Apr 17 18:10:31.703979 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:31.703940 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d" Apr 17 18:10:31.704166 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:31.704052 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9" Apr 17 18:10:33.704344 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:33.704137 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:10:33.704999 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:33.704138 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x" Apr 17 18:10:33.704999 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:33.704422 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9" Apr 17 18:10:33.704999 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:33.704538 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23" Apr 17 18:10:33.704999 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:33.704138 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:33.704999 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:33.704638 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:33.807108 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:33.807069 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" event={"ID":"3b758c69-286f-4851-8bab-2922e791af32","Type":"ContainerStarted","Data":"020846e1bdfbe6fc8697838273ea707f765c3d324a512568a74f2f209349b3c7"}
Apr 17 18:10:33.807414 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:33.807391 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:33.808834 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:33.808804 2583 generic.go:358] "Generic (PLEG): container finished" podID="b5e8e12b-b8af-4a75-8a47-fc6090430623" containerID="0e23c8da1a54e0899bec574d09bc46fa42e7711d0885925283a5b2d3b9305741" exitCode=0
Apr 17 18:10:33.808942 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:33.808859 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj2s9" event={"ID":"b5e8e12b-b8af-4a75-8a47-fc6090430623","Type":"ContainerDied","Data":"0e23c8da1a54e0899bec574d09bc46fa42e7711d0885925283a5b2d3b9305741"}
Apr 17 18:10:33.823841 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:33.823815 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:33.840446 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:33.840396 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" podStartSLOduration=9.333370263 podStartE2EDuration="26.840382794s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:10:10.452829701 +0000 UTC m=+3.328222388" lastFinishedPulling="2026-04-17 18:10:27.959842219 +0000 UTC m=+20.835234919" observedRunningTime="2026-04-17 18:10:33.840168205 +0000 UTC m=+26.715560916" watchObservedRunningTime="2026-04-17 18:10:33.840382794 +0000 UTC m=+26.715775504"
Apr 17 18:10:34.811818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:34.811788 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:34.812158 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:34.811832 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:34.835925 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:34.835900 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl"
Apr 17 18:10:35.146289 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:35.146086 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pqbjx"]
Apr 17 18:10:35.146456 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:35.146405 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:35.146536 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:35.146518 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9"
Apr 17 18:10:35.146725 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:35.146707 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kgjhh"]
Apr 17 18:10:35.146818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:35.146808 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:35.146924 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:35.146890 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:35.151044 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:35.148836 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6d44x"]
Apr 17 18:10:35.151044 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:35.148973 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:35.151044 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:35.149104 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:35.814823 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:35.814787 2583 generic.go:358] "Generic (PLEG): container finished" podID="b5e8e12b-b8af-4a75-8a47-fc6090430623" containerID="3f98cd8e1a91545e7a9b2f4f34f11d2e1a8660be7a7b200f4f74eeffe5b96357" exitCode=0
Apr 17 18:10:35.815203 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:35.814867 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj2s9" event={"ID":"b5e8e12b-b8af-4a75-8a47-fc6090430623","Type":"ContainerDied","Data":"3f98cd8e1a91545e7a9b2f4f34f11d2e1a8660be7a7b200f4f74eeffe5b96357"}
Apr 17 18:10:36.703732 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:36.703693 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:36.703892 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:36.703693 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:36.703892 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:36.703820 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:36.703965 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:36.703907 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:36.703965 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:36.703698 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:36.704034 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:36.704017 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9"
Apr 17 18:10:37.819948 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:37.819911 2583 generic.go:358] "Generic (PLEG): container finished" podID="b5e8e12b-b8af-4a75-8a47-fc6090430623" containerID="3e5afc4781079740ef058583507897d1ede318ae0fe7dc3c24a06e57c0b4cbe0" exitCode=0
Apr 17 18:10:37.820629 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:37.819955 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj2s9" event={"ID":"b5e8e12b-b8af-4a75-8a47-fc6090430623","Type":"ContainerDied","Data":"3e5afc4781079740ef058583507897d1ede318ae0fe7dc3c24a06e57c0b4cbe0"}
Apr 17 18:10:38.704047 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:38.703999 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:38.704187 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:38.704069 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:38.704187 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:38.704165 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9"
Apr 17 18:10:38.704289 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:38.704196 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:38.704289 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:38.704249 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:38.704396 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:38.704360 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:40.704153 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:40.704119 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:40.704645 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:40.704244 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:40.704645 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:40.704289 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:40.704645 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:40.704245 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqbjx" podUID="15a8ea75-9ba5-4f38-9236-0c6de8eae8e9"
Apr 17 18:10:40.704645 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:40.704374 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kgjhh" podUID="d2df8bcd-2956-4041-abb8-966ec57fce1d"
Apr 17 18:10:40.704645 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:40.704485 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23"
Apr 17 18:10:40.922865 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:40.922789 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-142.ec2.internal" event="NodeReady"
Apr 17 18:10:40.923028 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:40.922964 2583 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 18:10:40.965393 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:40.965356 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xqpjb"]
Apr 17 18:10:40.982941 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:40.982908 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6f76l"]
Apr 17 18:10:40.983130 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:40.983055 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:40.985913 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:40.985887 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 18:10:40.985913 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:40.985905 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 18:10:40.986127 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:40.986113 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-46dtx\""
Apr 17 18:10:41.001081 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.001055 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xqpjb"]
Apr 17 18:10:41.001221 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.001089 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6f76l"]
Apr 17 18:10:41.001221 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.001210 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:10:41.003725 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.003697 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 18:10:41.003833 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.003741 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hwgzd\""
Apr 17 18:10:41.004039 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.003999 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 18:10:41.004106 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.004054 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 18:10:41.148876 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.148839 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-tmp-dir\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.149057 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.148892 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vptt8\" (UniqueName: \"kubernetes.io/projected/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-kube-api-access-vptt8\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.149057 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.148946 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:10:41.149057 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.149002 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdh8t\" (UniqueName: \"kubernetes.io/projected/1f9df124-3418-493f-8f5e-bd5ea9df2004-kube-api-access-pdh8t\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:10:41.149214 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.149072 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.149214 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.149138 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-config-volume\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.249876 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.249778 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-tmp-dir\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.249876 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.249839 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vptt8\" (UniqueName: \"kubernetes.io/projected/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-kube-api-access-vptt8\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.250110 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.249920 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:10:41.250110 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.249951 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdh8t\" (UniqueName: \"kubernetes.io/projected/1f9df124-3418-493f-8f5e-bd5ea9df2004-kube-api-access-pdh8t\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:10:41.250110 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.249988 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.250110 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.250016 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-config-volume\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.250110 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.250047 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 18:10:41.250393 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.250123 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 18:10:41.250393 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.250126 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert podName:1f9df124-3418-493f-8f5e-bd5ea9df2004 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:41.750104726 +0000 UTC m=+34.625497417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert") pod "ingress-canary-6f76l" (UID: "1f9df124-3418-493f-8f5e-bd5ea9df2004") : secret "canary-serving-cert" not found
Apr 17 18:10:41.250393 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.250210 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls podName:31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c nodeName:}" failed. No retries permitted until 2026-04-17 18:10:41.750177887 +0000 UTC m=+34.625570575 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls") pod "dns-default-xqpjb" (UID: "31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c") : secret "dns-default-metrics-tls" not found
Apr 17 18:10:41.250393 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.250239 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-tmp-dir\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.250610 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.250592 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-config-volume\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.264322 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.264297 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vptt8\" (UniqueName: \"kubernetes.io/projected/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-kube-api-access-vptt8\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.264458 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.264392 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdh8t\" (UniqueName: \"kubernetes.io/projected/1f9df124-3418-493f-8f5e-bd5ea9df2004-kube-api-access-pdh8t\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:10:41.351293 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.351241 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fcdh\" (UniqueName: \"kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh\") pod \"network-check-target-kgjhh\" (UID: \"d2df8bcd-2956-4041-abb8-966ec57fce1d\") " pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:41.351470 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.351333 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:41.351470 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.351421 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 18:10:41.351470 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.351451 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 18:10:41.351470 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.351465 2583 projected.go:194] Error preparing data for projected volume kube-api-access-7fcdh for pod openshift-network-diagnostics/network-check-target-kgjhh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:41.351648 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.351425 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:41.351648 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.351524 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh podName:d2df8bcd-2956-4041-abb8-966ec57fce1d nodeName:}" failed. No retries permitted until 2026-04-17 18:11:13.351502479 +0000 UTC m=+66.226895177 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fcdh" (UniqueName: "kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh") pod "network-check-target-kgjhh" (UID: "d2df8bcd-2956-4041-abb8-966ec57fce1d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:41.351648 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.351565 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs podName:2ea7fcad-19ae-42ab-8026-113afe4c2f23 nodeName:}" failed. No retries permitted until 2026-04-17 18:11:13.35155016 +0000 UTC m=+66.226942851 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs") pod "network-metrics-daemon-6d44x" (UID: "2ea7fcad-19ae-42ab-8026-113afe4c2f23") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:41.452385 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.452346 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:41.452580 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.452519 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 18:10:41.452626 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.452599 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret podName:15a8ea75-9ba5-4f38-9236-0c6de8eae8e9 nodeName:}" failed. No retries permitted until 2026-04-17 18:11:13.452580084 +0000 UTC m=+66.327972783 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret") pod "global-pull-secret-syncer-pqbjx" (UID: "15a8ea75-9ba5-4f38-9236-0c6de8eae8e9") : object "kube-system"/"original-pull-secret" not registered
Apr 17 18:10:41.755776 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.755735 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:10:41.756420 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:41.755808 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:41.756420 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.755902 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 18:10:41.756420 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.755970 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 18:10:41.756420 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.755990 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert podName:1f9df124-3418-493f-8f5e-bd5ea9df2004 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:42.755965843 +0000 UTC m=+35.631358535 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert") pod "ingress-canary-6f76l" (UID: "1f9df124-3418-493f-8f5e-bd5ea9df2004") : secret "canary-serving-cert" not found
Apr 17 18:10:41.756420 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:41.756020 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls podName:31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c nodeName:}" failed. No retries permitted until 2026-04-17 18:10:42.756003816 +0000 UTC m=+35.631396506 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls") pod "dns-default-xqpjb" (UID: "31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c") : secret "dns-default-metrics-tls" not found
Apr 17 18:10:42.704161 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:42.704120 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx"
Apr 17 18:10:42.704369 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:42.704120 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh"
Apr 17 18:10:42.704369 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:42.704120 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:10:42.708085 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:42.707218 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 18:10:42.708085 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:42.707293 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 18:10:42.708680 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:42.708656 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 18:10:42.708875 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:42.708859 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-c85k4\""
Apr 17 18:10:42.709060 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:42.709041 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5nnqt\""
Apr 17 18:10:42.709152 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:42.709134 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 18:10:42.764692 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:42.764659 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:10:42.765112 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:42.764716 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:42.765112 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:42.764827 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 18:10:42.765112 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:42.764840 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 18:10:42.765112 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:42.764885 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls podName:31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c nodeName:}" failed. No retries permitted until 2026-04-17 18:10:44.764872691 +0000 UTC m=+37.640265379 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls") pod "dns-default-xqpjb" (UID: "31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c") : secret "dns-default-metrics-tls" not found
Apr 17 18:10:42.765112 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:42.764936 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert podName:1f9df124-3418-493f-8f5e-bd5ea9df2004 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:44.764918108 +0000 UTC m=+37.640310802 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert") pod "ingress-canary-6f76l" (UID: "1f9df124-3418-493f-8f5e-bd5ea9df2004") : secret "canary-serving-cert" not found
Apr 17 18:10:44.781246 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:44.781007 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:10:44.781246 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:44.781212 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:44.781755 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:44.781158 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 18:10:44.781755 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:44.781335 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 18:10:44.781755 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:44.781356 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert podName:1f9df124-3418-493f-8f5e-bd5ea9df2004 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:48.781334504 +0000 UTC m=+41.656727192 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert") pod "ingress-canary-6f76l" (UID: "1f9df124-3418-493f-8f5e-bd5ea9df2004") : secret "canary-serving-cert" not found
Apr 17 18:10:44.781755 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:44.781379 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls podName:31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c nodeName:}" failed. No retries permitted until 2026-04-17 18:10:48.781368357 +0000 UTC m=+41.656761045 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls") pod "dns-default-xqpjb" (UID: "31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c") : secret "dns-default-metrics-tls" not found
Apr 17 18:10:44.835757 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:44.835723 2583 generic.go:358] "Generic (PLEG): container finished" podID="b5e8e12b-b8af-4a75-8a47-fc6090430623" containerID="bc7425ad47f03bf8a65b4e9af876da4ed5be84a336f4a9bba488e697eb893ceb" exitCode=0
Apr 17 18:10:44.835911 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:44.835785 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj2s9" event={"ID":"b5e8e12b-b8af-4a75-8a47-fc6090430623","Type":"ContainerDied","Data":"bc7425ad47f03bf8a65b4e9af876da4ed5be84a336f4a9bba488e697eb893ceb"}
Apr 17 18:10:45.840207 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:45.840175 2583 generic.go:358] "Generic (PLEG): container finished" podID="b5e8e12b-b8af-4a75-8a47-fc6090430623" containerID="eb1f3bfb3a14ac10971e83d7b5100a8a0716d923e0b201c17af537b58d318628" exitCode=0
Apr 17 18:10:45.840612 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:45.840228 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj2s9" event={"ID":"b5e8e12b-b8af-4a75-8a47-fc6090430623","Type":"ContainerDied","Data":"eb1f3bfb3a14ac10971e83d7b5100a8a0716d923e0b201c17af537b58d318628"}
Apr 17 18:10:46.845353 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:46.845314 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sj2s9" event={"ID":"b5e8e12b-b8af-4a75-8a47-fc6090430623","Type":"ContainerStarted","Data":"4a6b0d47d5cde85cfa502bf969ce8093920c87369d155e30ee1750451c49183c"}
Apr 17 18:10:46.867053 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:46.867004 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sj2s9" podStartSLOduration=6.231152726 podStartE2EDuration="39.866989446s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:10:10.455573429 +0000 UTC m=+3.330966116" lastFinishedPulling="2026-04-17 18:10:44.091410148 +0000 UTC m=+36.966802836" observedRunningTime="2026-04-17 18:10:46.865892426 +0000 UTC m=+39.741285136" watchObservedRunningTime="2026-04-17 18:10:46.866989446 +0000 UTC m=+39.742382156"
Apr 17 18:10:48.812290 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:48.812227 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:10:48.812718 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:48.812321 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:10:48.812718 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:48.812384 2583
secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:10:48.812718 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:48.812448 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert podName:1f9df124-3418-493f-8f5e-bd5ea9df2004 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:56.8124331 +0000 UTC m=+49.687825789 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert") pod "ingress-canary-6f76l" (UID: "1f9df124-3418-493f-8f5e-bd5ea9df2004") : secret "canary-serving-cert" not found Apr 17 18:10:48.812718 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:48.812459 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:10:48.812718 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:48.812507 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls podName:31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c nodeName:}" failed. No retries permitted until 2026-04-17 18:10:56.812492892 +0000 UTC m=+49.687885581 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls") pod "dns-default-xqpjb" (UID: "31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c") : secret "dns-default-metrics-tls" not found Apr 17 18:10:56.869692 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:56.869659 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l" Apr 17 18:10:56.870195 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:10:56.869713 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb" Apr 17 18:10:56.870195 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:56.869819 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:10:56.870195 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:56.869841 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:10:56.870195 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:56.869882 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert podName:1f9df124-3418-493f-8f5e-bd5ea9df2004 nodeName:}" failed. No retries permitted until 2026-04-17 18:11:12.869865884 +0000 UTC m=+65.745258580 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert") pod "ingress-canary-6f76l" (UID: "1f9df124-3418-493f-8f5e-bd5ea9df2004") : secret "canary-serving-cert" not found Apr 17 18:10:56.870195 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:10:56.869898 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls podName:31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c nodeName:}" failed. No retries permitted until 2026-04-17 18:11:12.869891859 +0000 UTC m=+65.745284546 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls") pod "dns-default-xqpjb" (UID: "31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c") : secret "dns-default-metrics-tls" not found Apr 17 18:11:06.827655 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:06.827621 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ql5pl" Apr 17 18:11:12.885203 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:12.885153 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l" Apr 17 18:11:12.885613 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:12.885220 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb" Apr 17 18:11:12.885613 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:11:12.885333 2583 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:11:12.885613 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:11:12.885369 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:11:12.885613 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:11:12.885395 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert podName:1f9df124-3418-493f-8f5e-bd5ea9df2004 nodeName:}" failed. No retries permitted until 2026-04-17 18:11:44.885379383 +0000 UTC m=+97.760772070 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert") pod "ingress-canary-6f76l" (UID: "1f9df124-3418-493f-8f5e-bd5ea9df2004") : secret "canary-serving-cert" not found Apr 17 18:11:12.885613 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:11:12.885425 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls podName:31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c nodeName:}" failed. No retries permitted until 2026-04-17 18:11:44.885413359 +0000 UTC m=+97.760806047 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls") pod "dns-default-xqpjb" (UID: "31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c") : secret "dns-default-metrics-tls" not found Apr 17 18:11:13.389148 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.389110 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fcdh\" (UniqueName: \"kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh\") pod \"network-check-target-kgjhh\" (UID: \"d2df8bcd-2956-4041-abb8-966ec57fce1d\") " pod="openshift-network-diagnostics/network-check-target-kgjhh" Apr 17 18:11:13.389372 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.389164 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x" Apr 17 18:11:13.392078 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.392046 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 18:11:13.392078 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.392079 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 18:11:13.400245 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:11:13.400224 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 18:11:13.400315 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:11:13.400305 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs podName:2ea7fcad-19ae-42ab-8026-113afe4c2f23 
nodeName:}" failed. No retries permitted until 2026-04-17 18:12:17.400287446 +0000 UTC m=+130.275680149 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs") pod "network-metrics-daemon-6d44x" (UID: "2ea7fcad-19ae-42ab-8026-113afe4c2f23") : secret "metrics-daemon-secret" not found Apr 17 18:11:13.402197 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.402180 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 18:11:13.414052 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.414033 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fcdh\" (UniqueName: \"kubernetes.io/projected/d2df8bcd-2956-4041-abb8-966ec57fce1d-kube-api-access-7fcdh\") pod \"network-check-target-kgjhh\" (UID: \"d2df8bcd-2956-4041-abb8-966ec57fce1d\") " pod="openshift-network-diagnostics/network-check-target-kgjhh" Apr 17 18:11:13.490244 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.490205 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret\") pod \"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:11:13.492887 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.492864 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 18:11:13.503165 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.503140 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/15a8ea75-9ba5-4f38-9236-0c6de8eae8e9-original-pull-secret\") pod 
\"global-pull-secret-syncer-pqbjx\" (UID: \"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9\") " pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:11:13.619765 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.619716 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqbjx" Apr 17 18:11:13.630350 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.630323 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5nnqt\"" Apr 17 18:11:13.638350 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.638324 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kgjhh" Apr 17 18:11:13.755420 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.755390 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pqbjx"] Apr 17 18:11:13.759105 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:11:13.759075 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15a8ea75_9ba5_4f38_9236_0c6de8eae8e9.slice/crio-6070fd2475c6e006f50fe4fdf3a8b72803df7661ffee7f17eb35c0119c4ba7ee WatchSource:0}: Error finding container 6070fd2475c6e006f50fe4fdf3a8b72803df7661ffee7f17eb35c0119c4ba7ee: Status 404 returned error can't find the container with id 6070fd2475c6e006f50fe4fdf3a8b72803df7661ffee7f17eb35c0119c4ba7ee Apr 17 18:11:13.772429 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.772403 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kgjhh"] Apr 17 18:11:13.775717 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:11:13.775695 2583 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2df8bcd_2956_4041_abb8_966ec57fce1d.slice/crio-bfa5dd6a0a4164946586be71498b0b3d254ef8304bea998c41be14a94c643f0e WatchSource:0}: Error finding container bfa5dd6a0a4164946586be71498b0b3d254ef8304bea998c41be14a94c643f0e: Status 404 returned error can't find the container with id bfa5dd6a0a4164946586be71498b0b3d254ef8304bea998c41be14a94c643f0e Apr 17 18:11:13.903987 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.903897 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pqbjx" event={"ID":"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9","Type":"ContainerStarted","Data":"6070fd2475c6e006f50fe4fdf3a8b72803df7661ffee7f17eb35c0119c4ba7ee"} Apr 17 18:11:13.904873 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:13.904851 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kgjhh" event={"ID":"d2df8bcd-2956-4041-abb8-966ec57fce1d","Type":"ContainerStarted","Data":"bfa5dd6a0a4164946586be71498b0b3d254ef8304bea998c41be14a94c643f0e"} Apr 17 18:11:18.915873 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:18.915832 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pqbjx" event={"ID":"15a8ea75-9ba5-4f38-9236-0c6de8eae8e9","Type":"ContainerStarted","Data":"293627f2b97df2bdd7611adadd65bdfce782a0a03ea998a2b9ed6aabdae0e54f"} Apr 17 18:11:18.917073 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:18.917050 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kgjhh" event={"ID":"d2df8bcd-2956-4041-abb8-966ec57fce1d","Type":"ContainerStarted","Data":"de84ec2e5e296be02522ec6b1dc3f7741e5af691a39d4df8a105a1c392305d90"} Apr 17 18:11:18.917230 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:18.917216 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-kgjhh" Apr 17 18:11:18.929905 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:18.929857 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pqbjx" podStartSLOduration=66.276110367 podStartE2EDuration="1m10.929846148s" podCreationTimestamp="2026-04-17 18:10:08 +0000 UTC" firstStartedPulling="2026-04-17 18:11:13.760747323 +0000 UTC m=+66.636140012" lastFinishedPulling="2026-04-17 18:11:18.414483105 +0000 UTC m=+71.289875793" observedRunningTime="2026-04-17 18:11:18.929782718 +0000 UTC m=+71.805175428" watchObservedRunningTime="2026-04-17 18:11:18.929846148 +0000 UTC m=+71.805238859" Apr 17 18:11:18.943595 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:18.943549 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kgjhh" podStartSLOduration=67.317039331 podStartE2EDuration="1m11.943535226s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:11:13.777619014 +0000 UTC m=+66.653011703" lastFinishedPulling="2026-04-17 18:11:18.404114901 +0000 UTC m=+71.279507598" observedRunningTime="2026-04-17 18:11:18.942653605 +0000 UTC m=+71.818046316" watchObservedRunningTime="2026-04-17 18:11:18.943535226 +0000 UTC m=+71.818927936" Apr 17 18:11:44.914846 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:44.914805 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l" Apr 17 18:11:44.914846 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:44.914847 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb" Apr 17 18:11:44.915317 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:11:44.914950 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:11:44.915317 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:11:44.914959 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:11:44.915317 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:11:44.915010 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls podName:31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c nodeName:}" failed. No retries permitted until 2026-04-17 18:12:48.914997329 +0000 UTC m=+161.790390017 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls") pod "dns-default-xqpjb" (UID: "31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c") : secret "dns-default-metrics-tls" not found Apr 17 18:11:44.915317 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:11:44.915050 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert podName:1f9df124-3418-493f-8f5e-bd5ea9df2004 nodeName:}" failed. No retries permitted until 2026-04-17 18:12:48.915036791 +0000 UTC m=+161.790429479 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert") pod "ingress-canary-6f76l" (UID: "1f9df124-3418-493f-8f5e-bd5ea9df2004") : secret "canary-serving-cert" not found Apr 17 18:11:49.921123 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:11:49.921092 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kgjhh" Apr 17 18:12:17.444897 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:17.444835 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x" Apr 17 18:12:17.445435 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:17.444987 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 18:12:17.445435 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:17.445054 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs podName:2ea7fcad-19ae-42ab-8026-113afe4c2f23 nodeName:}" failed. No retries permitted until 2026-04-17 18:14:19.445039051 +0000 UTC m=+252.320431739 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs") pod "network-metrics-daemon-6d44x" (UID: "2ea7fcad-19ae-42ab-8026-113afe4c2f23") : secret "metrics-daemon-secret" not found Apr 17 18:12:21.071141 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.071105 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gwg7g"] Apr 17 18:12:21.073949 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.073932 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gwg7g" Apr 17 18:12:21.076572 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.076549 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:12:21.076718 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.076584 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 18:12:21.077721 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.077701 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-cc26l\"" Apr 17 18:12:21.082263 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.082242 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gwg7g"] Apr 17 18:12:21.171209 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.171177 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"] Apr 17 18:12:21.173577 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.173553 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdm7m\" (UniqueName: \"kubernetes.io/projected/b9febd3b-c712-4be5-b213-f682bf52fa59-kube-api-access-mdm7m\") pod \"volume-data-source-validator-7c6cbb6c87-gwg7g\" (UID: \"b9febd3b-c712-4be5-b213-f682bf52fa59\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gwg7g" Apr 17 18:12:21.174029 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.174007 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x" Apr 17 18:12:21.176570 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.176549 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 18:12:21.176686 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.176650 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 18:12:21.176686 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.176648 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:12:21.177233 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.177216 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 18:12:21.178332 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.178312 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-jqjnv\"" Apr 17 18:12:21.187842 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.187817 2583 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"] Apr 17 18:12:21.269625 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.269591 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"] Apr 17 18:12:21.272327 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.272311 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz" Apr 17 18:12:21.273946 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.273920 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdm7m\" (UniqueName: \"kubernetes.io/projected/b9febd3b-c712-4be5-b213-f682bf52fa59-kube-api-access-mdm7m\") pod \"volume-data-source-validator-7c6cbb6c87-gwg7g\" (UID: \"b9febd3b-c712-4be5-b213-f682bf52fa59\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gwg7g" Apr 17 18:12:21.274046 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.274014 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/212d9a00-537f-41e9-b6bf-b14feb7f40a5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z2z5x\" (UID: \"212d9a00-537f-41e9-b6bf-b14feb7f40a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x" Apr 17 18:12:21.274088 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.274045 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66brr\" (UniqueName: \"kubernetes.io/projected/212d9a00-537f-41e9-b6bf-b14feb7f40a5-kube-api-access-66brr\") pod \"kube-storage-version-migrator-operator-6769c5d45-z2z5x\" (UID: 
\"212d9a00-537f-41e9-b6bf-b14feb7f40a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"
Apr 17 18:12:21.274136 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.274114 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/212d9a00-537f-41e9-b6bf-b14feb7f40a5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-z2z5x\" (UID: \"212d9a00-537f-41e9-b6bf-b14feb7f40a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"
Apr 17 18:12:21.274710 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.274691 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 17 18:12:21.274921 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.274907 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 17 18:12:21.274960 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.274917 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 17 18:12:21.275039 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.275020 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 17 18:12:21.275140 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.275125 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9tpdh\""
Apr 17 18:12:21.277349 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.277323 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"]
Apr 17 18:12:21.280570 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.280447 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7"]
Apr 17 18:12:21.280696 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.280676 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:21.283461 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.283439 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-59b48b8f4-5st7q"]
Apr 17 18:12:21.283658 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.283642 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7"
Apr 17 18:12:21.285372 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.285330 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 18:12:21.286889 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.286858 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hlxlw\""
Apr 17 18:12:21.288749 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.288714 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 18:12:21.288871 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.288832 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 18:12:21.289161 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.289136 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 18:12:21.289444 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.289423 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 18:12:21.290798 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.290768 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-6l48f\""
Apr 17 18:12:21.290972 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.290955 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 18:12:21.291264 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.291205 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"]
Apr 17 18:12:21.292019 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.291343 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.292192 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.292170 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 18:12:21.293141 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.293118 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"]
Apr 17 18:12:21.293928 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.293810 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 18:12:21.294124 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.294093 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 18:12:21.294234 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.294213 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 18:12:21.294319 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.294099 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7"]
Apr 17 18:12:21.294661 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.294645 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 18:12:21.294883 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.294867 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 18:12:21.294947 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.294881 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-m69gv\""
Apr 17 18:12:21.294947 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.294908 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 18:12:21.298414 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.298391 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-59b48b8f4-5st7q"]
Apr 17 18:12:21.302038 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.301984 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdm7m\" (UniqueName: \"kubernetes.io/projected/b9febd3b-c712-4be5-b213-f682bf52fa59-kube-api-access-mdm7m\") pod \"volume-data-source-validator-7c6cbb6c87-gwg7g\" (UID: \"b9febd3b-c712-4be5-b213-f682bf52fa59\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gwg7g"
Apr 17 18:12:21.375485 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375383 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/212d9a00-537f-41e9-b6bf-b14feb7f40a5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z2z5x\" (UID: \"212d9a00-537f-41e9-b6bf-b14feb7f40a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"
Apr 17 18:12:21.375485 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375427 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66brr\" (UniqueName: \"kubernetes.io/projected/212d9a00-537f-41e9-b6bf-b14feb7f40a5-kube-api-access-66brr\") pod \"kube-storage-version-migrator-operator-6769c5d45-z2z5x\" (UID: \"212d9a00-537f-41e9-b6bf-b14feb7f40a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"
Apr 17 18:12:21.375707 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375560 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hssxg\" (UniqueName: \"kubernetes.io/projected/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-kube-api-access-hssxg\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:21.375707 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375606 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk5cj\" (UniqueName: \"kubernetes.io/projected/a013aac6-7414-4d90-8f7d-99be6094d204-kube-api-access-zk5cj\") pod \"service-ca-operator-d6fc45fc5-mmfmz\" (UID: \"a013aac6-7414-4d90-8f7d-99be6094d204\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"
Apr 17 18:12:21.375707 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375682 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:21.375840 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375708 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m8ck7\" (UID: \"4d015ef7-3f99-4cba-b86e-3642cdc69950\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7"
Apr 17 18:12:21.375840 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375731 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbfh2\" (UniqueName: \"kubernetes.io/projected/4d015ef7-3f99-4cba-b86e-3642cdc69950-kube-api-access-rbfh2\") pod \"cluster-samples-operator-6dc5bdb6b4-m8ck7\" (UID: \"4d015ef7-3f99-4cba-b86e-3642cdc69950\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7"
Apr 17 18:12:21.375840 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375771 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:21.375840 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375824 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/212d9a00-537f-41e9-b6bf-b14feb7f40a5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-z2z5x\" (UID: \"212d9a00-537f-41e9-b6bf-b14feb7f40a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"
Apr 17 18:12:21.375987 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375850 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a013aac6-7414-4d90-8f7d-99be6094d204-serving-cert\") pod \"service-ca-operator-d6fc45fc5-mmfmz\" (UID: \"a013aac6-7414-4d90-8f7d-99be6094d204\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"
Apr 17 18:12:21.375987 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375869 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a013aac6-7414-4d90-8f7d-99be6094d204-config\") pod \"service-ca-operator-d6fc45fc5-mmfmz\" (UID: \"a013aac6-7414-4d90-8f7d-99be6094d204\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"
Apr 17 18:12:21.375987 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.375885 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/212d9a00-537f-41e9-b6bf-b14feb7f40a5-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z2z5x\" (UID: \"212d9a00-537f-41e9-b6bf-b14feb7f40a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"
Apr 17 18:12:21.378175 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.378146 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/212d9a00-537f-41e9-b6bf-b14feb7f40a5-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-z2z5x\" (UID: \"212d9a00-537f-41e9-b6bf-b14feb7f40a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"
Apr 17 18:12:21.382432 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.382407 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gwg7g"
Apr 17 18:12:21.384160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.384136 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66brr\" (UniqueName: \"kubernetes.io/projected/212d9a00-537f-41e9-b6bf-b14feb7f40a5-kube-api-access-66brr\") pod \"kube-storage-version-migrator-operator-6769c5d45-z2z5x\" (UID: \"212d9a00-537f-41e9-b6bf-b14feb7f40a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"
Apr 17 18:12:21.476728 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.476694 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-stats-auth\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.476881 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.476739 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zk5cj\" (UniqueName: \"kubernetes.io/projected/a013aac6-7414-4d90-8f7d-99be6094d204-kube-api-access-zk5cj\") pod \"service-ca-operator-d6fc45fc5-mmfmz\" (UID: \"a013aac6-7414-4d90-8f7d-99be6094d204\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"
Apr 17 18:12:21.476881 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.476774 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:21.476881 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.476790 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m8ck7\" (UID: \"4d015ef7-3f99-4cba-b86e-3642cdc69950\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7"
Apr 17 18:12:21.476881 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.476807 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbfh2\" (UniqueName: \"kubernetes.io/projected/4d015ef7-3f99-4cba-b86e-3642cdc69950-kube-api-access-rbfh2\") pod \"cluster-samples-operator-6dc5bdb6b4-m8ck7\" (UID: \"4d015ef7-3f99-4cba-b86e-3642cdc69950\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7"
Apr 17 18:12:21.476881 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.476829 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:21.476881 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.476872 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.477161 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.476922 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-default-certificate\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.477161 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.476964 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.477161 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:21.477025 2583 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 18:12:21.477161 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.477079 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjb9t\" (UniqueName: \"kubernetes.io/projected/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-kube-api-access-sjb9t\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.477161 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:21.477093 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 18:12:21.477161 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:21.477112 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls podName:ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a nodeName:}" failed. No retries permitted until 2026-04-17 18:12:21.97708979 +0000 UTC m=+134.852482496 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kcr7m" (UID: "ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a") : secret "cluster-monitoring-operator-tls" not found
Apr 17 18:12:21.477161 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:21.477154 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls podName:4d015ef7-3f99-4cba-b86e-3642cdc69950 nodeName:}" failed. No retries permitted until 2026-04-17 18:12:21.977136036 +0000 UTC m=+134.852528726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m8ck7" (UID: "4d015ef7-3f99-4cba-b86e-3642cdc69950") : secret "samples-operator-tls" not found
Apr 17 18:12:21.477515 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.477181 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a013aac6-7414-4d90-8f7d-99be6094d204-serving-cert\") pod \"service-ca-operator-d6fc45fc5-mmfmz\" (UID: \"a013aac6-7414-4d90-8f7d-99be6094d204\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"
Apr 17 18:12:21.477515 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.477220 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a013aac6-7414-4d90-8f7d-99be6094d204-config\") pod \"service-ca-operator-d6fc45fc5-mmfmz\" (UID: \"a013aac6-7414-4d90-8f7d-99be6094d204\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"
Apr 17 18:12:21.477515 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.477348 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hssxg\" (UniqueName: \"kubernetes.io/projected/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-kube-api-access-hssxg\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:21.477666 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.477634 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:21.478040 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.478018 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a013aac6-7414-4d90-8f7d-99be6094d204-config\") pod \"service-ca-operator-d6fc45fc5-mmfmz\" (UID: \"a013aac6-7414-4d90-8f7d-99be6094d204\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"
Apr 17 18:12:21.479450 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.479432 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a013aac6-7414-4d90-8f7d-99be6094d204-serving-cert\") pod \"service-ca-operator-d6fc45fc5-mmfmz\" (UID: \"a013aac6-7414-4d90-8f7d-99be6094d204\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"
Apr 17 18:12:21.482989 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.482965 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"
Apr 17 18:12:21.485987 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.485947 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbfh2\" (UniqueName: \"kubernetes.io/projected/4d015ef7-3f99-4cba-b86e-3642cdc69950-kube-api-access-rbfh2\") pod \"cluster-samples-operator-6dc5bdb6b4-m8ck7\" (UID: \"4d015ef7-3f99-4cba-b86e-3642cdc69950\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7"
Apr 17 18:12:21.486126 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.486070 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk5cj\" (UniqueName: \"kubernetes.io/projected/a013aac6-7414-4d90-8f7d-99be6094d204-kube-api-access-zk5cj\") pod \"service-ca-operator-d6fc45fc5-mmfmz\" (UID: \"a013aac6-7414-4d90-8f7d-99be6094d204\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"
Apr 17 18:12:21.486392 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.486373 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hssxg\" (UniqueName: \"kubernetes.io/projected/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-kube-api-access-hssxg\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:21.501238 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.501204 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gwg7g"]
Apr 17 18:12:21.504414 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:12:21.504385 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9febd3b_c712_4be5_b213_f682bf52fa59.slice/crio-38f7980b0910908fd3e87606debb0d9c1d7e6824ba55087360d66b71c56f9f72 WatchSource:0}: Error finding container 38f7980b0910908fd3e87606debb0d9c1d7e6824ba55087360d66b71c56f9f72: Status 404 returned error can't find the container with id 38f7980b0910908fd3e87606debb0d9c1d7e6824ba55087360d66b71c56f9f72
Apr 17 18:12:21.577797 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.577756 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-stats-auth\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.577949 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.577869 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.577949 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.577919 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-default-certificate\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.577949 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.577943 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.578101 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.577974 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjb9t\" (UniqueName: \"kubernetes.io/projected/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-kube-api-access-sjb9t\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.578101 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:21.578056 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle podName:e8dffa5d-56ee-4b70-a3be-9aa2dea734cd nodeName:}" failed. No retries permitted until 2026-04-17 18:12:22.078034186 +0000 UTC m=+134.953426876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle") pod "router-default-59b48b8f4-5st7q" (UID: "e8dffa5d-56ee-4b70-a3be-9aa2dea734cd") : configmap references non-existent config key: service-ca.crt
Apr 17 18:12:21.578207 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:21.578093 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 18:12:21.578207 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:21.578154 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs podName:e8dffa5d-56ee-4b70-a3be-9aa2dea734cd nodeName:}" failed. No retries permitted until 2026-04-17 18:12:22.078137376 +0000 UTC m=+134.953530068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs") pod "router-default-59b48b8f4-5st7q" (UID: "e8dffa5d-56ee-4b70-a3be-9aa2dea734cd") : secret "router-metrics-certs-default" not found
Apr 17 18:12:21.580534 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.580510 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-stats-auth\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.580638 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.580510 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-default-certificate\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.584466 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.584441 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"
Apr 17 18:12:21.587375 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.587352 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjb9t\" (UniqueName: \"kubernetes.io/projected/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-kube-api-access-sjb9t\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:21.600774 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.600746 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x"]
Apr 17 18:12:21.603814 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:12:21.603781 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod212d9a00_537f_41e9_b6bf_b14feb7f40a5.slice/crio-1c127daaa970c348c4ffff3a0fdbb32121b55193df51fe5ab31cbf124f3af828 WatchSource:0}: Error finding container 1c127daaa970c348c4ffff3a0fdbb32121b55193df51fe5ab31cbf124f3af828: Status 404 returned error can't find the container with id 1c127daaa970c348c4ffff3a0fdbb32121b55193df51fe5ab31cbf124f3af828
Apr 17 18:12:21.700357 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.700327 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz"]
Apr 17 18:12:21.703075 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:12:21.703046 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda013aac6_7414_4d90_8f7d_99be6094d204.slice/crio-70234d203c0bc2bcc92c75dc0c66365c6eabaebb947f84ccd7d7e13690616a69 WatchSource:0}: Error finding container 70234d203c0bc2bcc92c75dc0c66365c6eabaebb947f84ccd7d7e13690616a69: Status 404 returned error can't find the container with id 70234d203c0bc2bcc92c75dc0c66365c6eabaebb947f84ccd7d7e13690616a69
Apr 17 18:12:21.980744 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.980649 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m8ck7\" (UID: \"4d015ef7-3f99-4cba-b86e-3642cdc69950\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7"
Apr 17 18:12:21.980744 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:21.980691 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:21.980927 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:21.980807 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 18:12:21.980927 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:21.980829 2583 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 18:12:21.980927 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:21.980874 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls podName:4d015ef7-3f99-4cba-b86e-3642cdc69950 nodeName:}" failed. No retries permitted until 2026-04-17 18:12:22.980859003 +0000 UTC m=+135.856251692 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m8ck7" (UID: "4d015ef7-3f99-4cba-b86e-3642cdc69950") : secret "samples-operator-tls" not found
Apr 17 18:12:21.980927 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:21.980890 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls podName:ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a nodeName:}" failed. No retries permitted until 2026-04-17 18:12:22.98088165 +0000 UTC m=+135.856274338 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kcr7m" (UID: "ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a") : secret "cluster-monitoring-operator-tls" not found
Apr 17 18:12:22.036407 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:22.036366 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gwg7g" event={"ID":"b9febd3b-c712-4be5-b213-f682bf52fa59","Type":"ContainerStarted","Data":"38f7980b0910908fd3e87606debb0d9c1d7e6824ba55087360d66b71c56f9f72"}
Apr 17 18:12:22.037392 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:22.037359 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz" event={"ID":"a013aac6-7414-4d90-8f7d-99be6094d204","Type":"ContainerStarted","Data":"70234d203c0bc2bcc92c75dc0c66365c6eabaebb947f84ccd7d7e13690616a69"}
Apr 17 18:12:22.038215 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:22.038195 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x" event={"ID":"212d9a00-537f-41e9-b6bf-b14feb7f40a5","Type":"ContainerStarted","Data":"1c127daaa970c348c4ffff3a0fdbb32121b55193df51fe5ab31cbf124f3af828"}
Apr 17 18:12:22.081384 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:22.081349 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:22.081769 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:22.081401 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q"
Apr 17 18:12:22.081769 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:22.081523 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 18:12:22.081769 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:22.081524 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle podName:e8dffa5d-56ee-4b70-a3be-9aa2dea734cd nodeName:}" failed. No retries permitted until 2026-04-17 18:12:23.081504682 +0000 UTC m=+135.956897371 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle") pod "router-default-59b48b8f4-5st7q" (UID: "e8dffa5d-56ee-4b70-a3be-9aa2dea734cd") : configmap references non-existent config key: service-ca.crt
Apr 17 18:12:22.081769 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:22.081580 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs podName:e8dffa5d-56ee-4b70-a3be-9aa2dea734cd nodeName:}" failed. No retries permitted until 2026-04-17 18:12:23.081563601 +0000 UTC m=+135.956956290 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs") pod "router-default-59b48b8f4-5st7q" (UID: "e8dffa5d-56ee-4b70-a3be-9aa2dea734cd") : secret "router-metrics-certs-default" not found
Apr 17 18:12:22.990495 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:22.990455 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m8ck7\" (UID: \"4d015ef7-3f99-4cba-b86e-3642cdc69950\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7"
Apr 17 18:12:22.990781 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:22.990503 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:22.990781 ip-10-0-133-142 kubenswrapper[2583]: E0417
18:12:22.990604 2583 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 18:12:22.990781 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:22.990618 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 18:12:22.990781 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:22.990669 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls podName:ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a nodeName:}" failed. No retries permitted until 2026-04-17 18:12:24.990647795 +0000 UTC m=+137.866040487 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kcr7m" (UID: "ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a") : secret "cluster-monitoring-operator-tls" not found Apr 17 18:12:22.990781 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:22.990743 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls podName:4d015ef7-3f99-4cba-b86e-3642cdc69950 nodeName:}" failed. No retries permitted until 2026-04-17 18:12:24.990725375 +0000 UTC m=+137.866118066 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m8ck7" (UID: "4d015ef7-3f99-4cba-b86e-3642cdc69950") : secret "samples-operator-tls" not found Apr 17 18:12:23.091680 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:23.091624 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:23.092159 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:23.091792 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:23.092159 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:23.091813 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 18:12:23.092159 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:23.091894 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs podName:e8dffa5d-56ee-4b70-a3be-9aa2dea734cd nodeName:}" failed. No retries permitted until 2026-04-17 18:12:25.091869889 +0000 UTC m=+137.967262614 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs") pod "router-default-59b48b8f4-5st7q" (UID: "e8dffa5d-56ee-4b70-a3be-9aa2dea734cd") : secret "router-metrics-certs-default" not found Apr 17 18:12:23.092159 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:23.091942 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle podName:e8dffa5d-56ee-4b70-a3be-9aa2dea734cd nodeName:}" failed. No retries permitted until 2026-04-17 18:12:25.09192643 +0000 UTC m=+137.967319126 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle") pod "router-default-59b48b8f4-5st7q" (UID: "e8dffa5d-56ee-4b70-a3be-9aa2dea734cd") : configmap references non-existent config key: service-ca.crt Apr 17 18:12:25.009133 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:25.009089 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m8ck7\" (UID: \"4d015ef7-3f99-4cba-b86e-3642cdc69950\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7" Apr 17 18:12:25.009133 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:25.009131 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m" Apr 17 18:12:25.009707 ip-10-0-133-142 kubenswrapper[2583]: E0417 
18:12:25.009267 2583 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 18:12:25.009707 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:25.009265 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 18:12:25.009707 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:25.009341 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls podName:ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a nodeName:}" failed. No retries permitted until 2026-04-17 18:12:29.00932758 +0000 UTC m=+141.884720267 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kcr7m" (UID: "ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a") : secret "cluster-monitoring-operator-tls" not found Apr 17 18:12:25.009707 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:25.009365 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls podName:4d015ef7-3f99-4cba-b86e-3642cdc69950 nodeName:}" failed. No retries permitted until 2026-04-17 18:12:29.009346779 +0000 UTC m=+141.884739471 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m8ck7" (UID: "4d015ef7-3f99-4cba-b86e-3642cdc69950") : secret "samples-operator-tls" not found Apr 17 18:12:25.047396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:25.047361 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x" event={"ID":"212d9a00-537f-41e9-b6bf-b14feb7f40a5","Type":"ContainerStarted","Data":"eb13766cd85b4aba39e53bcaf6e79408772fe9ba3e47c3d88423260633754910"} Apr 17 18:12:25.048860 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:25.048829 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gwg7g" event={"ID":"b9febd3b-c712-4be5-b213-f682bf52fa59","Type":"ContainerStarted","Data":"31a253a33342ab4e94d3f198e72c031a87e95ee8f51393db9a854b5768881b3f"} Apr 17 18:12:25.050181 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:25.050153 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz" event={"ID":"a013aac6-7414-4d90-8f7d-99be6094d204","Type":"ContainerStarted","Data":"39497d3ebb3c1fabd34bb74d51307a57f6d56fa8b3905c6196afa7e532296f7d"} Apr 17 18:12:25.063013 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:25.062965 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x" podStartSLOduration=1.523245218 podStartE2EDuration="4.062951663s" podCreationTimestamp="2026-04-17 18:12:21 +0000 UTC" firstStartedPulling="2026-04-17 18:12:21.605761012 +0000 UTC m=+134.481153699" lastFinishedPulling="2026-04-17 18:12:24.145467456 +0000 UTC m=+137.020860144" 
observedRunningTime="2026-04-17 18:12:25.062395818 +0000 UTC m=+137.937788528" watchObservedRunningTime="2026-04-17 18:12:25.062951663 +0000 UTC m=+137.938344373" Apr 17 18:12:25.076860 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:25.076807 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gwg7g" podStartSLOduration=1.444616566 podStartE2EDuration="4.076791986s" podCreationTimestamp="2026-04-17 18:12:21 +0000 UTC" firstStartedPulling="2026-04-17 18:12:21.506376049 +0000 UTC m=+134.381768741" lastFinishedPulling="2026-04-17 18:12:24.138551471 +0000 UTC m=+137.013944161" observedRunningTime="2026-04-17 18:12:25.076097569 +0000 UTC m=+137.951490280" watchObservedRunningTime="2026-04-17 18:12:25.076791986 +0000 UTC m=+137.952184690" Apr 17 18:12:25.096051 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:25.095995 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz" podStartSLOduration=1.6517063730000001 podStartE2EDuration="4.095973443s" podCreationTimestamp="2026-04-17 18:12:21 +0000 UTC" firstStartedPulling="2026-04-17 18:12:21.705043648 +0000 UTC m=+134.580436337" lastFinishedPulling="2026-04-17 18:12:24.149310715 +0000 UTC m=+137.024703407" observedRunningTime="2026-04-17 18:12:25.094427758 +0000 UTC m=+137.969820484" watchObservedRunningTime="2026-04-17 18:12:25.095973443 +0000 UTC m=+137.971366154" Apr 17 18:12:25.109957 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:25.109927 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:25.110104 ip-10-0-133-142 kubenswrapper[2583]: 
I0417 18:12:25.110042 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:25.110104 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:25.110086 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 18:12:25.110204 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:25.110156 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle podName:e8dffa5d-56ee-4b70-a3be-9aa2dea734cd nodeName:}" failed. No retries permitted until 2026-04-17 18:12:29.110139173 +0000 UTC m=+141.985531875 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle") pod "router-default-59b48b8f4-5st7q" (UID: "e8dffa5d-56ee-4b70-a3be-9aa2dea734cd") : configmap references non-existent config key: service-ca.crt Apr 17 18:12:25.110204 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:25.110171 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs podName:e8dffa5d-56ee-4b70-a3be-9aa2dea734cd nodeName:}" failed. No retries permitted until 2026-04-17 18:12:29.110165246 +0000 UTC m=+141.985557933 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs") pod "router-default-59b48b8f4-5st7q" (UID: "e8dffa5d-56ee-4b70-a3be-9aa2dea734cd") : secret "router-metrics-certs-default" not found Apr 17 18:12:28.745367 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:28.745336 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rl4d7_854b67d6-4dbb-4558-9444-235eb53b9278/dns-node-resolver/0.log" Apr 17 18:12:29.040377 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:29.040291 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m8ck7\" (UID: \"4d015ef7-3f99-4cba-b86e-3642cdc69950\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7" Apr 17 18:12:29.040377 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:29.040330 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m" Apr 17 18:12:29.040558 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:29.040433 2583 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 18:12:29.040558 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:29.040448 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 18:12:29.040558 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:29.040488 
2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls podName:ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a nodeName:}" failed. No retries permitted until 2026-04-17 18:12:37.040474806 +0000 UTC m=+149.915867494 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kcr7m" (UID: "ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a") : secret "cluster-monitoring-operator-tls" not found Apr 17 18:12:29.040558 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:29.040507 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls podName:4d015ef7-3f99-4cba-b86e-3642cdc69950 nodeName:}" failed. No retries permitted until 2026-04-17 18:12:37.040493827 +0000 UTC m=+149.915886516 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m8ck7" (UID: "4d015ef7-3f99-4cba-b86e-3642cdc69950") : secret "samples-operator-tls" not found Apr 17 18:12:29.141220 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:29.141171 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:29.141422 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:29.141232 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:29.141422 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:29.141368 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle podName:e8dffa5d-56ee-4b70-a3be-9aa2dea734cd nodeName:}" failed. No retries permitted until 2026-04-17 18:12:37.141350691 +0000 UTC m=+150.016743396 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle") pod "router-default-59b48b8f4-5st7q" (UID: "e8dffa5d-56ee-4b70-a3be-9aa2dea734cd") : configmap references non-existent config key: service-ca.crt Apr 17 18:12:29.141422 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:29.141387 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 18:12:29.141532 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:29.141434 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs podName:e8dffa5d-56ee-4b70-a3be-9aa2dea734cd nodeName:}" failed. No retries permitted until 2026-04-17 18:12:37.141423121 +0000 UTC m=+150.016815808 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs") pod "router-default-59b48b8f4-5st7q" (UID: "e8dffa5d-56ee-4b70-a3be-9aa2dea734cd") : secret "router-metrics-certs-default" not found Apr 17 18:12:29.545717 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:29.545690 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8pwg5_75b35535-d264-462c-a620-1b59e57c1eef/node-ca/0.log" Apr 17 18:12:31.147540 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:31.147509 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-z2z5x_212d9a00-537f-41e9-b6bf-b14feb7f40a5/kube-storage-version-migrator-operator/0.log" Apr 17 18:12:37.105867 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:37.105811 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m8ck7\" (UID: \"4d015ef7-3f99-4cba-b86e-3642cdc69950\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7" Apr 17 18:12:37.105867 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:37.105866 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m" Apr 17 18:12:37.106413 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:37.105947 2583 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 18:12:37.106413 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:37.106014 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls podName:ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a nodeName:}" failed. No retries permitted until 2026-04-17 18:12:53.105998733 +0000 UTC m=+165.981391438 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kcr7m" (UID: "ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a") : secret "cluster-monitoring-operator-tls" not found Apr 17 18:12:37.108334 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:37.108315 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4d015ef7-3f99-4cba-b86e-3642cdc69950-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m8ck7\" (UID: \"4d015ef7-3f99-4cba-b86e-3642cdc69950\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7" Apr 17 18:12:37.206538 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:37.206500 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:37.206713 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:37.206604 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:37.207165 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:37.207146 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-service-ca-bundle\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " 
pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:37.208706 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:37.208688 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7" Apr 17 18:12:37.208953 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:37.208934 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8dffa5d-56ee-4b70-a3be-9aa2dea734cd-metrics-certs\") pod \"router-default-59b48b8f4-5st7q\" (UID: \"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd\") " pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:37.214488 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:37.214465 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:37.332533 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:37.332499 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7"] Apr 17 18:12:37.356425 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:37.356399 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-59b48b8f4-5st7q"] Apr 17 18:12:37.358622 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:12:37.358595 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8dffa5d_56ee_4b70_a3be_9aa2dea734cd.slice/crio-f22c0ad334d6244fac11f04c0fea92e788b4982a12b23973c4e866221994edec WatchSource:0}: Error finding container f22c0ad334d6244fac11f04c0fea92e788b4982a12b23973c4e866221994edec: Status 404 returned error can't find the container with id f22c0ad334d6244fac11f04c0fea92e788b4982a12b23973c4e866221994edec Apr 17 18:12:38.076895 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:38.076858 2583 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-59b48b8f4-5st7q" event={"ID":"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd","Type":"ContainerStarted","Data":"b3ac0737e315fb600dd35e85c2c2f6b9e6a7bd51c6cf294e2825ff7238150926"} Apr 17 18:12:38.077068 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:38.076903 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-59b48b8f4-5st7q" event={"ID":"e8dffa5d-56ee-4b70-a3be-9aa2dea734cd","Type":"ContainerStarted","Data":"f22c0ad334d6244fac11f04c0fea92e788b4982a12b23973c4e866221994edec"} Apr 17 18:12:38.077913 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:38.077887 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7" event={"ID":"4d015ef7-3f99-4cba-b86e-3642cdc69950","Type":"ContainerStarted","Data":"2745c016ecdebf5b5c44336bf6c52b73ca02a0ef0bd1132ee0d2070e4e1ce700"} Apr 17 18:12:38.096080 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:38.096000 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-59b48b8f4-5st7q" podStartSLOduration=17.095984026 podStartE2EDuration="17.095984026s" podCreationTimestamp="2026-04-17 18:12:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:12:38.094459081 +0000 UTC m=+150.969851805" watchObservedRunningTime="2026-04-17 18:12:38.095984026 +0000 UTC m=+150.971376737" Apr 17 18:12:38.215219 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:38.215179 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:38.218048 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:38.218024 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 
18:12:39.082516 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:39.082419 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7" event={"ID":"4d015ef7-3f99-4cba-b86e-3642cdc69950","Type":"ContainerStarted","Data":"4447c80afd48da978402040812fa9c2cdd6a585d34499d2e2bee3eba3618570a"} Apr 17 18:12:39.082516 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:39.082470 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7" event={"ID":"4d015ef7-3f99-4cba-b86e-3642cdc69950","Type":"ContainerStarted","Data":"1de724ce6c3a99832dbbc846c0202a0e3e8776839b67cc77013f992f09a87868"} Apr 17 18:12:39.082720 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:39.082686 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:39.083905 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:39.083884 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-59b48b8f4-5st7q" Apr 17 18:12:39.099148 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:39.099107 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m8ck7" podStartSLOduration=16.667852825 podStartE2EDuration="18.099097142s" podCreationTimestamp="2026-04-17 18:12:21 +0000 UTC" firstStartedPulling="2026-04-17 18:12:37.370382057 +0000 UTC m=+150.245774745" lastFinishedPulling="2026-04-17 18:12:38.801626371 +0000 UTC m=+151.677019062" observedRunningTime="2026-04-17 18:12:39.098193275 +0000 UTC m=+151.973585986" watchObservedRunningTime="2026-04-17 18:12:39.099097142 +0000 UTC m=+151.974489851" Apr 17 18:12:43.994841 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:43.994788 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xqpjb" podUID="31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c" Apr 17 18:12:44.010960 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:44.010926 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6f76l" podUID="1f9df124-3418-493f-8f5e-bd5ea9df2004" Apr 17 18:12:44.097612 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:44.097584 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xqpjb" Apr 17 18:12:45.732900 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:12:45.732848 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-6d44x" podUID="2ea7fcad-19ae-42ab-8026-113afe4c2f23" Apr 17 18:12:47.508544 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.508510 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bmqm9"] Apr 17 18:12:47.514509 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.514482 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b86858b4b-z8c9s"] Apr 17 18:12:47.514676 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.514658 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.517194 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.517173 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s42m7\"" Apr 17 18:12:47.517194 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.517189 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 18:12:47.517524 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.517506 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 18:12:47.517606 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.517588 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 18:12:47.517687 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.517672 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 18:12:47.518579 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.518556 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.520874 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.520856 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 18:12:47.520874 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.520869 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 18:12:47.521448 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.521431 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wbd55\"" Apr 17 18:12:47.521531 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.521483 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 18:12:47.526138 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.526114 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bmqm9"] Apr 17 18:12:47.532513 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.532485 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b86858b4b-z8c9s"] Apr 17 18:12:47.532696 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.532679 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 18:12:47.587178 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587134 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-ca-trust-extracted\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " 
pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.587178 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587182 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-registry-certificates\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.587433 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587212 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-image-registry-private-configuration\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.587433 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587255 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-bound-sa-token\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.587433 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587328 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-installation-pull-secrets\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.587554 ip-10-0-133-142 kubenswrapper[2583]: I0417 
18:12:47.587431 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b6f8c7ee-24c2-4294-928d-435c48c0b667-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.587554 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587454 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-trusted-ca\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.587554 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587489 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fsx5\" (UniqueName: \"kubernetes.io/projected/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-kube-api-access-6fsx5\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.587554 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587517 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b6f8c7ee-24c2-4294-928d-435c48c0b667-crio-socket\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.587696 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587557 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/b6f8c7ee-24c2-4294-928d-435c48c0b667-data-volume\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.587696 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587584 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b6f8c7ee-24c2-4294-928d-435c48c0b667-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.587696 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587619 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-registry-tls\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.587696 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.587644 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr55j\" (UniqueName: \"kubernetes.io/projected/b6f8c7ee-24c2-4294-928d-435c48c0b667-kube-api-access-gr55j\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.688465 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688433 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-ca-trust-extracted\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " 
pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.688465 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688469 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-registry-certificates\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.688715 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688487 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-image-registry-private-configuration\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.688715 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688511 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-bound-sa-token\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.688715 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688551 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-installation-pull-secrets\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.688715 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688576 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b6f8c7ee-24c2-4294-928d-435c48c0b667-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.688715 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688592 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-trusted-ca\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.688715 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688610 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fsx5\" (UniqueName: \"kubernetes.io/projected/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-kube-api-access-6fsx5\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.688715 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688638 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b6f8c7ee-24c2-4294-928d-435c48c0b667-crio-socket\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.688715 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688713 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b6f8c7ee-24c2-4294-928d-435c48c0b667-crio-socket\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 
18:12:47.689129 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688899 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b6f8c7ee-24c2-4294-928d-435c48c0b667-data-volume\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.689129 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.688903 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-ca-trust-extracted\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.689129 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.689057 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b6f8c7ee-24c2-4294-928d-435c48c0b667-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.689129 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.689117 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-registry-tls\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.689387 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.689164 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gr55j\" (UniqueName: \"kubernetes.io/projected/b6f8c7ee-24c2-4294-928d-435c48c0b667-kube-api-access-gr55j\") pod 
\"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.689387 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.689234 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b6f8c7ee-24c2-4294-928d-435c48c0b667-data-volume\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.689595 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.689574 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-registry-certificates\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.689903 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.689847 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b6f8c7ee-24c2-4294-928d-435c48c0b667-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.690635 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.690612 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-trusted-ca\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.691429 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.691407 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b6f8c7ee-24c2-4294-928d-435c48c0b667-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.691841 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.691816 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-image-registry-private-configuration\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.691926 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.691867 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-registry-tls\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.692055 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.692029 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-installation-pull-secrets\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.698536 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.698515 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-bound-sa-token\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " 
pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.698630 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.698620 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fsx5\" (UniqueName: \"kubernetes.io/projected/ff33bbab-bce2-44ed-9c40-a35c751f8fa7-kube-api-access-6fsx5\") pod \"image-registry-b86858b4b-z8c9s\" (UID: \"ff33bbab-bce2-44ed-9c40-a35c751f8fa7\") " pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.698978 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.698949 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr55j\" (UniqueName: \"kubernetes.io/projected/b6f8c7ee-24c2-4294-928d-435c48c0b667-kube-api-access-gr55j\") pod \"insights-runtime-extractor-bmqm9\" (UID: \"b6f8c7ee-24c2-4294-928d-435c48c0b667\") " pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.824892 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.824856 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bmqm9" Apr 17 18:12:47.830727 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.830703 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:47.959420 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.959394 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bmqm9"] Apr 17 18:12:47.978632 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:47.978604 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b86858b4b-z8c9s"] Apr 17 18:12:47.982003 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:12:47.981977 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff33bbab_bce2_44ed_9c40_a35c751f8fa7.slice/crio-bd06d2a4b42a5cbd482d44591b661a6ab8fc845a0f23179bff58c3d49e91dce1 WatchSource:0}: Error finding container bd06d2a4b42a5cbd482d44591b661a6ab8fc845a0f23179bff58c3d49e91dce1: Status 404 returned error can't find the container with id bd06d2a4b42a5cbd482d44591b661a6ab8fc845a0f23179bff58c3d49e91dce1 Apr 17 18:12:48.109209 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:48.109109 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bmqm9" event={"ID":"b6f8c7ee-24c2-4294-928d-435c48c0b667","Type":"ContainerStarted","Data":"21cb685b4a4a6566efb33d9d8b9e02f64b6255ca23128fd7842aec17eb7c25a7"} Apr 17 18:12:48.109209 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:48.109157 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bmqm9" event={"ID":"b6f8c7ee-24c2-4294-928d-435c48c0b667","Type":"ContainerStarted","Data":"3e454f0e8585d3bb18c4ed0362c0e5c64491270e1c39e7b59c8922b4c727950e"} Apr 17 18:12:48.110438 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:48.110411 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" 
event={"ID":"ff33bbab-bce2-44ed-9c40-a35c751f8fa7","Type":"ContainerStarted","Data":"722dadfc5758f7ba8ad494843c02e73640efe340d39dc9791eef70262826a18d"} Apr 17 18:12:48.110570 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:48.110440 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" event={"ID":"ff33bbab-bce2-44ed-9c40-a35c751f8fa7","Type":"ContainerStarted","Data":"bd06d2a4b42a5cbd482d44591b661a6ab8fc845a0f23179bff58c3d49e91dce1"} Apr 17 18:12:48.110570 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:48.110536 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" Apr 17 18:12:48.129030 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:48.128981 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-b86858b4b-z8c9s" podStartSLOduration=1.128963638 podStartE2EDuration="1.128963638s" podCreationTimestamp="2026-04-17 18:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:12:48.128471513 +0000 UTC m=+161.003864223" watchObservedRunningTime="2026-04-17 18:12:48.128963638 +0000 UTC m=+161.004356365" Apr 17 18:12:49.000542 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:49.000462 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l" Apr 17 18:12:49.000542 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:49.000523 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls\") pod 
\"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb" Apr 17 18:12:49.002978 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:49.002957 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c-metrics-tls\") pod \"dns-default-xqpjb\" (UID: \"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c\") " pod="openshift-dns/dns-default-xqpjb" Apr 17 18:12:49.003115 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:49.003094 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1f9df124-3418-493f-8f5e-bd5ea9df2004-cert\") pod \"ingress-canary-6f76l\" (UID: \"1f9df124-3418-493f-8f5e-bd5ea9df2004\") " pod="openshift-ingress-canary/ingress-canary-6f76l" Apr 17 18:12:49.115109 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:49.115054 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bmqm9" event={"ID":"b6f8c7ee-24c2-4294-928d-435c48c0b667","Type":"ContainerStarted","Data":"767eab4483a958f77358af4e2b1092f79ee66eccdf667729ae2347dc29aa3c9e"} Apr 17 18:12:49.201247 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:49.201217 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-46dtx\"" Apr 17 18:12:49.209375 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:49.209339 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xqpjb" Apr 17 18:12:49.363406 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:49.363372 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xqpjb"] Apr 17 18:12:49.366288 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:12:49.366231 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31cfe9ce_5d53_4e66_b4f4_bf27d97a6a1c.slice/crio-0fa1c114ba3677097cf0f413598244e086fc539886ee4721e6396f0e857bce59 WatchSource:0}: Error finding container 0fa1c114ba3677097cf0f413598244e086fc539886ee4721e6396f0e857bce59: Status 404 returned error can't find the container with id 0fa1c114ba3677097cf0f413598244e086fc539886ee4721e6396f0e857bce59 Apr 17 18:12:50.118826 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:50.118787 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xqpjb" event={"ID":"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c","Type":"ContainerStarted","Data":"0fa1c114ba3677097cf0f413598244e086fc539886ee4721e6396f0e857bce59"} Apr 17 18:12:51.122556 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:51.122507 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bmqm9" event={"ID":"b6f8c7ee-24c2-4294-928d-435c48c0b667","Type":"ContainerStarted","Data":"1fa79ff1f63df3792eb8fe2611e850d15ee0d99445aa82238bb39b59c828a558"} Apr 17 18:12:51.123731 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:51.123709 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xqpjb" event={"ID":"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c","Type":"ContainerStarted","Data":"e1da8524efc502dfef862cef35cb66377a448424bee7f2280c4871d2ef425f23"} Apr 17 18:12:51.141367 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:51.141312 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-runtime-extractor-bmqm9" podStartSLOduration=2.103261637 podStartE2EDuration="4.141297628s" podCreationTimestamp="2026-04-17 18:12:47 +0000 UTC" firstStartedPulling="2026-04-17 18:12:48.036952644 +0000 UTC m=+160.912345332" lastFinishedPulling="2026-04-17 18:12:50.074988631 +0000 UTC m=+162.950381323" observedRunningTime="2026-04-17 18:12:51.140873151 +0000 UTC m=+164.016265861" watchObservedRunningTime="2026-04-17 18:12:51.141297628 +0000 UTC m=+164.016690338" Apr 17 18:12:52.127955 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:52.127916 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xqpjb" event={"ID":"31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c","Type":"ContainerStarted","Data":"f2cf19e32e24df287f3715d156ee409d90fe58fa2fb8293a08a96ad940c5b006"} Apr 17 18:12:52.145834 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:52.145777 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xqpjb" podStartSLOduration=130.585776662 podStartE2EDuration="2m12.145763665s" podCreationTimestamp="2026-04-17 18:10:40 +0000 UTC" firstStartedPulling="2026-04-17 18:12:49.368389798 +0000 UTC m=+162.243782487" lastFinishedPulling="2026-04-17 18:12:50.9283768 +0000 UTC m=+163.803769490" observedRunningTime="2026-04-17 18:12:52.144532131 +0000 UTC m=+165.019924840" watchObservedRunningTime="2026-04-17 18:12:52.145763665 +0000 UTC m=+165.021156375" Apr 17 18:12:53.130702 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:53.130661 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m" Apr 17 18:12:53.131073 ip-10-0-133-142 
kubenswrapper[2583]: I0417 18:12:53.130704 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:12:53.133053 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:53.133027 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kcr7m\" (UID: \"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:53.398225 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:53.398140 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"
Apr 17 18:12:53.534686 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:53.534649 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m"]
Apr 17 18:12:53.537640 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:12:53.537607 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac6d7493_5df5_4aa0_8274_2bfb5f2d3b8a.slice/crio-2a3e3796482d04edfa5c6fe0e33629651bc5bdd472a80e28874b068f410a3a48 WatchSource:0}: Error finding container 2a3e3796482d04edfa5c6fe0e33629651bc5bdd472a80e28874b068f410a3a48: Status 404 returned error can't find the container with id 2a3e3796482d04edfa5c6fe0e33629651bc5bdd472a80e28874b068f410a3a48
Apr 17 18:12:54.136895 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:54.136810 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m" event={"ID":"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a","Type":"ContainerStarted","Data":"2a3e3796482d04edfa5c6fe0e33629651bc5bdd472a80e28874b068f410a3a48"}
Apr 17 18:12:56.143309 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:56.143254 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m" event={"ID":"ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a","Type":"ContainerStarted","Data":"ffd25bb12480bba4836222b98f8e6c2c4a328f54f760d5786bf41b2cf771fe64"}
Apr 17 18:12:56.160122 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:56.160069 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kcr7m" podStartSLOduration=33.497964767 podStartE2EDuration="35.160057562s" podCreationTimestamp="2026-04-17 18:12:21 +0000 UTC" firstStartedPulling="2026-04-17 18:12:53.539506709 +0000 UTC m=+166.414899397" lastFinishedPulling="2026-04-17 18:12:55.201599501 +0000 UTC m=+168.076992192" observedRunningTime="2026-04-17 18:12:56.15902339 +0000 UTC m=+169.034416099" watchObservedRunningTime="2026-04-17 18:12:56.160057562 +0000 UTC m=+169.035450272"
Apr 17 18:12:59.703487 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:59.703447 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x"
Apr 17 18:12:59.703914 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:59.703561 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:12:59.706336 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:59.706312 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hwgzd\""
Apr 17 18:12:59.714284 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:59.714255 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6f76l"
Apr 17 18:12:59.833645 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:12:59.833613 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6f76l"]
Apr 17 18:12:59.836435 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:12:59.836403 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f9df124_3418_493f_8f5e_bd5ea9df2004.slice/crio-383a7424a1613e305feccc34573546767ff966be0f52bdf85f102d1f71169777 WatchSource:0}: Error finding container 383a7424a1613e305feccc34573546767ff966be0f52bdf85f102d1f71169777: Status 404 returned error can't find the container with id 383a7424a1613e305feccc34573546767ff966be0f52bdf85f102d1f71169777
Apr 17 18:13:00.154258 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:00.154221 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6f76l" event={"ID":"1f9df124-3418-493f-8f5e-bd5ea9df2004","Type":"ContainerStarted","Data":"383a7424a1613e305feccc34573546767ff966be0f52bdf85f102d1f71169777"}
Apr 17 18:13:02.167482 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:02.167438 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6f76l" event={"ID":"1f9df124-3418-493f-8f5e-bd5ea9df2004","Type":"ContainerStarted","Data":"5233db4c4271487520e349eeb20ed0fc45927fd5c8b453c045b030f1779cec59"}
Apr 17 18:13:02.183198 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:02.183145 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6f76l" podStartSLOduration=140.654338948 podStartE2EDuration="2m22.183131296s" podCreationTimestamp="2026-04-17 18:10:40 +0000 UTC" firstStartedPulling="2026-04-17 18:12:59.838334747 +0000 UTC m=+172.713727436" lastFinishedPulling="2026-04-17 18:13:01.367127093 +0000 UTC m=+174.242519784" observedRunningTime="2026-04-17 18:13:02.181917475 +0000 UTC m=+175.057310186" watchObservedRunningTime="2026-04-17 18:13:02.183131296 +0000 UTC m=+175.058524006"
Apr 17 18:13:03.139801 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:03.139768 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xqpjb"
Apr 17 18:13:05.854037 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.854002 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6df67fc58c-smvkq"]
Apr 17 18:13:05.856036 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.856017 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:05.858633 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.858612 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 18:13:05.858761 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.858663 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 18:13:05.859655 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.859632 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 18:13:05.859753 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.859655 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 18:13:05.859753 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.859722 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 18:13:05.859866 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.859788 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lwnmm\""
Apr 17 18:13:05.859866 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.859816 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 18:13:05.859962 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.859919 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 18:13:05.865148 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.865131 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 18:13:05.868458 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.868369 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6df67fc58c-smvkq"]
Apr 17 18:13:05.925033 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.925001 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-oauth-serving-cert\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:05.925033 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.925048 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-serving-cert\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:05.925315 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.925144 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-config\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:05.925315 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.925187 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-oauth-config\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:05.925315 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.925236 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-service-ca\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:05.925315 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.925261 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-trusted-ca-bundle\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:05.925315 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:05.925313 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z97jm\" (UniqueName: \"kubernetes.io/projected/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-kube-api-access-z97jm\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.026663 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.026622 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-config\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.026866 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.026674 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-oauth-config\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.026866 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.026715 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-service-ca\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.026866 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.026735 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-trusted-ca-bundle\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.026866 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.026758 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z97jm\" (UniqueName: \"kubernetes.io/projected/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-kube-api-access-z97jm\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.026866 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.026791 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-oauth-serving-cert\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.026866 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.026836 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-serving-cert\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.027600 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.027561 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-service-ca\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.027720 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.027561 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-config\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.027720 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.027650 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-oauth-serving-cert\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.028398 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.028378 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-trusted-ca-bundle\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.029539 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.029515 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-serving-cert\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.029629 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.029566 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-oauth-config\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.035922 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.035898 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z97jm\" (UniqueName: \"kubernetes.io/projected/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-kube-api-access-z97jm\") pod \"console-6df67fc58c-smvkq\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.165102 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.165013 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6df67fc58c-smvkq"
Apr 17 18:13:06.291513 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:06.291475 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6df67fc58c-smvkq"]
Apr 17 18:13:06.294401 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:13:06.294367 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8579aae1_f6b0_45e0_abdb_41973cc8a6d9.slice/crio-5a0b8fa99b2166f1f0755dc1b8e0da0ae3aea430520030fe8c52d0b3f00c8155 WatchSource:0}: Error finding container 5a0b8fa99b2166f1f0755dc1b8e0da0ae3aea430520030fe8c52d0b3f00c8155: Status 404 returned error can't find the container with id 5a0b8fa99b2166f1f0755dc1b8e0da0ae3aea430520030fe8c52d0b3f00c8155
Apr 17 18:13:07.186242 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:07.186200 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df67fc58c-smvkq" event={"ID":"8579aae1-f6b0-45e0-abdb-41973cc8a6d9","Type":"ContainerStarted","Data":"5a0b8fa99b2166f1f0755dc1b8e0da0ae3aea430520030fe8c52d0b3f00c8155"}
Apr 17 18:13:09.119204 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:09.119172 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-b86858b4b-z8c9s"
Apr 17 18:13:09.193670 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:09.193640 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df67fc58c-smvkq" event={"ID":"8579aae1-f6b0-45e0-abdb-41973cc8a6d9","Type":"ContainerStarted","Data":"d4323e3dafb6d9841ce749659f9ec4049f6958dffbd76ac958d568f48afedd69"}
Apr 17 18:13:09.216880 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:09.216827 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6df67fc58c-smvkq" podStartSLOduration=1.750625865 podStartE2EDuration="4.21681296s" podCreationTimestamp="2026-04-17 18:13:05 +0000 UTC" firstStartedPulling="2026-04-17 18:13:06.296233645 +0000 UTC m=+179.171626333" lastFinishedPulling="2026-04-17 18:13:08.762420738 +0000 UTC m=+181.637813428" observedRunningTime="2026-04-17 18:13:09.215044604 +0000 UTC m=+182.090437324" watchObservedRunningTime="2026-04-17 18:13:09.21681296 +0000 UTC m=+182.092205670"
Apr 17 18:13:10.110196 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.110162 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jljrv"]
Apr 17 18:13:10.112503 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.112484 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.116380 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.116342 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 18:13:10.116380 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.116360 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 18:13:10.116568 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.116393 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 18:13:10.116568 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.116405 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-4t74p\""
Apr 17 18:13:10.116568 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.116352 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 18:13:10.130856 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.130829 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pqdwp"]
Apr 17 18:13:10.132915 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.132891 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jljrv"]
Apr 17 18:13:10.133026 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.132996 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.135681 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.135660 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 18:13:10.135967 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.135948 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 18:13:10.136042 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.135970 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-f528l\""
Apr 17 18:13:10.136042 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.135952 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 18:13:10.160471 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.160430 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.160639 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.160486 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p4n8\" (UniqueName: \"kubernetes.io/projected/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-api-access-7p4n8\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.160639 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.160530 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5b54589-b3db-48ee-8c9e-111a4446a476-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.160639 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.160556 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.160639 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.160589 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a5b54589-b3db-48ee-8c9e-111a4446a476-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.160844 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.160646 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.261022 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.260977 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-accelerators-collector-config\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.261022 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261026 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qt9k\" (UniqueName: \"kubernetes.io/projected/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-kube-api-access-2qt9k\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.261317 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261107 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7p4n8\" (UniqueName: \"kubernetes.io/projected/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-api-access-7p4n8\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.261317 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261167 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5b54589-b3db-48ee-8c9e-111a4446a476-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.261317 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261207 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.261317 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261228 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-sys\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.261514 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:13:10.261358 2583 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 17 18:13:10.261514 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261431 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a5b54589-b3db-48ee-8c9e-111a4446a476-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.261514 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:13:10.261448 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-tls podName:a5b54589-b3db-48ee-8c9e-111a4446a476 nodeName:}" failed. No retries permitted until 2026-04-17 18:13:10.761426306 +0000 UTC m=+183.636819007 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-jljrv" (UID: "a5b54589-b3db-48ee-8c9e-111a4446a476") : secret "kube-state-metrics-tls" not found
Apr 17 18:13:10.261514 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261493 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.261703 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261621 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-textfile\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.261744 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261722 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-metrics-client-ca\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.261818 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261803 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.261874 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261827 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-root\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.261874 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261846 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-wtmp\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.261963 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261874 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a5b54589-b3db-48ee-8c9e-111a4446a476-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.261963 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261888 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-tls\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.262032 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261947 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.262032 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.261967 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5b54589-b3db-48ee-8c9e-111a4446a476-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.262107 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.262091 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.264332 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.264304 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.272255 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.272232 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p4n8\" (UniqueName: \"kubernetes.io/projected/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-api-access-7p4n8\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv"
Apr 17 18:13:10.362943 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.362859 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.362943 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.362903 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-root\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.362943 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.362926 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-wtmp\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.363207 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.362956 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-tls\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.363207 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.362997 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-accelerators-collector-config\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.363207 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.363000 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-root\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.363207 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.363025 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qt9k\" (UniqueName: \"kubernetes.io/projected/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-kube-api-access-2qt9k\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.363207 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.363079 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-wtmp\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.363207 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.363144 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-sys\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp"
Apr 17 18:13:10.363207 ip-10-0-133-142
kubenswrapper[2583]: E0417 18:13:10.363155 2583 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 18:13:10.363540 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.363209 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-textfile\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp" Apr 17 18:13:10.363540 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.363209 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-sys\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp" Apr 17 18:13:10.363540 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:13:10.363235 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-tls podName:93258dde-b8e4-45f7-a919-f6cb6b76e9b2 nodeName:}" failed. No retries permitted until 2026-04-17 18:13:10.863214094 +0000 UTC m=+183.738606786 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-tls") pod "node-exporter-pqdwp" (UID: "93258dde-b8e4-45f7-a919-f6cb6b76e9b2") : secret "node-exporter-tls" not found Apr 17 18:13:10.363540 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.363321 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-metrics-client-ca\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp" Apr 17 18:13:10.363540 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.363455 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-textfile\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp" Apr 17 18:13:10.363763 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.363635 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-accelerators-collector-config\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp" Apr 17 18:13:10.363839 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.363822 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-metrics-client-ca\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp" Apr 17 18:13:10.365596 ip-10-0-133-142 kubenswrapper[2583]: I0417 
18:13:10.365577 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp" Apr 17 18:13:10.372815 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.372792 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qt9k\" (UniqueName: \"kubernetes.io/projected/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-kube-api-access-2qt9k\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp" Apr 17 18:13:10.766612 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.766535 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv" Apr 17 18:13:10.769170 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.769147 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5b54589-b3db-48ee-8c9e-111a4446a476-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jljrv\" (UID: \"a5b54589-b3db-48ee-8c9e-111a4446a476\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv" Apr 17 18:13:10.867639 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.867602 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-tls\") pod \"node-exporter-pqdwp\" (UID: 
\"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp" Apr 17 18:13:10.870147 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:10.870124 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93258dde-b8e4-45f7-a919-f6cb6b76e9b2-node-exporter-tls\") pod \"node-exporter-pqdwp\" (UID: \"93258dde-b8e4-45f7-a919-f6cb6b76e9b2\") " pod="openshift-monitoring/node-exporter-pqdwp" Apr 17 18:13:11.022266 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:11.022182 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv" Apr 17 18:13:11.042193 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:11.042160 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pqdwp" Apr 17 18:13:11.052731 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:13:11.052699 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93258dde_b8e4_45f7_a919_f6cb6b76e9b2.slice/crio-c280ff6d96198498b112762c69ab6f59da1818ada795a7edc5d4371783db093b WatchSource:0}: Error finding container c280ff6d96198498b112762c69ab6f59da1818ada795a7edc5d4371783db093b: Status 404 returned error can't find the container with id c280ff6d96198498b112762c69ab6f59da1818ada795a7edc5d4371783db093b Apr 17 18:13:11.154444 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:11.154411 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jljrv"] Apr 17 18:13:11.157181 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:13:11.157152 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5b54589_b3db_48ee_8c9e_111a4446a476.slice/crio-e7a87808dec0ff0511f462d0ac53c204be9b388d100918861966843d08af9471 
WatchSource:0}: Error finding container e7a87808dec0ff0511f462d0ac53c204be9b388d100918861966843d08af9471: Status 404 returned error can't find the container with id e7a87808dec0ff0511f462d0ac53c204be9b388d100918861966843d08af9471 Apr 17 18:13:11.200388 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:11.200351 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pqdwp" event={"ID":"93258dde-b8e4-45f7-a919-f6cb6b76e9b2","Type":"ContainerStarted","Data":"c280ff6d96198498b112762c69ab6f59da1818ada795a7edc5d4371783db093b"} Apr 17 18:13:11.201517 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:11.201493 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv" event={"ID":"a5b54589-b3db-48ee-8c9e-111a4446a476","Type":"ContainerStarted","Data":"e7a87808dec0ff0511f462d0ac53c204be9b388d100918861966843d08af9471"} Apr 17 18:13:12.087622 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.087543 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-59d88d4c87-s2p4x"] Apr 17 18:13:12.090898 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.090882 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.093310 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.093262 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 18:13:12.093461 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.093268 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 18:13:12.093461 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.093321 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 18:13:12.093777 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.093759 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-54lht\"" Apr 17 18:13:12.093877 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.093788 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-5hd1n76m0fnqn\"" Apr 17 18:13:12.093877 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.093787 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 18:13:12.093877 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.093788 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 18:13:12.100007 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.099984 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-59d88d4c87-s2p4x"] Apr 17 18:13:12.178855 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.178822 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.178855 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.178865 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.179337 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.178889 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-grpc-tls\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.179337 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.178949 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/54c06acc-691c-41df-b0a5-5e3bed98acf5-metrics-client-ca\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.179337 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.179049 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.179337 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.179089 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq5zr\" (UniqueName: \"kubernetes.io/projected/54c06acc-691c-41df-b0a5-5e3bed98acf5-kube-api-access-zq5zr\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.179337 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.179119 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.179337 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.179232 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-tls\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.206187 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.206157 2583 generic.go:358] "Generic (PLEG): container finished" podID="93258dde-b8e4-45f7-a919-f6cb6b76e9b2" containerID="54e2a908543f63a3a755d4f92cb94f1a713b2aa83e562821fd784e3a3655779c" exitCode=0 Apr 
17 18:13:12.206351 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.206212 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pqdwp" event={"ID":"93258dde-b8e4-45f7-a919-f6cb6b76e9b2","Type":"ContainerDied","Data":"54e2a908543f63a3a755d4f92cb94f1a713b2aa83e562821fd784e3a3655779c"} Apr 17 18:13:12.280118 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.280083 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.280340 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.280127 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.280418 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.280350 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-grpc-tls\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.280418 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.280384 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/54c06acc-691c-41df-b0a5-5e3bed98acf5-metrics-client-ca\") 
pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.280527 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.280480 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.280585 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.280525 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq5zr\" (UniqueName: \"kubernetes.io/projected/54c06acc-691c-41df-b0a5-5e3bed98acf5-kube-api-access-zq5zr\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.280585 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.280560 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.280713 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.280694 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-tls\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.281090 
ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.281046 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/54c06acc-691c-41df-b0a5-5e3bed98acf5-metrics-client-ca\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.283832 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.283770 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-grpc-tls\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.284222 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.284190 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.284222 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.284190 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.284565 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.284542 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.284702 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.284683 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-tls\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.284870 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.284844 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/54c06acc-691c-41df-b0a5-5e3bed98acf5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.288758 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.288734 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq5zr\" (UniqueName: \"kubernetes.io/projected/54c06acc-691c-41df-b0a5-5e3bed98acf5-kube-api-access-zq5zr\") pod \"thanos-querier-59d88d4c87-s2p4x\" (UID: \"54c06acc-691c-41df-b0a5-5e3bed98acf5\") " pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.400023 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.399931 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:12.572679 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:12.572653 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-59d88d4c87-s2p4x"] Apr 17 18:13:12.574565 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:13:12.574539 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54c06acc_691c_41df_b0a5_5e3bed98acf5.slice/crio-8780a3610ec92a07bb0c8ecd30a3691a198d880a7822f5a9392aa1716562f4ad WatchSource:0}: Error finding container 8780a3610ec92a07bb0c8ecd30a3691a198d880a7822f5a9392aa1716562f4ad: Status 404 returned error can't find the container with id 8780a3610ec92a07bb0c8ecd30a3691a198d880a7822f5a9392aa1716562f4ad Apr 17 18:13:13.211483 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:13.211446 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pqdwp" event={"ID":"93258dde-b8e4-45f7-a919-f6cb6b76e9b2","Type":"ContainerStarted","Data":"28509e49ea0ecad35117af12d6a8576a98c84d814896742bbdde7ba70e291936"} Apr 17 18:13:13.211942 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:13.211492 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pqdwp" event={"ID":"93258dde-b8e4-45f7-a919-f6cb6b76e9b2","Type":"ContainerStarted","Data":"89931ffa1cea34a6ee125e56bb75eae4c3c2442f40e1c45cffa012e90a68fd0a"} Apr 17 18:13:13.213600 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:13.213565 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv" event={"ID":"a5b54589-b3db-48ee-8c9e-111a4446a476","Type":"ContainerStarted","Data":"9501eef0f309cd02a52a9e5fcce9b6abece1d731d34990b1fbfb6b805ca200e2"} Apr 17 18:13:13.213737 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:13.213608 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv" event={"ID":"a5b54589-b3db-48ee-8c9e-111a4446a476","Type":"ContainerStarted","Data":"5be48292e231c583c2a59193192286eda4416e799a78fe8ba33ec0631ad93291"} Apr 17 18:13:13.213737 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:13.213622 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv" event={"ID":"a5b54589-b3db-48ee-8c9e-111a4446a476","Type":"ContainerStarted","Data":"a624b8dea38579f4a6d01db5c9a6238ac87e703e55ddf525847b692760582298"} Apr 17 18:13:13.214734 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:13.214676 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" event={"ID":"54c06acc-691c-41df-b0a5-5e3bed98acf5","Type":"ContainerStarted","Data":"8780a3610ec92a07bb0c8ecd30a3691a198d880a7822f5a9392aa1716562f4ad"} Apr 17 18:13:13.232694 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:13.232640 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pqdwp" podStartSLOduration=2.4571082 podStartE2EDuration="3.232625498s" podCreationTimestamp="2026-04-17 18:13:10 +0000 UTC" firstStartedPulling="2026-04-17 18:13:11.054248065 +0000 UTC m=+183.929640760" lastFinishedPulling="2026-04-17 18:13:11.829765356 +0000 UTC m=+184.705158058" observedRunningTime="2026-04-17 18:13:13.231405057 +0000 UTC m=+186.106797792" watchObservedRunningTime="2026-04-17 18:13:13.232625498 +0000 UTC m=+186.108018207" Apr 17 18:13:13.249824 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:13.249772 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-jljrv" podStartSLOduration=1.919280939 podStartE2EDuration="3.249756453s" podCreationTimestamp="2026-04-17 18:13:10 +0000 UTC" firstStartedPulling="2026-04-17 18:13:11.159234672 +0000 UTC m=+184.034627372" 
lastFinishedPulling="2026-04-17 18:13:12.489710195 +0000 UTC m=+185.365102886" observedRunningTime="2026-04-17 18:13:13.248211957 +0000 UTC m=+186.123604667" watchObservedRunningTime="2026-04-17 18:13:13.249756453 +0000 UTC m=+186.125149163" Apr 17 18:13:15.222830 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:15.222796 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" event={"ID":"54c06acc-691c-41df-b0a5-5e3bed98acf5","Type":"ContainerStarted","Data":"79b178c49da1dd10f057c5003de347905db4ffc6ba8d72e60a89718b06b4cdcd"} Apr 17 18:13:15.222830 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:15.222834 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" event={"ID":"54c06acc-691c-41df-b0a5-5e3bed98acf5","Type":"ContainerStarted","Data":"9a609ee24ed4f622f14108e98c7091dc3607a82179e21c4862866d5ae560a745"} Apr 17 18:13:15.222830 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:15.222844 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" event={"ID":"54c06acc-691c-41df-b0a5-5e3bed98acf5","Type":"ContainerStarted","Data":"cba8850eac8b3ab6d6a0cf120e384f9c2638e32ee9c75b7f5cec8f03a50abe31"} Apr 17 18:13:16.165509 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:16.165465 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6df67fc58c-smvkq" Apr 17 18:13:16.165509 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:16.165518 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6df67fc58c-smvkq" Apr 17 18:13:16.170246 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:16.170223 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6df67fc58c-smvkq" Apr 17 18:13:16.227952 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:16.227909 2583 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" event={"ID":"54c06acc-691c-41df-b0a5-5e3bed98acf5","Type":"ContainerStarted","Data":"1098cf61e18c769cc0fc149e5fd7856531e4c7f07f6b5de8c372a642ac2b96bc"} Apr 17 18:13:16.227952 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:16.227959 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" event={"ID":"54c06acc-691c-41df-b0a5-5e3bed98acf5","Type":"ContainerStarted","Data":"e9bf6bf7deb2a3b9193812b20f6295920f471748842297bdcf1b9b1444271137"} Apr 17 18:13:16.228433 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:16.227975 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" event={"ID":"54c06acc-691c-41df-b0a5-5e3bed98acf5","Type":"ContainerStarted","Data":"981c133957b6772af31bf62e8f2ff520583ef47b429a829db48386c1183dde95"} Apr 17 18:13:16.231903 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:16.231878 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6df67fc58c-smvkq" Apr 17 18:13:16.249359 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:16.249310 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" podStartSLOduration=1.412632531 podStartE2EDuration="4.249297521s" podCreationTimestamp="2026-04-17 18:13:12 +0000 UTC" firstStartedPulling="2026-04-17 18:13:12.576685679 +0000 UTC m=+185.452078368" lastFinishedPulling="2026-04-17 18:13:15.41335067 +0000 UTC m=+188.288743358" observedRunningTime="2026-04-17 18:13:16.247561513 +0000 UTC m=+189.122954224" watchObservedRunningTime="2026-04-17 18:13:16.249297521 +0000 UTC m=+189.124690222" Apr 17 18:13:17.231221 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:17.231189 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:23.240078 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:23.240049 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-59d88d4c87-s2p4x" Apr 17 18:13:28.101224 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:28.101194 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6df67fc58c-smvkq"] Apr 17 18:13:50.326179 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:50.326141 2583 generic.go:358] "Generic (PLEG): container finished" podID="212d9a00-537f-41e9-b6bf-b14feb7f40a5" containerID="eb13766cd85b4aba39e53bcaf6e79408772fe9ba3e47c3d88423260633754910" exitCode=0 Apr 17 18:13:50.326630 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:50.326216 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x" event={"ID":"212d9a00-537f-41e9-b6bf-b14feb7f40a5","Type":"ContainerDied","Data":"eb13766cd85b4aba39e53bcaf6e79408772fe9ba3e47c3d88423260633754910"} Apr 17 18:13:50.326630 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:50.326571 2583 scope.go:117] "RemoveContainer" containerID="eb13766cd85b4aba39e53bcaf6e79408772fe9ba3e47c3d88423260633754910" Apr 17 18:13:51.330177 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:51.330137 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z2z5x" event={"ID":"212d9a00-537f-41e9-b6bf-b14feb7f40a5","Type":"ContainerStarted","Data":"e65e75485ae394be562ff70ffe85fafcfad269d8bd5f6f9b5611848ef19ae88d"} Apr 17 18:13:53.120634 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.120594 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6df67fc58c-smvkq" podUID="8579aae1-f6b0-45e0-abdb-41973cc8a6d9" 
containerName="console" containerID="cri-o://d4323e3dafb6d9841ce749659f9ec4049f6958dffbd76ac958d568f48afedd69" gracePeriod=15 Apr 17 18:13:53.336397 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.336375 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6df67fc58c-smvkq_8579aae1-f6b0-45e0-abdb-41973cc8a6d9/console/0.log" Apr 17 18:13:53.336508 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.336416 2583 generic.go:358] "Generic (PLEG): container finished" podID="8579aae1-f6b0-45e0-abdb-41973cc8a6d9" containerID="d4323e3dafb6d9841ce749659f9ec4049f6958dffbd76ac958d568f48afedd69" exitCode=2 Apr 17 18:13:53.336508 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.336481 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df67fc58c-smvkq" event={"ID":"8579aae1-f6b0-45e0-abdb-41973cc8a6d9","Type":"ContainerDied","Data":"d4323e3dafb6d9841ce749659f9ec4049f6958dffbd76ac958d568f48afedd69"} Apr 17 18:13:53.349859 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.349838 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6df67fc58c-smvkq_8579aae1-f6b0-45e0-abdb-41973cc8a6d9/console/0.log" Apr 17 18:13:53.349969 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.349906 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6df67fc58c-smvkq" Apr 17 18:13:53.433311 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.433206 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-oauth-serving-cert\") pod \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " Apr 17 18:13:53.433311 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.433256 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-trusted-ca-bundle\") pod \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " Apr 17 18:13:53.433311 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.433299 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-serving-cert\") pod \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " Apr 17 18:13:53.433580 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.433317 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-oauth-config\") pod \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " Apr 17 18:13:53.433580 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.433354 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z97jm\" (UniqueName: \"kubernetes.io/projected/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-kube-api-access-z97jm\") pod \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " Apr 17 
18:13:53.433580 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.433393 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-config\") pod \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " Apr 17 18:13:53.433580 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.433491 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-service-ca\") pod \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\" (UID: \"8579aae1-f6b0-45e0-abdb-41973cc8a6d9\") " Apr 17 18:13:53.433757 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.433733 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8579aae1-f6b0-45e0-abdb-41973cc8a6d9" (UID: "8579aae1-f6b0-45e0-abdb-41973cc8a6d9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:13:53.433869 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.433841 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8579aae1-f6b0-45e0-abdb-41973cc8a6d9" (UID: "8579aae1-f6b0-45e0-abdb-41973cc8a6d9"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:13:53.433869 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.433860 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-config" (OuterVolumeSpecName: "console-config") pod "8579aae1-f6b0-45e0-abdb-41973cc8a6d9" (UID: "8579aae1-f6b0-45e0-abdb-41973cc8a6d9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:13:53.434061 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.433956 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-service-ca" (OuterVolumeSpecName: "service-ca") pod "8579aae1-f6b0-45e0-abdb-41973cc8a6d9" (UID: "8579aae1-f6b0-45e0-abdb-41973cc8a6d9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:13:53.435930 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.435900 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-kube-api-access-z97jm" (OuterVolumeSpecName: "kube-api-access-z97jm") pod "8579aae1-f6b0-45e0-abdb-41973cc8a6d9" (UID: "8579aae1-f6b0-45e0-abdb-41973cc8a6d9"). InnerVolumeSpecName "kube-api-access-z97jm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:13:53.435930 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.435924 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8579aae1-f6b0-45e0-abdb-41973cc8a6d9" (UID: "8579aae1-f6b0-45e0-abdb-41973cc8a6d9"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:13:53.436044 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.435944 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8579aae1-f6b0-45e0-abdb-41973cc8a6d9" (UID: "8579aae1-f6b0-45e0-abdb-41973cc8a6d9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:13:53.534654 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.534608 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-oauth-serving-cert\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:13:53.534654 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.534650 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-trusted-ca-bundle\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:13:53.534654 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.534661 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-serving-cert\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:13:53.534654 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.534670 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-oauth-config\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:13:53.534907 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.534680 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z97jm\" (UniqueName: 
\"kubernetes.io/projected/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-kube-api-access-z97jm\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:13:53.534907 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.534690 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-console-config\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:13:53.534907 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:53.534699 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8579aae1-f6b0-45e0-abdb-41973cc8a6d9-service-ca\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:13:54.340267 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:54.340237 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6df67fc58c-smvkq_8579aae1-f6b0-45e0-abdb-41973cc8a6d9/console/0.log" Apr 17 18:13:54.340700 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:54.340324 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6df67fc58c-smvkq" event={"ID":"8579aae1-f6b0-45e0-abdb-41973cc8a6d9","Type":"ContainerDied","Data":"5a0b8fa99b2166f1f0755dc1b8e0da0ae3aea430520030fe8c52d0b3f00c8155"} Apr 17 18:13:54.340700 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:54.340366 2583 scope.go:117] "RemoveContainer" containerID="d4323e3dafb6d9841ce749659f9ec4049f6958dffbd76ac958d568f48afedd69" Apr 17 18:13:54.340700 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:54.340369 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6df67fc58c-smvkq" Apr 17 18:13:54.358668 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:54.358635 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6df67fc58c-smvkq"] Apr 17 18:13:54.365396 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:54.365372 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6df67fc58c-smvkq"] Apr 17 18:13:55.344604 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:55.344568 2583 generic.go:358] "Generic (PLEG): container finished" podID="a013aac6-7414-4d90-8f7d-99be6094d204" containerID="39497d3ebb3c1fabd34bb74d51307a57f6d56fa8b3905c6196afa7e532296f7d" exitCode=0 Apr 17 18:13:55.345058 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:55.344645 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz" event={"ID":"a013aac6-7414-4d90-8f7d-99be6094d204","Type":"ContainerDied","Data":"39497d3ebb3c1fabd34bb74d51307a57f6d56fa8b3905c6196afa7e532296f7d"} Apr 17 18:13:55.345058 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:55.345016 2583 scope.go:117] "RemoveContainer" containerID="39497d3ebb3c1fabd34bb74d51307a57f6d56fa8b3905c6196afa7e532296f7d" Apr 17 18:13:55.707595 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:55.707520 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8579aae1-f6b0-45e0-abdb-41973cc8a6d9" path="/var/lib/kubelet/pods/8579aae1-f6b0-45e0-abdb-41973cc8a6d9/volumes" Apr 17 18:13:56.350019 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:13:56.349977 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mmfmz" event={"ID":"a013aac6-7414-4d90-8f7d-99be6094d204","Type":"ContainerStarted","Data":"7c86423c8f23537e4b0e375041ea01a8069fcdd6fe0b0aad825799ee09431256"} Apr 17 18:14:19.543909 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:19.543860 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x" Apr 17 18:14:19.546442 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:19.546420 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ea7fcad-19ae-42ab-8026-113afe4c2f23-metrics-certs\") pod \"network-metrics-daemon-6d44x\" (UID: \"2ea7fcad-19ae-42ab-8026-113afe4c2f23\") " pod="openshift-multus/network-metrics-daemon-6d44x" Apr 17 18:14:19.807031 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:19.806998 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-c85k4\"" Apr 17 18:14:19.814958 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:19.814938 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6d44x" Apr 17 18:14:19.937037 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:19.936938 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6d44x"] Apr 17 18:14:19.939579 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:14:19.939536 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ea7fcad_19ae_42ab_8026_113afe4c2f23.slice/crio-33a393cca96b7db923fac95c2c8c083d1694dd0115cd63218e14d87fb8578534 WatchSource:0}: Error finding container 33a393cca96b7db923fac95c2c8c083d1694dd0115cd63218e14d87fb8578534: Status 404 returned error can't find the container with id 33a393cca96b7db923fac95c2c8c083d1694dd0115cd63218e14d87fb8578534 Apr 17 18:14:20.417392 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:20.417350 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6d44x" event={"ID":"2ea7fcad-19ae-42ab-8026-113afe4c2f23","Type":"ContainerStarted","Data":"33a393cca96b7db923fac95c2c8c083d1694dd0115cd63218e14d87fb8578534"} Apr 17 18:14:21.422049 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:21.422010 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6d44x" event={"ID":"2ea7fcad-19ae-42ab-8026-113afe4c2f23","Type":"ContainerStarted","Data":"14cf26ed5032078a3a9815f868c0451eb571102e76d7a21b9d5339e06e7b9405"} Apr 17 18:14:21.422049 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:21.422052 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6d44x" event={"ID":"2ea7fcad-19ae-42ab-8026-113afe4c2f23","Type":"ContainerStarted","Data":"3296471e1338cfa052063dcb9ce092d8ca325ce68214e9d1b8f968c0441559e7"} Apr 17 18:14:21.438339 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:21.438289 2583 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-6d44x" podStartSLOduration=253.472704861 podStartE2EDuration="4m14.438258456s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:14:19.941503676 +0000 UTC m=+252.816896372" lastFinishedPulling="2026-04-17 18:14:20.907057279 +0000 UTC m=+253.782449967" observedRunningTime="2026-04-17 18:14:21.437012477 +0000 UTC m=+254.312405184" watchObservedRunningTime="2026-04-17 18:14:21.438258456 +0000 UTC m=+254.313651349" Apr 17 18:14:34.626956 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.626872 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-68989df595-zklhs"] Apr 17 18:14:34.627513 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.627207 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8579aae1-f6b0-45e0-abdb-41973cc8a6d9" containerName="console" Apr 17 18:14:34.627513 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.627222 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="8579aae1-f6b0-45e0-abdb-41973cc8a6d9" containerName="console" Apr 17 18:14:34.627513 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.627304 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="8579aae1-f6b0-45e0-abdb-41973cc8a6d9" containerName="console" Apr 17 18:14:34.631611 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.631592 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.634263 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.634244 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-jjvwm\"" Apr 17 18:14:34.634472 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.634454 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 18:14:34.634896 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.634875 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 18:14:34.635001 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.634958 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 18:14:34.635086 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.635072 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 18:14:34.635179 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.635161 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 18:14:34.640443 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.640424 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 18:14:34.644915 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.644896 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-68989df595-zklhs"] Apr 17 18:14:34.765146 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.765104 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2a09275-636d-44a9-8780-410b1d31715f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.765348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.765157 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.765348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.765183 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-secret-telemeter-client\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.765348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.765231 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-telemeter-client-tls\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.765348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.765305 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sl4z\" (UniqueName: 
\"kubernetes.io/projected/b2a09275-636d-44a9-8780-410b1d31715f-kube-api-access-8sl4z\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.765497 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.765354 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-federate-client-tls\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.765497 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.765376 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2a09275-636d-44a9-8780-410b1d31715f-serving-certs-ca-bundle\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.765497 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.765401 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2a09275-636d-44a9-8780-410b1d31715f-metrics-client-ca\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.866032 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.865997 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68989df595-zklhs\" 
(UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.866032 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.866042 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-secret-telemeter-client\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.866296 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.866067 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-telemeter-client-tls\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.866296 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.866085 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8sl4z\" (UniqueName: \"kubernetes.io/projected/b2a09275-636d-44a9-8780-410b1d31715f-kube-api-access-8sl4z\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.866296 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.866115 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-federate-client-tls\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.866296 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.866140 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2a09275-636d-44a9-8780-410b1d31715f-serving-certs-ca-bundle\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.866296 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.866158 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2a09275-636d-44a9-8780-410b1d31715f-metrics-client-ca\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.866296 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.866224 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2a09275-636d-44a9-8780-410b1d31715f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.867125 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.867095 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2a09275-636d-44a9-8780-410b1d31715f-serving-certs-ca-bundle\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.867400 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.867366 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2a09275-636d-44a9-8780-410b1d31715f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68989df595-zklhs\" 
(UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.867888 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.867861 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2a09275-636d-44a9-8780-410b1d31715f-metrics-client-ca\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.868709 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.868683 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.868859 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.868841 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-secret-telemeter-client\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.868943 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.868925 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-telemeter-client-tls\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.868995 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.868938 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b2a09275-636d-44a9-8780-410b1d31715f-federate-client-tls\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.878595 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.878539 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sl4z\" (UniqueName: \"kubernetes.io/projected/b2a09275-636d-44a9-8780-410b1d31715f-kube-api-access-8sl4z\") pod \"telemeter-client-68989df595-zklhs\" (UID: \"b2a09275-636d-44a9-8780-410b1d31715f\") " pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:34.942357 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:34.942320 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-68989df595-zklhs" Apr 17 18:14:35.086588 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:35.086556 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-68989df595-zklhs"] Apr 17 18:14:35.088093 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:14:35.088068 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a09275_636d_44a9_8780_410b1d31715f.slice/crio-136b6540366c7707d3b7384799c945eb79e6432f3d4beac2d641aaf54936885a WatchSource:0}: Error finding container 136b6540366c7707d3b7384799c945eb79e6432f3d4beac2d641aaf54936885a: Status 404 returned error can't find the container with id 136b6540366c7707d3b7384799c945eb79e6432f3d4beac2d641aaf54936885a Apr 17 18:14:35.464340 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:35.464296 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68989df595-zklhs" 
event={"ID":"b2a09275-636d-44a9-8780-410b1d31715f","Type":"ContainerStarted","Data":"136b6540366c7707d3b7384799c945eb79e6432f3d4beac2d641aaf54936885a"} Apr 17 18:14:37.472817 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:37.472773 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68989df595-zklhs" event={"ID":"b2a09275-636d-44a9-8780-410b1d31715f","Type":"ContainerStarted","Data":"30cf79fac94cd5cefa73294520d73e8466858716048281b1d7269f158fd287f2"} Apr 17 18:14:38.477602 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:38.477569 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68989df595-zklhs" event={"ID":"b2a09275-636d-44a9-8780-410b1d31715f","Type":"ContainerStarted","Data":"bd0d71279d32158d7b0eb8785549862940bc21fc1c2c2b4dddaab2d8bd0bb00f"} Apr 17 18:14:38.478037 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:38.477611 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68989df595-zklhs" event={"ID":"b2a09275-636d-44a9-8780-410b1d31715f","Type":"ContainerStarted","Data":"806e2891604c947b90a692e4cbb99969ee5ecff19144e3370a24bbe8ea504a57"} Apr 17 18:14:38.522888 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:38.522827 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-68989df595-zklhs" podStartSLOduration=1.6721281970000001 podStartE2EDuration="4.522807502s" podCreationTimestamp="2026-04-17 18:14:34 +0000 UTC" firstStartedPulling="2026-04-17 18:14:35.090030509 +0000 UTC m=+267.965423196" lastFinishedPulling="2026-04-17 18:14:37.940709802 +0000 UTC m=+270.816102501" observedRunningTime="2026-04-17 18:14:38.521518253 +0000 UTC m=+271.396910963" watchObservedRunningTime="2026-04-17 18:14:38.522807502 +0000 UTC m=+271.398200213" Apr 17 18:14:39.289336 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.289303 2583 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-7679fdc464-rqlf6"] Apr 17 18:14:39.292679 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.292664 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.299910 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.299886 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 18:14:39.300079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.299895 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lwnmm\"" Apr 17 18:14:39.300160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.299964 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 18:14:39.300160 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.299983 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 18:14:39.300263 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.300005 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 18:14:39.300263 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.300021 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 18:14:39.301408 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.301378 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 18:14:39.301507 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.301417 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 18:14:39.307695 ip-10-0-133-142 kubenswrapper[2583]: I0417 
18:14:39.307676 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 18:14:39.333504 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.333476 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7679fdc464-rqlf6"] Apr 17 18:14:39.405440 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.405410 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bcc4\" (UniqueName: \"kubernetes.io/projected/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-kube-api-access-2bcc4\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.405610 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.405452 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-serving-cert\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.405610 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.405513 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-config\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.405610 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.405559 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-service-ca\") pod \"console-7679fdc464-rqlf6\" (UID: 
\"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.405610 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.405590 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-trusted-ca-bundle\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.405769 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.405627 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-oauth-config\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.405769 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.405651 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-oauth-serving-cert\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.506154 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.506101 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-config\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.506154 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.506164 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-service-ca\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.506655 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.506185 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-trusted-ca-bundle\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.506655 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.506206 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-oauth-config\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.506655 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.506234 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-oauth-serving-cert\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.506655 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.506261 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bcc4\" (UniqueName: \"kubernetes.io/projected/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-kube-api-access-2bcc4\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.506655 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.506316 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-serving-cert\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.507582 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.507558 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-config\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.507754 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.507719 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-service-ca\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.507946 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.507795 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-oauth-serving-cert\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.507946 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.507804 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-trusted-ca-bundle\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.508755 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.508734 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-oauth-config\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.509062 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.509044 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-serving-cert\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.520247 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.520220 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bcc4\" (UniqueName: \"kubernetes.io/projected/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-kube-api-access-2bcc4\") pod \"console-7679fdc464-rqlf6\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.602385 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.602349 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:39.754562 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:39.754540 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7679fdc464-rqlf6"] Apr 17 18:14:39.756596 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:14:39.756569 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod147d1a4a_0a80_4f36_a2f6_bb146eaf3fe3.slice/crio-d3607d3ac3bc4037349fbc5e90acde9bfb1b0651ab58791b151a13b93d097d35 WatchSource:0}: Error finding container d3607d3ac3bc4037349fbc5e90acde9bfb1b0651ab58791b151a13b93d097d35: Status 404 returned error can't find the container with id d3607d3ac3bc4037349fbc5e90acde9bfb1b0651ab58791b151a13b93d097d35 Apr 17 18:14:40.484484 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:40.484443 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7679fdc464-rqlf6" event={"ID":"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3","Type":"ContainerStarted","Data":"80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1"} Apr 17 18:14:40.484484 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:40.484480 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7679fdc464-rqlf6" event={"ID":"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3","Type":"ContainerStarted","Data":"d3607d3ac3bc4037349fbc5e90acde9bfb1b0651ab58791b151a13b93d097d35"} Apr 17 18:14:49.603402 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:49.603355 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:49.603402 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:49.603411 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:49.608150 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:49.608119 2583 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:14:49.636835 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:49.636782 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7679fdc464-rqlf6" podStartSLOduration=10.6367681 podStartE2EDuration="10.6367681s" podCreationTimestamp="2026-04-17 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:14:40.525342069 +0000 UTC m=+273.400734809" watchObservedRunningTime="2026-04-17 18:14:49.6367681 +0000 UTC m=+282.512160810" Apr 17 18:14:50.519904 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:14:50.519874 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:15:07.590462 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:07.590430 2583 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 18:15:47.397954 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.397918 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-756d58777d-vdlcn"] Apr 17 18:15:47.401031 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.401013 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.419079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.419048 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756d58777d-vdlcn"] Apr 17 18:15:47.568081 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.568040 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-service-ca\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.568314 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.568090 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-oauth-serving-cert\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.568314 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.568155 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dtjm\" (UniqueName: \"kubernetes.io/projected/ef4c3052-8bf0-40c7-b362-fea4c3063c25-kube-api-access-6dtjm\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.568314 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.568215 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-config\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 
18:15:47.568314 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.568245 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-oauth-config\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.568510 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.568356 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-serving-cert\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.568510 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.568392 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-trusted-ca-bundle\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.668901 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.668811 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-serving-cert\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.668901 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.668858 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-trusted-ca-bundle\") 
pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.669097 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.668986 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-service-ca\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.669097 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.669043 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-oauth-serving-cert\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.669097 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.669076 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dtjm\" (UniqueName: \"kubernetes.io/projected/ef4c3052-8bf0-40c7-b362-fea4c3063c25-kube-api-access-6dtjm\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.669244 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.669111 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-config\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.669244 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.669147 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-oauth-config\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.669940 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.669872 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-trusted-ca-bundle\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.669940 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.669886 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-service-ca\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.669940 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.669886 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-config\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.669940 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.669942 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-oauth-serving-cert\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.671652 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.671631 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-serving-cert\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.671801 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.671776 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-oauth-config\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.679683 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.679660 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dtjm\" (UniqueName: \"kubernetes.io/projected/ef4c3052-8bf0-40c7-b362-fea4c3063c25-kube-api-access-6dtjm\") pod \"console-756d58777d-vdlcn\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.709524 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.709495 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:47.834915 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.834885 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756d58777d-vdlcn"] Apr 17 18:15:47.838520 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:15:47.838489 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef4c3052_8bf0_40c7_b362_fea4c3063c25.slice/crio-83e45d9cfa26c9075c706faa05841eb349d90af51bee8fdee546cd0578589ca5 WatchSource:0}: Error finding container 83e45d9cfa26c9075c706faa05841eb349d90af51bee8fdee546cd0578589ca5: Status 404 returned error can't find the container with id 83e45d9cfa26c9075c706faa05841eb349d90af51bee8fdee546cd0578589ca5 Apr 17 18:15:47.840348 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:47.840323 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:15:48.678866 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:48.678821 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756d58777d-vdlcn" event={"ID":"ef4c3052-8bf0-40c7-b362-fea4c3063c25","Type":"ContainerStarted","Data":"db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a"} Apr 17 18:15:48.679224 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:48.678873 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756d58777d-vdlcn" event={"ID":"ef4c3052-8bf0-40c7-b362-fea4c3063c25","Type":"ContainerStarted","Data":"83e45d9cfa26c9075c706faa05841eb349d90af51bee8fdee546cd0578589ca5"} Apr 17 18:15:48.702805 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:48.702750 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-756d58777d-vdlcn" podStartSLOduration=1.7027361 podStartE2EDuration="1.7027361s" podCreationTimestamp="2026-04-17 18:15:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:15:48.701525078 +0000 UTC m=+341.576917803" watchObservedRunningTime="2026-04-17 18:15:48.7027361 +0000 UTC m=+341.578128810" Apr 17 18:15:57.709675 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:57.709595 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:57.709675 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:57.709636 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:57.714166 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:57.714144 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:58.714898 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:58.714866 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:15:58.781300 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:15:58.779080 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7679fdc464-rqlf6"] Apr 17 18:16:23.803738 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:23.803672 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7679fdc464-rqlf6" podUID="147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" containerName="console" containerID="cri-o://80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1" gracePeriod=15 Apr 17 18:16:24.039286 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.039258 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7679fdc464-rqlf6_147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3/console/0.log" Apr 17 18:16:24.039426 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.039348 2583 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:16:24.185437 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.185347 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bcc4\" (UniqueName: \"kubernetes.io/projected/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-kube-api-access-2bcc4\") pod \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " Apr 17 18:16:24.185437 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.185389 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-config\") pod \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " Apr 17 18:16:24.185663 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.185474 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-oauth-serving-cert\") pod \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " Apr 17 18:16:24.185663 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.185507 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-trusted-ca-bundle\") pod \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " Apr 17 18:16:24.185663 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.185535 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-serving-cert\") pod \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\" (UID: 
\"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " Apr 17 18:16:24.185663 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.185562 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-oauth-config\") pod \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " Apr 17 18:16:24.185863 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.185679 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-service-ca\") pod \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\" (UID: \"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3\") " Apr 17 18:16:24.185973 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.185877 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-config" (OuterVolumeSpecName: "console-config") pod "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" (UID: "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:16:24.185973 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.185931 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" (UID: "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:16:24.186066 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.185984 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-config\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:16:24.186066 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.185987 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" (UID: "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:16:24.186457 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.186387 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-service-ca" (OuterVolumeSpecName: "service-ca") pod "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" (UID: "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:16:24.187867 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.187847 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" (UID: "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:16:24.188262 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.188243 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" (UID: "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:16:24.188338 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.188243 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-kube-api-access-2bcc4" (OuterVolumeSpecName: "kube-api-access-2bcc4") pod "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" (UID: "147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3"). InnerVolumeSpecName "kube-api-access-2bcc4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:16:24.287257 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.287210 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-oauth-serving-cert\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:16:24.287257 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.287253 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-trusted-ca-bundle\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:16:24.287257 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.287263 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-serving-cert\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:16:24.287507 
ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.287295 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-console-oauth-config\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:16:24.287507 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.287306 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-service-ca\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:16:24.287507 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.287315 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2bcc4\" (UniqueName: \"kubernetes.io/projected/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3-kube-api-access-2bcc4\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:16:24.779884 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.779856 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7679fdc464-rqlf6_147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3/console/0.log" Apr 17 18:16:24.780071 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.779899 2583 generic.go:358] "Generic (PLEG): container finished" podID="147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" containerID="80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1" exitCode=2 Apr 17 18:16:24.780071 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.779931 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7679fdc464-rqlf6" event={"ID":"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3","Type":"ContainerDied","Data":"80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1"} Apr 17 18:16:24.780071 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.779958 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7679fdc464-rqlf6" 
event={"ID":"147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3","Type":"ContainerDied","Data":"d3607d3ac3bc4037349fbc5e90acde9bfb1b0651ab58791b151a13b93d097d35"} Apr 17 18:16:24.780071 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.779975 2583 scope.go:117] "RemoveContainer" containerID="80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1" Apr 17 18:16:24.780071 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.779974 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7679fdc464-rqlf6" Apr 17 18:16:24.788950 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.788933 2583 scope.go:117] "RemoveContainer" containerID="80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1" Apr 17 18:16:24.789248 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:16:24.789224 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1\": container with ID starting with 80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1 not found: ID does not exist" containerID="80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1" Apr 17 18:16:24.789316 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.789257 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1"} err="failed to get container status \"80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1\": rpc error: code = NotFound desc = could not find container \"80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1\": container with ID starting with 80dfef8766f138c5661298b6c5144264eb1c29e2b979d5399be5c7f279b8f0f1 not found: ID does not exist" Apr 17 18:16:24.803735 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.803708 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-7679fdc464-rqlf6"] Apr 17 18:16:24.808111 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:24.808089 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7679fdc464-rqlf6"] Apr 17 18:16:25.707402 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:25.707370 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" path="/var/lib/kubelet/pods/147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3/volumes" Apr 17 18:16:36.586797 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.586761 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j"] Apr 17 18:16:36.587170 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.587071 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" containerName="console" Apr 17 18:16:36.587170 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.587082 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" containerName="console" Apr 17 18:16:36.587170 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.587143 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="147d1a4a-0a80-4f36-a2f6-bb146eaf3fe3" containerName="console" Apr 17 18:16:36.591468 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.591450 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:36.596137 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.596118 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 18:16:36.599076 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.599057 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 18:16:36.599181 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.599104 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-22fc9\"" Apr 17 18:16:36.607023 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.606986 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j"] Apr 17 18:16:36.687469 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.687428 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:36.687642 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.687479 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:36.687642 
ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.687569 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxkmm\" (UniqueName: \"kubernetes.io/projected/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-kube-api-access-cxkmm\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:36.788284 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.788244 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:36.788430 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.788342 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxkmm\" (UniqueName: \"kubernetes.io/projected/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-kube-api-access-cxkmm\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:36.788430 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.788397 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:36.788635 ip-10-0-133-142 
kubenswrapper[2583]: I0417 18:16:36.788616 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:36.788670 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.788656 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:36.798756 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.798720 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxkmm\" (UniqueName: \"kubernetes.io/projected/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-kube-api-access-cxkmm\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:36.901089 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:36.900988 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:37.030543 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:37.030511 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j"] Apr 17 18:16:37.034491 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:16:37.034461 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod159b014b_1d4a_4cf0_b6ac_3da3d908b55a.slice/crio-fcd0adbf0bacebe0d2d6022024fd1948c01eac63a4a90ca76d771b99191aedf3 WatchSource:0}: Error finding container fcd0adbf0bacebe0d2d6022024fd1948c01eac63a4a90ca76d771b99191aedf3: Status 404 returned error can't find the container with id fcd0adbf0bacebe0d2d6022024fd1948c01eac63a4a90ca76d771b99191aedf3 Apr 17 18:16:37.819808 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:37.819765 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" event={"ID":"159b014b-1d4a-4cf0-b6ac-3da3d908b55a","Type":"ContainerStarted","Data":"fcd0adbf0bacebe0d2d6022024fd1948c01eac63a4a90ca76d771b99191aedf3"} Apr 17 18:16:42.837877 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:42.837837 2583 generic.go:358] "Generic (PLEG): container finished" podID="159b014b-1d4a-4cf0-b6ac-3da3d908b55a" containerID="f28cb6771e294b738b9ea8ea787fa25f054d936ce5d9355c6d41368108072efe" exitCode=0 Apr 17 18:16:42.838340 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:42.837927 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" event={"ID":"159b014b-1d4a-4cf0-b6ac-3da3d908b55a","Type":"ContainerDied","Data":"f28cb6771e294b738b9ea8ea787fa25f054d936ce5d9355c6d41368108072efe"} Apr 17 18:16:45.849079 ip-10-0-133-142 kubenswrapper[2583]: 
I0417 18:16:45.849045 2583 generic.go:358] "Generic (PLEG): container finished" podID="159b014b-1d4a-4cf0-b6ac-3da3d908b55a" containerID="b1427c0bff748fbb220f58665323d953fa9cec13c218cdabb6c05dced4caae65" exitCode=0 Apr 17 18:16:45.849473 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:45.849130 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" event={"ID":"159b014b-1d4a-4cf0-b6ac-3da3d908b55a","Type":"ContainerDied","Data":"b1427c0bff748fbb220f58665323d953fa9cec13c218cdabb6c05dced4caae65"} Apr 17 18:16:53.879448 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:53.879409 2583 generic.go:358] "Generic (PLEG): container finished" podID="159b014b-1d4a-4cf0-b6ac-3da3d908b55a" containerID="30d63c8165e5c035933cfe02d3ddbd1bab41e28e926a0063b68d2cef1dc7557d" exitCode=0 Apr 17 18:16:53.879835 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:53.879455 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" event={"ID":"159b014b-1d4a-4cf0-b6ac-3da3d908b55a","Type":"ContainerDied","Data":"30d63c8165e5c035933cfe02d3ddbd1bab41e28e926a0063b68d2cef1dc7557d"} Apr 17 18:16:55.001970 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.001944 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:55.033542 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.033512 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-util\") pod \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " Apr 17 18:16:55.033694 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.033553 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxkmm\" (UniqueName: \"kubernetes.io/projected/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-kube-api-access-cxkmm\") pod \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " Apr 17 18:16:55.033694 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.033621 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-bundle\") pod \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\" (UID: \"159b014b-1d4a-4cf0-b6ac-3da3d908b55a\") " Apr 17 18:16:55.034358 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.034324 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-bundle" (OuterVolumeSpecName: "bundle") pod "159b014b-1d4a-4cf0-b6ac-3da3d908b55a" (UID: "159b014b-1d4a-4cf0-b6ac-3da3d908b55a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:16:55.036180 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.036154 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-kube-api-access-cxkmm" (OuterVolumeSpecName: "kube-api-access-cxkmm") pod "159b014b-1d4a-4cf0-b6ac-3da3d908b55a" (UID: "159b014b-1d4a-4cf0-b6ac-3da3d908b55a"). InnerVolumeSpecName "kube-api-access-cxkmm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:16:55.038150 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.038130 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-util" (OuterVolumeSpecName: "util") pod "159b014b-1d4a-4cf0-b6ac-3da3d908b55a" (UID: "159b014b-1d4a-4cf0-b6ac-3da3d908b55a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:16:55.134870 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.134767 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-util\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:16:55.134870 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.134814 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cxkmm\" (UniqueName: \"kubernetes.io/projected/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-kube-api-access-cxkmm\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:16:55.134870 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.134829 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/159b014b-1d4a-4cf0-b6ac-3da3d908b55a-bundle\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:16:55.886637 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.886600 2583 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" event={"ID":"159b014b-1d4a-4cf0-b6ac-3da3d908b55a","Type":"ContainerDied","Data":"fcd0adbf0bacebe0d2d6022024fd1948c01eac63a4a90ca76d771b99191aedf3"} Apr 17 18:16:55.886637 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.886638 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcd0adbf0bacebe0d2d6022024fd1948c01eac63a4a90ca76d771b99191aedf3" Apr 17 18:16:55.886876 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:55.886610 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5csq4j" Apr 17 18:16:59.497450 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.497419 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz"] Apr 17 18:16:59.497816 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.497708 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="159b014b-1d4a-4cf0-b6ac-3da3d908b55a" containerName="util" Apr 17 18:16:59.497816 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.497719 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="159b014b-1d4a-4cf0-b6ac-3da3d908b55a" containerName="util" Apr 17 18:16:59.497816 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.497740 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="159b014b-1d4a-4cf0-b6ac-3da3d908b55a" containerName="pull" Apr 17 18:16:59.497816 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.497746 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="159b014b-1d4a-4cf0-b6ac-3da3d908b55a" containerName="pull" Apr 17 18:16:59.497816 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.497751 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="159b014b-1d4a-4cf0-b6ac-3da3d908b55a" containerName="extract" Apr 17 18:16:59.497816 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.497757 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="159b014b-1d4a-4cf0-b6ac-3da3d908b55a" containerName="extract" Apr 17 18:16:59.497816 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.497802 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="159b014b-1d4a-4cf0-b6ac-3da3d908b55a" containerName="extract" Apr 17 18:16:59.540690 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.540647 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz"] Apr 17 18:16:59.540846 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.540772 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz" Apr 17 18:16:59.544836 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.544805 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 18:16:59.545141 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.545120 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-vkktv\"" Apr 17 18:16:59.545205 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.545167 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:16:59.571568 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.571520 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68npw\" (UniqueName: \"kubernetes.io/projected/da4ba271-6436-492b-8cb9-897ab59e8615-kube-api-access-68npw\") pod 
\"cert-manager-operator-controller-manager-7c5b8bd68-fqrpz\" (UID: \"da4ba271-6436-492b-8cb9-897ab59e8615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz" Apr 17 18:16:59.571568 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.571562 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/da4ba271-6436-492b-8cb9-897ab59e8615-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-fqrpz\" (UID: \"da4ba271-6436-492b-8cb9-897ab59e8615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz" Apr 17 18:16:59.672233 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.672195 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68npw\" (UniqueName: \"kubernetes.io/projected/da4ba271-6436-492b-8cb9-897ab59e8615-kube-api-access-68npw\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-fqrpz\" (UID: \"da4ba271-6436-492b-8cb9-897ab59e8615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz" Apr 17 18:16:59.672233 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.672237 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/da4ba271-6436-492b-8cb9-897ab59e8615-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-fqrpz\" (UID: \"da4ba271-6436-492b-8cb9-897ab59e8615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz" Apr 17 18:16:59.672651 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.672634 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/da4ba271-6436-492b-8cb9-897ab59e8615-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-fqrpz\" (UID: \"da4ba271-6436-492b-8cb9-897ab59e8615\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz" Apr 17 18:16:59.682265 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.682242 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68npw\" (UniqueName: \"kubernetes.io/projected/da4ba271-6436-492b-8cb9-897ab59e8615-kube-api-access-68npw\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-fqrpz\" (UID: \"da4ba271-6436-492b-8cb9-897ab59e8615\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz" Apr 17 18:16:59.850107 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.850068 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz" Apr 17 18:16:59.982996 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:16:59.982957 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz"] Apr 17 18:16:59.984867 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:16:59.984838 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda4ba271_6436_492b_8cb9_897ab59e8615.slice/crio-b24e753c67952471d6348dc98f85cb7c8b161d8b110ee075f0e692c43cdac483 WatchSource:0}: Error finding container b24e753c67952471d6348dc98f85cb7c8b161d8b110ee075f0e692c43cdac483: Status 404 returned error can't find the container with id b24e753c67952471d6348dc98f85cb7c8b161d8b110ee075f0e692c43cdac483 Apr 17 18:17:00.907461 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:00.907412 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz" event={"ID":"da4ba271-6436-492b-8cb9-897ab59e8615","Type":"ContainerStarted","Data":"b24e753c67952471d6348dc98f85cb7c8b161d8b110ee075f0e692c43cdac483"} Apr 17 18:17:02.916346 
ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:02.916305 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz" event={"ID":"da4ba271-6436-492b-8cb9-897ab59e8615","Type":"ContainerStarted","Data":"aa3c6eda0542fc409bc3631604bb6cfd077fcabfd313bcd23ce800041c350c3f"} Apr 17 18:17:03.956346 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:03.956262 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-fqrpz" podStartSLOduration=2.145727852 podStartE2EDuration="4.956241864s" podCreationTimestamp="2026-04-17 18:16:59 +0000 UTC" firstStartedPulling="2026-04-17 18:16:59.987540445 +0000 UTC m=+412.862933133" lastFinishedPulling="2026-04-17 18:17:02.798054445 +0000 UTC m=+415.673447145" observedRunningTime="2026-04-17 18:17:03.955010156 +0000 UTC m=+416.830402859" watchObservedRunningTime="2026-04-17 18:17:03.956241864 +0000 UTC m=+416.831634575" Apr 17 18:17:09.427066 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.427033 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-98sjx"] Apr 17 18:17:09.430076 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.430060 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-98sjx" Apr 17 18:17:09.434371 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.434354 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 18:17:09.435337 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.435317 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-vc4zt\"" Apr 17 18:17:09.435454 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.435335 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 18:17:09.448620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.448591 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-98sjx"] Apr 17 18:17:09.551834 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.551795 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx6wt\" (UniqueName: \"kubernetes.io/projected/83bcdef2-473d-491f-abe2-f8685c5770a0-kube-api-access-tx6wt\") pod \"cert-manager-cainjector-8966b78d4-98sjx\" (UID: \"83bcdef2-473d-491f-abe2-f8685c5770a0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-98sjx" Apr 17 18:17:09.552002 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.551852 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83bcdef2-473d-491f-abe2-f8685c5770a0-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-98sjx\" (UID: \"83bcdef2-473d-491f-abe2-f8685c5770a0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-98sjx" Apr 17 18:17:09.652960 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.652923 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83bcdef2-473d-491f-abe2-f8685c5770a0-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-98sjx\" (UID: \"83bcdef2-473d-491f-abe2-f8685c5770a0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-98sjx" Apr 17 18:17:09.653100 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.653014 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx6wt\" (UniqueName: \"kubernetes.io/projected/83bcdef2-473d-491f-abe2-f8685c5770a0-kube-api-access-tx6wt\") pod \"cert-manager-cainjector-8966b78d4-98sjx\" (UID: \"83bcdef2-473d-491f-abe2-f8685c5770a0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-98sjx" Apr 17 18:17:09.663009 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.662971 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/83bcdef2-473d-491f-abe2-f8685c5770a0-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-98sjx\" (UID: \"83bcdef2-473d-491f-abe2-f8685c5770a0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-98sjx" Apr 17 18:17:09.663114 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.663024 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx6wt\" (UniqueName: \"kubernetes.io/projected/83bcdef2-473d-491f-abe2-f8685c5770a0-kube-api-access-tx6wt\") pod \"cert-manager-cainjector-8966b78d4-98sjx\" (UID: \"83bcdef2-473d-491f-abe2-f8685c5770a0\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-98sjx" Apr 17 18:17:09.751039 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.750952 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-98sjx" Apr 17 18:17:09.875409 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.875381 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-98sjx"] Apr 17 18:17:09.877591 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:17:09.877563 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83bcdef2_473d_491f_abe2_f8685c5770a0.slice/crio-50d035f6ac5373bf482c9bec8b49f77d2cb87b1883c45da1625bebedfdb8ade3 WatchSource:0}: Error finding container 50d035f6ac5373bf482c9bec8b49f77d2cb87b1883c45da1625bebedfdb8ade3: Status 404 returned error can't find the container with id 50d035f6ac5373bf482c9bec8b49f77d2cb87b1883c45da1625bebedfdb8ade3 Apr 17 18:17:09.940578 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:09.940536 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-98sjx" event={"ID":"83bcdef2-473d-491f-abe2-f8685c5770a0","Type":"ContainerStarted","Data":"50d035f6ac5373bf482c9bec8b49f77d2cb87b1883c45da1625bebedfdb8ade3"} Apr 17 18:17:12.951592 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:12.951548 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-98sjx" event={"ID":"83bcdef2-473d-491f-abe2-f8685c5770a0","Type":"ContainerStarted","Data":"44229ae89b66add6b24af4fb9ba4054b5d87c981f71b9ad5cff0b94a74d7c0b8"} Apr 17 18:17:12.968639 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:12.968581 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-98sjx" podStartSLOduration=1.071512137 podStartE2EDuration="3.968566771s" podCreationTimestamp="2026-04-17 18:17:09 +0000 UTC" firstStartedPulling="2026-04-17 18:17:09.87945319 +0000 UTC m=+422.754845879" lastFinishedPulling="2026-04-17 18:17:12.776507821 
+0000 UTC m=+425.651900513" observedRunningTime="2026-04-17 18:17:12.967667197 +0000 UTC m=+425.843059907" watchObservedRunningTime="2026-04-17 18:17:12.968566771 +0000 UTC m=+425.843959480" Apr 17 18:17:24.539836 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.539750 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-7m8rc"] Apr 17 18:17:24.541983 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.541964 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-7m8rc" Apr 17 18:17:24.544439 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.544417 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-zt6vl\"" Apr 17 18:17:24.552104 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.552082 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-7m8rc"] Apr 17 18:17:24.676892 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.676833 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtsd7\" (UniqueName: \"kubernetes.io/projected/d8338079-a6b6-4254-b6a9-c4c4acb04066-kube-api-access-rtsd7\") pod \"cert-manager-759f64656b-7m8rc\" (UID: \"d8338079-a6b6-4254-b6a9-c4c4acb04066\") " pod="cert-manager/cert-manager-759f64656b-7m8rc" Apr 17 18:17:24.677079 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.676971 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8338079-a6b6-4254-b6a9-c4c4acb04066-bound-sa-token\") pod \"cert-manager-759f64656b-7m8rc\" (UID: \"d8338079-a6b6-4254-b6a9-c4c4acb04066\") " pod="cert-manager/cert-manager-759f64656b-7m8rc" Apr 17 18:17:24.777914 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.777871 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8338079-a6b6-4254-b6a9-c4c4acb04066-bound-sa-token\") pod \"cert-manager-759f64656b-7m8rc\" (UID: \"d8338079-a6b6-4254-b6a9-c4c4acb04066\") " pod="cert-manager/cert-manager-759f64656b-7m8rc" Apr 17 18:17:24.778093 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.777941 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtsd7\" (UniqueName: \"kubernetes.io/projected/d8338079-a6b6-4254-b6a9-c4c4acb04066-kube-api-access-rtsd7\") pod \"cert-manager-759f64656b-7m8rc\" (UID: \"d8338079-a6b6-4254-b6a9-c4c4acb04066\") " pod="cert-manager/cert-manager-759f64656b-7m8rc" Apr 17 18:17:24.786867 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.786839 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8338079-a6b6-4254-b6a9-c4c4acb04066-bound-sa-token\") pod \"cert-manager-759f64656b-7m8rc\" (UID: \"d8338079-a6b6-4254-b6a9-c4c4acb04066\") " pod="cert-manager/cert-manager-759f64656b-7m8rc" Apr 17 18:17:24.787016 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.786888 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtsd7\" (UniqueName: \"kubernetes.io/projected/d8338079-a6b6-4254-b6a9-c4c4acb04066-kube-api-access-rtsd7\") pod \"cert-manager-759f64656b-7m8rc\" (UID: \"d8338079-a6b6-4254-b6a9-c4c4acb04066\") " pod="cert-manager/cert-manager-759f64656b-7m8rc" Apr 17 18:17:24.851371 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.851335 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-7m8rc" Apr 17 18:17:24.984767 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.984713 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-7m8rc"] Apr 17 18:17:24.986918 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:17:24.986879 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8338079_a6b6_4254_b6a9_c4c4acb04066.slice/crio-723edbec08c06a0b391a25acac7118c1670e29b63aeb10f3fb53964f4450dd17 WatchSource:0}: Error finding container 723edbec08c06a0b391a25acac7118c1670e29b63aeb10f3fb53964f4450dd17: Status 404 returned error can't find the container with id 723edbec08c06a0b391a25acac7118c1670e29b63aeb10f3fb53964f4450dd17 Apr 17 18:17:24.995049 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:24.994955 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-7m8rc" event={"ID":"d8338079-a6b6-4254-b6a9-c4c4acb04066","Type":"ContainerStarted","Data":"723edbec08c06a0b391a25acac7118c1670e29b63aeb10f3fb53964f4450dd17"} Apr 17 18:17:25.999708 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:25.999669 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-7m8rc" event={"ID":"d8338079-a6b6-4254-b6a9-c4c4acb04066","Type":"ContainerStarted","Data":"14aad6f98fd6359cc14ef7f6943e13ef689a71326dbd93253b1efb5fc3e93049"} Apr 17 18:17:26.016714 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.016648 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-7m8rc" podStartSLOduration=2.016632727 podStartE2EDuration="2.016632727s" podCreationTimestamp="2026-04-17 18:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:17:26.016332817 +0000 UTC 
m=+438.891725529" watchObservedRunningTime="2026-04-17 18:17:26.016632727 +0000 UTC m=+438.892025437" Apr 17 18:17:26.394551 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.394517 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk"] Apr 17 18:17:26.396879 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.396864 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:26.399723 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.399694 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 18:17:26.399852 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.399696 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 18:17:26.399852 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.399797 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-22fc9\"" Apr 17 18:17:26.407494 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.407464 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk"] Apr 17 18:17:26.492736 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.492694 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:26.492904 ip-10-0-133-142 
kubenswrapper[2583]: I0417 18:17:26.492780 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzvnx\" (UniqueName: \"kubernetes.io/projected/5abeac77-be29-4876-a9ad-bd8910c6a970-kube-api-access-pzvnx\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:26.492904 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.492813 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:26.594320 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.594241 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:26.594571 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.594350 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzvnx\" (UniqueName: \"kubernetes.io/projected/5abeac77-be29-4876-a9ad-bd8910c6a970-kube-api-access-pzvnx\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:26.594571 ip-10-0-133-142 
kubenswrapper[2583]: I0417 18:17:26.594379 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:26.594694 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.594630 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:26.594694 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.594666 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:26.603112 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.603085 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzvnx\" (UniqueName: \"kubernetes.io/projected/5abeac77-be29-4876-a9ad-bd8910c6a970-kube-api-access-pzvnx\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:26.707568 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.707471 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:26.828906 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:26.828789 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk"] Apr 17 18:17:26.831612 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:17:26.831580 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5abeac77_be29_4876_a9ad_bd8910c6a970.slice/crio-8f5ac8652f990e8b9c0d32187d0d9b1a7ab517c7ff6048833e19bd97e61e8abd WatchSource:0}: Error finding container 8f5ac8652f990e8b9c0d32187d0d9b1a7ab517c7ff6048833e19bd97e61e8abd: Status 404 returned error can't find the container with id 8f5ac8652f990e8b9c0d32187d0d9b1a7ab517c7ff6048833e19bd97e61e8abd Apr 17 18:17:27.004178 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:27.004086 2583 generic.go:358] "Generic (PLEG): container finished" podID="5abeac77-be29-4876-a9ad-bd8910c6a970" containerID="ad58185329631510fdc81c9ec3ca8f640ec2be12e921e611424a08552abba147" exitCode=0 Apr 17 18:17:27.004550 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:27.004184 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" event={"ID":"5abeac77-be29-4876-a9ad-bd8910c6a970","Type":"ContainerDied","Data":"ad58185329631510fdc81c9ec3ca8f640ec2be12e921e611424a08552abba147"} Apr 17 18:17:27.004550 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:27.004217 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" event={"ID":"5abeac77-be29-4876-a9ad-bd8910c6a970","Type":"ContainerStarted","Data":"8f5ac8652f990e8b9c0d32187d0d9b1a7ab517c7ff6048833e19bd97e61e8abd"} Apr 17 18:17:30.016957 ip-10-0-133-142 kubenswrapper[2583]: 
I0417 18:17:30.016923 2583 generic.go:358] "Generic (PLEG): container finished" podID="5abeac77-be29-4876-a9ad-bd8910c6a970" containerID="b129ccdd7e4ad805775c8a27f4e6ded61c8506550a6ad8067298715e9dfde164" exitCode=0 Apr 17 18:17:30.017337 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:30.016972 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" event={"ID":"5abeac77-be29-4876-a9ad-bd8910c6a970","Type":"ContainerDied","Data":"b129ccdd7e4ad805775c8a27f4e6ded61c8506550a6ad8067298715e9dfde164"} Apr 17 18:17:31.022864 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:31.022829 2583 generic.go:358] "Generic (PLEG): container finished" podID="5abeac77-be29-4876-a9ad-bd8910c6a970" containerID="63b5b0b4225bacaf44948c61cea4dafa1d04f0d7ef4e84bf25d960f5634c40bb" exitCode=0 Apr 17 18:17:31.023232 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:31.022869 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" event={"ID":"5abeac77-be29-4876-a9ad-bd8910c6a970","Type":"ContainerDied","Data":"63b5b0b4225bacaf44948c61cea4dafa1d04f0d7ef4e84bf25d960f5634c40bb"} Apr 17 18:17:32.146856 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:32.146831 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:17:32.242868 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:32.242822 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzvnx\" (UniqueName: \"kubernetes.io/projected/5abeac77-be29-4876-a9ad-bd8910c6a970-kube-api-access-pzvnx\") pod \"5abeac77-be29-4876-a9ad-bd8910c6a970\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " Apr 17 18:17:32.243036 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:32.242882 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-bundle\") pod \"5abeac77-be29-4876-a9ad-bd8910c6a970\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " Apr 17 18:17:32.243036 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:32.242920 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-util\") pod \"5abeac77-be29-4876-a9ad-bd8910c6a970\" (UID: \"5abeac77-be29-4876-a9ad-bd8910c6a970\") " Apr 17 18:17:32.243364 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:32.243329 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-bundle" (OuterVolumeSpecName: "bundle") pod "5abeac77-be29-4876-a9ad-bd8910c6a970" (UID: "5abeac77-be29-4876-a9ad-bd8910c6a970"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:17:32.245245 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:32.245220 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5abeac77-be29-4876-a9ad-bd8910c6a970-kube-api-access-pzvnx" (OuterVolumeSpecName: "kube-api-access-pzvnx") pod "5abeac77-be29-4876-a9ad-bd8910c6a970" (UID: "5abeac77-be29-4876-a9ad-bd8910c6a970"). InnerVolumeSpecName "kube-api-access-pzvnx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:17:32.248325 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:32.248291 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-util" (OuterVolumeSpecName: "util") pod "5abeac77-be29-4876-a9ad-bd8910c6a970" (UID: "5abeac77-be29-4876-a9ad-bd8910c6a970"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:17:32.343779 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:32.343742 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pzvnx\" (UniqueName: \"kubernetes.io/projected/5abeac77-be29-4876-a9ad-bd8910c6a970-kube-api-access-pzvnx\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:17:32.343779 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:32.343775 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-bundle\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:17:32.343779 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:32.343785 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5abeac77-be29-4876-a9ad-bd8910c6a970-util\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:17:33.030765 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:33.030730 2583 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" event={"ID":"5abeac77-be29-4876-a9ad-bd8910c6a970","Type":"ContainerDied","Data":"8f5ac8652f990e8b9c0d32187d0d9b1a7ab517c7ff6048833e19bd97e61e8abd"} Apr 17 18:17:33.030765 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:33.030761 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f5ac8652f990e8b9c0d32187d0d9b1a7ab517c7ff6048833e19bd97e61e8abd" Apr 17 18:17:33.030965 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:17:33.030782 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78es22jk" Apr 17 18:19:48.522090 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.522055 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56cddb7b49-l65rl"] Apr 17 18:19:48.522572 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.522376 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5abeac77-be29-4876-a9ad-bd8910c6a970" containerName="util" Apr 17 18:19:48.522572 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.522388 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="5abeac77-be29-4876-a9ad-bd8910c6a970" containerName="util" Apr 17 18:19:48.522572 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.522400 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5abeac77-be29-4876-a9ad-bd8910c6a970" containerName="pull" Apr 17 18:19:48.522572 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.522406 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="5abeac77-be29-4876-a9ad-bd8910c6a970" containerName="pull" Apr 17 18:19:48.522572 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.522414 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5abeac77-be29-4876-a9ad-bd8910c6a970" containerName="extract" Apr 17 18:19:48.522572 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.522419 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="5abeac77-be29-4876-a9ad-bd8910c6a970" containerName="extract" Apr 17 18:19:48.522572 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.522474 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="5abeac77-be29-4876-a9ad-bd8910c6a970" containerName="extract" Apr 17 18:19:48.524439 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.524418 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.537398 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.537375 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56cddb7b49-l65rl"] Apr 17 18:19:48.560389 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.560360 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6de7a080-12c7-4719-9af3-1fe7bcfaa008-console-serving-cert\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.560520 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.560395 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz7lx\" (UniqueName: \"kubernetes.io/projected/6de7a080-12c7-4719-9af3-1fe7bcfaa008-kube-api-access-bz7lx\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.560520 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.560414 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-console-config\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.560520 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.560475 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-oauth-serving-cert\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.560620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.560522 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-service-ca\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.560620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.560556 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6de7a080-12c7-4719-9af3-1fe7bcfaa008-console-oauth-config\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.560620 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.560604 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-trusted-ca-bundle\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.661504 ip-10-0-133-142 kubenswrapper[2583]: 
I0417 18:19:48.661468 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-trusted-ca-bundle\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.661685 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.661534 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6de7a080-12c7-4719-9af3-1fe7bcfaa008-console-serving-cert\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.661685 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.661552 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bz7lx\" (UniqueName: \"kubernetes.io/projected/6de7a080-12c7-4719-9af3-1fe7bcfaa008-kube-api-access-bz7lx\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.661685 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.661570 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-console-config\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.661685 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.661592 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-oauth-serving-cert\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " 
pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.661685 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.661615 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-service-ca\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.661685 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.661643 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6de7a080-12c7-4719-9af3-1fe7bcfaa008-console-oauth-config\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.662491 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.662402 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-console-config\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.662491 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.662464 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-trusted-ca-bundle\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.662701 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.662529 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-service-ca\") pod \"console-56cddb7b49-l65rl\" (UID: 
\"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.662701 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.662399 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6de7a080-12c7-4719-9af3-1fe7bcfaa008-oauth-serving-cert\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.664241 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.664220 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6de7a080-12c7-4719-9af3-1fe7bcfaa008-console-oauth-config\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.664377 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.664360 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6de7a080-12c7-4719-9af3-1fe7bcfaa008-console-serving-cert\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.669524 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.669494 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz7lx\" (UniqueName: \"kubernetes.io/projected/6de7a080-12c7-4719-9af3-1fe7bcfaa008-kube-api-access-bz7lx\") pod \"console-56cddb7b49-l65rl\" (UID: \"6de7a080-12c7-4719-9af3-1fe7bcfaa008\") " pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.833724 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.833689 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:48.954358 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:48.954319 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56cddb7b49-l65rl"] Apr 17 18:19:48.956564 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:19:48.956534 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6de7a080_12c7_4719_9af3_1fe7bcfaa008.slice/crio-112f308ff7bbb71aa3b3f07db1e839a6a67dc336a6b90edd62e655a4d5e013ad WatchSource:0}: Error finding container 112f308ff7bbb71aa3b3f07db1e839a6a67dc336a6b90edd62e655a4d5e013ad: Status 404 returned error can't find the container with id 112f308ff7bbb71aa3b3f07db1e839a6a67dc336a6b90edd62e655a4d5e013ad Apr 17 18:19:49.489212 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:49.489174 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cddb7b49-l65rl" event={"ID":"6de7a080-12c7-4719-9af3-1fe7bcfaa008","Type":"ContainerStarted","Data":"46c21e0da189407d0c800dd22357756f0b5441c48f986692b25de6a1c0d00ee9"} Apr 17 18:19:49.489212 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:49.489215 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cddb7b49-l65rl" event={"ID":"6de7a080-12c7-4719-9af3-1fe7bcfaa008","Type":"ContainerStarted","Data":"112f308ff7bbb71aa3b3f07db1e839a6a67dc336a6b90edd62e655a4d5e013ad"} Apr 17 18:19:49.508238 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:49.508187 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56cddb7b49-l65rl" podStartSLOduration=1.508174184 podStartE2EDuration="1.508174184s" podCreationTimestamp="2026-04-17 18:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:19:49.507001399 +0000 UTC 
m=+582.382394123" watchObservedRunningTime="2026-04-17 18:19:49.508174184 +0000 UTC m=+582.383566893" Apr 17 18:19:58.833915 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:58.833881 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:58.834341 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:58.833924 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:58.838782 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:58.838761 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:59.526243 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:59.526207 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56cddb7b49-l65rl" Apr 17 18:19:59.589106 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:19:59.589071 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-756d58777d-vdlcn"] Apr 17 18:20:04.950565 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:04.950526 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z7h5c/must-gather-7djdz"] Apr 17 18:20:04.952821 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:04.952800 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z7h5c/must-gather-7djdz" Apr 17 18:20:04.955590 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:04.955562 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-z7h5c\"/\"openshift-service-ca.crt\"" Apr 17 18:20:04.956784 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:04.956760 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-z7h5c\"/\"default-dockercfg-g9cjj\"" Apr 17 18:20:04.956784 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:04.956777 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-z7h5c\"/\"kube-root-ca.crt\"" Apr 17 18:20:04.962480 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:04.962458 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z7h5c/must-gather-7djdz"] Apr 17 18:20:04.997402 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:04.997372 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s24t6\" (UniqueName: \"kubernetes.io/projected/7ee168bc-677b-4255-bdea-b984a2fdb145-kube-api-access-s24t6\") pod \"must-gather-7djdz\" (UID: \"7ee168bc-677b-4255-bdea-b984a2fdb145\") " pod="openshift-must-gather-z7h5c/must-gather-7djdz" Apr 17 18:20:04.997516 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:04.997410 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7ee168bc-677b-4255-bdea-b984a2fdb145-must-gather-output\") pod \"must-gather-7djdz\" (UID: \"7ee168bc-677b-4255-bdea-b984a2fdb145\") " pod="openshift-must-gather-z7h5c/must-gather-7djdz" Apr 17 18:20:05.098087 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:05.098049 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s24t6\" (UniqueName: 
\"kubernetes.io/projected/7ee168bc-677b-4255-bdea-b984a2fdb145-kube-api-access-s24t6\") pod \"must-gather-7djdz\" (UID: \"7ee168bc-677b-4255-bdea-b984a2fdb145\") " pod="openshift-must-gather-z7h5c/must-gather-7djdz" Apr 17 18:20:05.098087 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:05.098091 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7ee168bc-677b-4255-bdea-b984a2fdb145-must-gather-output\") pod \"must-gather-7djdz\" (UID: \"7ee168bc-677b-4255-bdea-b984a2fdb145\") " pod="openshift-must-gather-z7h5c/must-gather-7djdz" Apr 17 18:20:05.098426 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:05.098409 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7ee168bc-677b-4255-bdea-b984a2fdb145-must-gather-output\") pod \"must-gather-7djdz\" (UID: \"7ee168bc-677b-4255-bdea-b984a2fdb145\") " pod="openshift-must-gather-z7h5c/must-gather-7djdz" Apr 17 18:20:05.106931 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:05.106909 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s24t6\" (UniqueName: \"kubernetes.io/projected/7ee168bc-677b-4255-bdea-b984a2fdb145-kube-api-access-s24t6\") pod \"must-gather-7djdz\" (UID: \"7ee168bc-677b-4255-bdea-b984a2fdb145\") " pod="openshift-must-gather-z7h5c/must-gather-7djdz" Apr 17 18:20:05.262345 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:05.262242 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z7h5c/must-gather-7djdz" Apr 17 18:20:05.391725 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:05.391687 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z7h5c/must-gather-7djdz"] Apr 17 18:20:05.394038 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:20:05.394011 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ee168bc_677b_4255_bdea_b984a2fdb145.slice/crio-de6afd0e1b8ecb5da1c32d4a9d808ff45832ce7dbe62d78f32d58a5332c501a8 WatchSource:0}: Error finding container de6afd0e1b8ecb5da1c32d4a9d808ff45832ce7dbe62d78f32d58a5332c501a8: Status 404 returned error can't find the container with id de6afd0e1b8ecb5da1c32d4a9d808ff45832ce7dbe62d78f32d58a5332c501a8 Apr 17 18:20:05.542940 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:05.542855 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z7h5c/must-gather-7djdz" event={"ID":"7ee168bc-677b-4255-bdea-b984a2fdb145","Type":"ContainerStarted","Data":"de6afd0e1b8ecb5da1c32d4a9d808ff45832ce7dbe62d78f32d58a5332c501a8"} Apr 17 18:20:10.563481 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:10.563439 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z7h5c/must-gather-7djdz" event={"ID":"7ee168bc-677b-4255-bdea-b984a2fdb145","Type":"ContainerStarted","Data":"43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777"} Apr 17 18:20:10.563481 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:10.563485 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z7h5c/must-gather-7djdz" event={"ID":"7ee168bc-677b-4255-bdea-b984a2fdb145","Type":"ContainerStarted","Data":"3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c"} Apr 17 18:20:10.581407 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:10.581357 2583 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-z7h5c/must-gather-7djdz" podStartSLOduration=2.017282421 podStartE2EDuration="6.581336314s" podCreationTimestamp="2026-04-17 18:20:04 +0000 UTC" firstStartedPulling="2026-04-17 18:20:05.396100814 +0000 UTC m=+598.271493502" lastFinishedPulling="2026-04-17 18:20:09.960154705 +0000 UTC m=+602.835547395" observedRunningTime="2026-04-17 18:20:10.57960668 +0000 UTC m=+603.454999391" watchObservedRunningTime="2026-04-17 18:20:10.581336314 +0000 UTC m=+603.456729066" Apr 17 18:20:24.610498 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.610393 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-756d58777d-vdlcn" podUID="ef4c3052-8bf0-40c7-b362-fea4c3063c25" containerName="console" containerID="cri-o://db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a" gracePeriod=15 Apr 17 18:20:24.870013 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.869944 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-756d58777d-vdlcn_ef4c3052-8bf0-40c7-b362-fea4c3063c25/console/0.log" Apr 17 18:20:24.870013 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.870004 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:20:24.982786 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.982751 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-config\") pod \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " Apr 17 18:20:24.982966 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.982806 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-oauth-config\") pod \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " Apr 17 18:20:24.982966 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.982829 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-serving-cert\") pod \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " Apr 17 18:20:24.982966 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.982849 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-trusted-ca-bundle\") pod \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " Apr 17 18:20:24.982966 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.982882 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-service-ca\") pod \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " Apr 17 18:20:24.982966 ip-10-0-133-142 
kubenswrapper[2583]: I0417 18:20:24.982921 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dtjm\" (UniqueName: \"kubernetes.io/projected/ef4c3052-8bf0-40c7-b362-fea4c3063c25-kube-api-access-6dtjm\") pod \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " Apr 17 18:20:24.982966 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.982967 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-oauth-serving-cert\") pod \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\" (UID: \"ef4c3052-8bf0-40c7-b362-fea4c3063c25\") " Apr 17 18:20:24.983313 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.983067 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-config" (OuterVolumeSpecName: "console-config") pod "ef4c3052-8bf0-40c7-b362-fea4c3063c25" (UID: "ef4c3052-8bf0-40c7-b362-fea4c3063c25"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:20:24.983313 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.983212 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ef4c3052-8bf0-40c7-b362-fea4c3063c25" (UID: "ef4c3052-8bf0-40c7-b362-fea4c3063c25"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:20:24.983313 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.983247 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-config\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:20:24.983313 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.983256 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-service-ca" (OuterVolumeSpecName: "service-ca") pod "ef4c3052-8bf0-40c7-b362-fea4c3063c25" (UID: "ef4c3052-8bf0-40c7-b362-fea4c3063c25"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:20:24.983512 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.983483 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ef4c3052-8bf0-40c7-b362-fea4c3063c25" (UID: "ef4c3052-8bf0-40c7-b362-fea4c3063c25"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:20:24.985221 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.985175 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ef4c3052-8bf0-40c7-b362-fea4c3063c25" (UID: "ef4c3052-8bf0-40c7-b362-fea4c3063c25"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:20:24.985376 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.985294 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef4c3052-8bf0-40c7-b362-fea4c3063c25-kube-api-access-6dtjm" (OuterVolumeSpecName: "kube-api-access-6dtjm") pod "ef4c3052-8bf0-40c7-b362-fea4c3063c25" (UID: "ef4c3052-8bf0-40c7-b362-fea4c3063c25"). InnerVolumeSpecName "kube-api-access-6dtjm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:20:24.985376 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:24.985307 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ef4c3052-8bf0-40c7-b362-fea4c3063c25" (UID: "ef4c3052-8bf0-40c7-b362-fea4c3063c25"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:20:25.083756 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.083721 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-oauth-config\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:20:25.083756 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.083755 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef4c3052-8bf0-40c7-b362-fea4c3063c25-console-serving-cert\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:20:25.083982 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.083772 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-trusted-ca-bundle\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:20:25.083982 
ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.083782 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-service-ca\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:20:25.083982 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.083792 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dtjm\" (UniqueName: \"kubernetes.io/projected/ef4c3052-8bf0-40c7-b362-fea4c3063c25-kube-api-access-6dtjm\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:20:25.083982 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.083800 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ef4c3052-8bf0-40c7-b362-fea4c3063c25-oauth-serving-cert\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:20:25.619911 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.619878 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-756d58777d-vdlcn_ef4c3052-8bf0-40c7-b362-fea4c3063c25/console/0.log" Apr 17 18:20:25.620328 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.619932 2583 generic.go:358] "Generic (PLEG): container finished" podID="ef4c3052-8bf0-40c7-b362-fea4c3063c25" containerID="db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a" exitCode=2 Apr 17 18:20:25.620328 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.619998 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756d58777d-vdlcn" Apr 17 18:20:25.620328 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.620023 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756d58777d-vdlcn" event={"ID":"ef4c3052-8bf0-40c7-b362-fea4c3063c25","Type":"ContainerDied","Data":"db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a"} Apr 17 18:20:25.620328 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.620061 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756d58777d-vdlcn" event={"ID":"ef4c3052-8bf0-40c7-b362-fea4c3063c25","Type":"ContainerDied","Data":"83e45d9cfa26c9075c706faa05841eb349d90af51bee8fdee546cd0578589ca5"} Apr 17 18:20:25.620328 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.620076 2583 scope.go:117] "RemoveContainer" containerID="db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a" Apr 17 18:20:25.633671 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.633645 2583 scope.go:117] "RemoveContainer" containerID="db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a" Apr 17 18:20:25.633965 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:20:25.633940 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a\": container with ID starting with db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a not found: ID does not exist" containerID="db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a" Apr 17 18:20:25.634028 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.633978 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a"} err="failed to get container status \"db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a\": rpc error: code = 
NotFound desc = could not find container \"db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a\": container with ID starting with db56321be1b845c3ebbf18697ca78679bf1d3a4e6e1f2e990587e6327d3c254a not found: ID does not exist" Apr 17 18:20:25.646823 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.646798 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-756d58777d-vdlcn"] Apr 17 18:20:25.653972 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.653943 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-756d58777d-vdlcn"] Apr 17 18:20:25.707644 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:25.707608 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef4c3052-8bf0-40c7-b362-fea4c3063c25" path="/var/lib/kubelet/pods/ef4c3052-8bf0-40c7-b362-fea4c3063c25/volumes" Apr 17 18:20:56.735863 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:56.735827 2583 generic.go:358] "Generic (PLEG): container finished" podID="7ee168bc-677b-4255-bdea-b984a2fdb145" containerID="3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c" exitCode=0 Apr 17 18:20:56.736299 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:56.735903 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z7h5c/must-gather-7djdz" event={"ID":"7ee168bc-677b-4255-bdea-b984a2fdb145","Type":"ContainerDied","Data":"3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c"} Apr 17 18:20:56.736299 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:56.736231 2583 scope.go:117] "RemoveContainer" containerID="3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c" Apr 17 18:20:57.092061 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:20:57.092031 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z7h5c_must-gather-7djdz_7ee168bc-677b-4255-bdea-b984a2fdb145/gather/0.log" Apr 17 18:21:00.248622 ip-10-0-133-142 kubenswrapper[2583]: 
I0417 18:21:00.248587 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pqbjx_15a8ea75-9ba5-4f38-9236-0c6de8eae8e9/global-pull-secret-syncer/0.log" Apr 17 18:21:00.350221 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:00.350178 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-st4n8_ce6edf92-e02a-474f-864a-5bd1153d24d6/konnectivity-agent/0.log" Apr 17 18:21:00.404695 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:00.404663 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-142.ec2.internal_652ed875b090027ea0cb6468dfc50153/haproxy/0.log" Apr 17 18:21:02.423052 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.423015 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z7h5c/must-gather-7djdz"] Apr 17 18:21:02.423449 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.423234 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-z7h5c/must-gather-7djdz" podUID="7ee168bc-677b-4255-bdea-b984a2fdb145" containerName="copy" containerID="cri-o://43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777" gracePeriod=2 Apr 17 18:21:02.429147 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.429120 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z7h5c/must-gather-7djdz"] Apr 17 18:21:02.655241 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.655218 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z7h5c_must-gather-7djdz_7ee168bc-677b-4255-bdea-b984a2fdb145/copy/0.log" Apr 17 18:21:02.655640 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.655623 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z7h5c/must-gather-7djdz" Apr 17 18:21:02.657950 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.657923 2583 status_manager.go:895] "Failed to get status for pod" podUID="7ee168bc-677b-4255-bdea-b984a2fdb145" pod="openshift-must-gather-z7h5c/must-gather-7djdz" err="pods \"must-gather-7djdz\" is forbidden: User \"system:node:ip-10-0-133-142.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-z7h5c\": no relationship found between node 'ip-10-0-133-142.ec2.internal' and this object" Apr 17 18:21:02.704159 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.704055 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s24t6\" (UniqueName: \"kubernetes.io/projected/7ee168bc-677b-4255-bdea-b984a2fdb145-kube-api-access-s24t6\") pod \"7ee168bc-677b-4255-bdea-b984a2fdb145\" (UID: \"7ee168bc-677b-4255-bdea-b984a2fdb145\") " Apr 17 18:21:02.704159 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.704140 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7ee168bc-677b-4255-bdea-b984a2fdb145-must-gather-output\") pod \"7ee168bc-677b-4255-bdea-b984a2fdb145\" (UID: \"7ee168bc-677b-4255-bdea-b984a2fdb145\") " Apr 17 18:21:02.705573 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.705539 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ee168bc-677b-4255-bdea-b984a2fdb145-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7ee168bc-677b-4255-bdea-b984a2fdb145" (UID: "7ee168bc-677b-4255-bdea-b984a2fdb145"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:21:02.706434 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.706414 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee168bc-677b-4255-bdea-b984a2fdb145-kube-api-access-s24t6" (OuterVolumeSpecName: "kube-api-access-s24t6") pod "7ee168bc-677b-4255-bdea-b984a2fdb145" (UID: "7ee168bc-677b-4255-bdea-b984a2fdb145"). InnerVolumeSpecName "kube-api-access-s24t6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:21:02.756544 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.756514 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z7h5c_must-gather-7djdz_7ee168bc-677b-4255-bdea-b984a2fdb145/copy/0.log" Apr 17 18:21:02.756832 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.756809 2583 generic.go:358] "Generic (PLEG): container finished" podID="7ee168bc-677b-4255-bdea-b984a2fdb145" containerID="43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777" exitCode=143 Apr 17 18:21:02.756879 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.756866 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z7h5c/must-gather-7djdz" Apr 17 18:21:02.756913 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.756905 2583 scope.go:117] "RemoveContainer" containerID="43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777" Apr 17 18:21:02.759240 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.759212 2583 status_manager.go:895] "Failed to get status for pod" podUID="7ee168bc-677b-4255-bdea-b984a2fdb145" pod="openshift-must-gather-z7h5c/must-gather-7djdz" err="pods \"must-gather-7djdz\" is forbidden: User \"system:node:ip-10-0-133-142.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-z7h5c\": no relationship found between node 'ip-10-0-133-142.ec2.internal' and this object" Apr 17 18:21:02.764573 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.764554 2583 scope.go:117] "RemoveContainer" containerID="3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c" Apr 17 18:21:02.766921 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.766898 2583 status_manager.go:895] "Failed to get status for pod" podUID="7ee168bc-677b-4255-bdea-b984a2fdb145" pod="openshift-must-gather-z7h5c/must-gather-7djdz" err="pods \"must-gather-7djdz\" is forbidden: User \"system:node:ip-10-0-133-142.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-z7h5c\": no relationship found between node 'ip-10-0-133-142.ec2.internal' and this object" Apr 17 18:21:02.776825 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.776810 2583 scope.go:117] "RemoveContainer" containerID="43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777" Apr 17 18:21:02.777068 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:21:02.777049 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777\": container with ID 
starting with 43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777 not found: ID does not exist" containerID="43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777" Apr 17 18:21:02.777129 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.777077 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777"} err="failed to get container status \"43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777\": rpc error: code = NotFound desc = could not find container \"43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777\": container with ID starting with 43e0c92554668d17167bfb60f52c79d4af80302b3a0fe4a71f93181636cf9777 not found: ID does not exist" Apr 17 18:21:02.777129 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.777098 2583 scope.go:117] "RemoveContainer" containerID="3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c" Apr 17 18:21:02.777329 ip-10-0-133-142 kubenswrapper[2583]: E0417 18:21:02.777304 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c\": container with ID starting with 3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c not found: ID does not exist" containerID="3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c" Apr 17 18:21:02.777367 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.777332 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c"} err="failed to get container status \"3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c\": rpc error: code = NotFound desc = could not find container \"3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c\": container with ID starting with 
3881a1c1c6d03cb8de9118b6a60039953219faae73d76759883cdce19991463c not found: ID does not exist" Apr 17 18:21:02.805401 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.805372 2583 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7ee168bc-677b-4255-bdea-b984a2fdb145-must-gather-output\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:21:02.805401 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:02.805404 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s24t6\" (UniqueName: \"kubernetes.io/projected/7ee168bc-677b-4255-bdea-b984a2fdb145-kube-api-access-s24t6\") on node \"ip-10-0-133-142.ec2.internal\" DevicePath \"\"" Apr 17 18:21:03.332671 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:03.332644 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-kcr7m_ac6d7493-5df5-4aa0-8274-2bfb5f2d3b8a/cluster-monitoring-operator/0.log" Apr 17 18:21:03.384328 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:03.384300 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jljrv_a5b54589-b3db-48ee-8c9e-111a4446a476/kube-state-metrics/0.log" Apr 17 18:21:03.416264 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:03.416229 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jljrv_a5b54589-b3db-48ee-8c9e-111a4446a476/kube-rbac-proxy-main/0.log" Apr 17 18:21:03.447542 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:03.447514 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jljrv_a5b54589-b3db-48ee-8c9e-111a4446a476/kube-rbac-proxy-self/0.log" Apr 17 18:21:03.708248 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:03.708163 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7ee168bc-677b-4255-bdea-b984a2fdb145" path="/var/lib/kubelet/pods/7ee168bc-677b-4255-bdea-b984a2fdb145/volumes" Apr 17 18:21:03.737224 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:03.737198 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pqdwp_93258dde-b8e4-45f7-a919-f6cb6b76e9b2/node-exporter/0.log" Apr 17 18:21:03.763087 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:03.763040 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pqdwp_93258dde-b8e4-45f7-a919-f6cb6b76e9b2/kube-rbac-proxy/0.log" Apr 17 18:21:03.792185 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:03.792153 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pqdwp_93258dde-b8e4-45f7-a919-f6cb6b76e9b2/init-textfile/0.log" Apr 17 18:21:04.222085 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:04.222060 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68989df595-zklhs_b2a09275-636d-44a9-8780-410b1d31715f/telemeter-client/0.log" Apr 17 18:21:04.250750 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:04.250719 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68989df595-zklhs_b2a09275-636d-44a9-8780-410b1d31715f/reload/0.log" Apr 17 18:21:04.274910 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:04.274887 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68989df595-zklhs_b2a09275-636d-44a9-8780-410b1d31715f/kube-rbac-proxy/0.log" Apr 17 18:21:04.309532 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:04.309505 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59d88d4c87-s2p4x_54c06acc-691c-41df-b0a5-5e3bed98acf5/thanos-query/0.log" Apr 17 18:21:04.337945 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:04.337914 2583 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59d88d4c87-s2p4x_54c06acc-691c-41df-b0a5-5e3bed98acf5/kube-rbac-proxy-web/0.log" Apr 17 18:21:04.365053 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:04.365026 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59d88d4c87-s2p4x_54c06acc-691c-41df-b0a5-5e3bed98acf5/kube-rbac-proxy/0.log" Apr 17 18:21:04.391930 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:04.391900 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59d88d4c87-s2p4x_54c06acc-691c-41df-b0a5-5e3bed98acf5/prom-label-proxy/0.log" Apr 17 18:21:04.419801 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:04.419772 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59d88d4c87-s2p4x_54c06acc-691c-41df-b0a5-5e3bed98acf5/kube-rbac-proxy-rules/0.log" Apr 17 18:21:04.447700 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:04.447675 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-59d88d4c87-s2p4x_54c06acc-691c-41df-b0a5-5e3bed98acf5/kube-rbac-proxy-metrics/0.log" Apr 17 18:21:06.407270 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.407234 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56cddb7b49-l65rl_6de7a080-12c7-4719-9af3-1fe7bcfaa008/console/0.log" Apr 17 18:21:06.600343 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.600305 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5"] Apr 17 18:21:06.600670 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.600658 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef4c3052-8bf0-40c7-b362-fea4c3063c25" containerName="console" Apr 17 18:21:06.600711 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.600672 2583 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ef4c3052-8bf0-40c7-b362-fea4c3063c25" containerName="console" Apr 17 18:21:06.600711 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.600684 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ee168bc-677b-4255-bdea-b984a2fdb145" containerName="copy" Apr 17 18:21:06.600711 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.600690 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee168bc-677b-4255-bdea-b984a2fdb145" containerName="copy" Apr 17 18:21:06.600711 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.600709 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ee168bc-677b-4255-bdea-b984a2fdb145" containerName="gather" Apr 17 18:21:06.600850 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.600715 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee168bc-677b-4255-bdea-b984a2fdb145" containerName="gather" Apr 17 18:21:06.600850 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.600758 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef4c3052-8bf0-40c7-b362-fea4c3063c25" containerName="console" Apr 17 18:21:06.600850 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.600766 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ee168bc-677b-4255-bdea-b984a2fdb145" containerName="copy" Apr 17 18:21:06.600850 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.600773 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ee168bc-677b-4255-bdea-b984a2fdb145" containerName="gather" Apr 17 18:21:06.606020 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.605996 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.608733 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.608707 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9zls8\"/\"default-dockercfg-49jmw\"" Apr 17 18:21:06.608859 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.608752 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9zls8\"/\"openshift-service-ca.crt\"" Apr 17 18:21:06.608859 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.608773 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9zls8\"/\"kube-root-ca.crt\"" Apr 17 18:21:06.616848 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.616826 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5"] Apr 17 18:21:06.737134 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.737042 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nph5s\" (UniqueName: \"kubernetes.io/projected/d0c3a521-6f83-489b-9f09-920bf120eebf-kube-api-access-nph5s\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.737134 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.737095 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-proc\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.737380 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.737150 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-podres\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.737380 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.737208 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-sys\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.737380 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.737247 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-lib-modules\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.837872 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.837841 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nph5s\" (UniqueName: \"kubernetes.io/projected/d0c3a521-6f83-489b-9f09-920bf120eebf-kube-api-access-nph5s\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.838035 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.837987 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-proc\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " 
pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.838087 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.838056 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-podres\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.838188 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.838167 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-proc\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.838252 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.838209 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-sys\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.838252 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.838235 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-podres\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.838356 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.838253 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-lib-modules\") pod 
\"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.838399 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.838373 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-sys\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.838433 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.838371 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0c3a521-6f83-489b-9f09-920bf120eebf-lib-modules\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.846985 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.846964 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nph5s\" (UniqueName: \"kubernetes.io/projected/d0c3a521-6f83-489b-9f09-920bf120eebf-kube-api-access-nph5s\") pod \"perf-node-gather-daemonset-jc5r5\" (UID: \"d0c3a521-6f83-489b-9f09-920bf120eebf\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:06.855614 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.855596 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-gwg7g_b9febd3b-c712-4be5-b213-f682bf52fa59/volume-data-source-validator/0.log" Apr 17 18:21:06.916531 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:06.916496 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:07.035891 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:07.035867 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5"] Apr 17 18:21:07.038724 ip-10-0-133-142 kubenswrapper[2583]: W0417 18:21:07.038697 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd0c3a521_6f83_489b_9f09_920bf120eebf.slice/crio-4d2f9c6c68be5cd364cda028ed452958319c2d90c4a387b394db084dd97d4a4b WatchSource:0}: Error finding container 4d2f9c6c68be5cd364cda028ed452958319c2d90c4a387b394db084dd97d4a4b: Status 404 returned error can't find the container with id 4d2f9c6c68be5cd364cda028ed452958319c2d90c4a387b394db084dd97d4a4b Apr 17 18:21:07.040300 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:07.040254 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:21:07.590668 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:07.590637 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xqpjb_31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c/dns/0.log" Apr 17 18:21:07.614002 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:07.613970 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xqpjb_31cfe9ce-5d53-4e66-b4f4-bf27d97a6a1c/kube-rbac-proxy/0.log" Apr 17 18:21:07.700268 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:07.700241 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rl4d7_854b67d6-4dbb-4558-9444-235eb53b9278/dns-node-resolver/0.log" Apr 17 18:21:07.773045 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:07.773009 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" 
event={"ID":"d0c3a521-6f83-489b-9f09-920bf120eebf","Type":"ContainerStarted","Data":"3634e432888a0f68d86e36a91e12245f35fc1f83b5fdc9b54072622f5efa631c"} Apr 17 18:21:07.773045 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:07.773048 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" event={"ID":"d0c3a521-6f83-489b-9f09-920bf120eebf","Type":"ContainerStarted","Data":"4d2f9c6c68be5cd364cda028ed452958319c2d90c4a387b394db084dd97d4a4b"} Apr 17 18:21:07.773337 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:07.773145 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:07.790439 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:07.790392 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" podStartSLOduration=1.790378351 podStartE2EDuration="1.790378351s" podCreationTimestamp="2026-04-17 18:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:21:07.788827918 +0000 UTC m=+660.664220628" watchObservedRunningTime="2026-04-17 18:21:07.790378351 +0000 UTC m=+660.665771060" Apr 17 18:21:08.102582 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:08.102554 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-b86858b4b-z8c9s_ff33bbab-bce2-44ed-9c40-a35c751f8fa7/registry/0.log" Apr 17 18:21:08.153400 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:08.153369 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8pwg5_75b35535-d264-462c-a620-1b59e57c1eef/node-ca/0.log" Apr 17 18:21:08.852887 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:08.852843 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-59b48b8f4-5st7q_e8dffa5d-56ee-4b70-a3be-9aa2dea734cd/router/0.log" Apr 17 18:21:09.173651 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:09.173575 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6f76l_1f9df124-3418-493f-8f5e-bd5ea9df2004/serve-healthcheck-canary/0.log" Apr 17 18:21:09.583364 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:09.583335 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bmqm9_b6f8c7ee-24c2-4294-928d-435c48c0b667/kube-rbac-proxy/0.log" Apr 17 18:21:09.607494 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:09.607465 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bmqm9_b6f8c7ee-24c2-4294-928d-435c48c0b667/exporter/0.log" Apr 17 18:21:09.631547 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:09.631519 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bmqm9_b6f8c7ee-24c2-4294-928d-435c48c0b667/extractor/0.log" Apr 17 18:21:13.786675 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:13.786648 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-jc5r5" Apr 17 18:21:15.044212 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:15.044179 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-z2z5x_212d9a00-537f-41e9-b6bf-b14feb7f40a5/kube-storage-version-migrator-operator/1.log" Apr 17 18:21:15.045105 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:15.045087 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-z2z5x_212d9a00-537f-41e9-b6bf-b14feb7f40a5/kube-storage-version-migrator-operator/0.log" Apr 17 18:21:16.180522 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:16.180493 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9nr2v_ce7310b9-648a-4042-86fe-ef118fc7af4e/kube-multus/0.log" Apr 17 18:21:16.417154 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:16.417080 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sj2s9_b5e8e12b-b8af-4a75-8a47-fc6090430623/kube-multus-additional-cni-plugins/0.log" Apr 17 18:21:16.441546 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:16.441523 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sj2s9_b5e8e12b-b8af-4a75-8a47-fc6090430623/egress-router-binary-copy/0.log" Apr 17 18:21:16.469264 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:16.469238 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sj2s9_b5e8e12b-b8af-4a75-8a47-fc6090430623/cni-plugins/0.log" Apr 17 18:21:16.496235 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:16.496210 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sj2s9_b5e8e12b-b8af-4a75-8a47-fc6090430623/bond-cni-plugin/0.log" Apr 17 18:21:16.523956 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:16.523918 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sj2s9_b5e8e12b-b8af-4a75-8a47-fc6090430623/routeoverride-cni/0.log" Apr 17 18:21:16.547567 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:16.547540 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sj2s9_b5e8e12b-b8af-4a75-8a47-fc6090430623/whereabouts-cni-bincopy/0.log" Apr 17 18:21:16.575103 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:16.575076 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sj2s9_b5e8e12b-b8af-4a75-8a47-fc6090430623/whereabouts-cni/0.log" Apr 17 18:21:16.783250 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:16.783182 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6d44x_2ea7fcad-19ae-42ab-8026-113afe4c2f23/network-metrics-daemon/0.log" Apr 17 18:21:16.805727 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:16.805698 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6d44x_2ea7fcad-19ae-42ab-8026-113afe4c2f23/kube-rbac-proxy/0.log" Apr 17 18:21:17.993733 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:17.993701 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ql5pl_3b758c69-286f-4851-8bab-2922e791af32/ovn-controller/0.log" Apr 17 18:21:18.018676 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:18.018596 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ql5pl_3b758c69-286f-4851-8bab-2922e791af32/ovn-acl-logging/0.log" Apr 17 18:21:18.043605 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:18.043575 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ql5pl_3b758c69-286f-4851-8bab-2922e791af32/kube-rbac-proxy-node/0.log" Apr 17 18:21:18.066688 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:18.066664 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ql5pl_3b758c69-286f-4851-8bab-2922e791af32/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 18:21:18.086785 ip-10-0-133-142 kubenswrapper[2583]: I0417 
18:21:18.086764 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ql5pl_3b758c69-286f-4851-8bab-2922e791af32/northd/0.log" Apr 17 18:21:18.110800 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:18.110777 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ql5pl_3b758c69-286f-4851-8bab-2922e791af32/nbdb/0.log" Apr 17 18:21:18.146491 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:18.146468 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ql5pl_3b758c69-286f-4851-8bab-2922e791af32/sbdb/0.log" Apr 17 18:21:18.260605 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:18.260574 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ql5pl_3b758c69-286f-4851-8bab-2922e791af32/ovnkube-controller/0.log" Apr 17 18:21:19.735203 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:19.735176 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-kgjhh_d2df8bcd-2956-4041-abb8-966ec57fce1d/network-check-target-container/0.log" Apr 17 18:21:20.735114 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:20.735079 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-45lrm_1ab1f86f-6203-4036-b656-aeacb3b958ba/iptables-alerter/0.log" Apr 17 18:21:21.447438 ip-10-0-133-142 kubenswrapper[2583]: I0417 18:21:21.447413 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7tsjs_37d75c3f-00cd-47b8-80a8-e9ed9af8a917/tuned/0.log"