Apr 24 21:26:22.470657 ip-10-0-131-58 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:26:22.967564 ip-10-0-131-58 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:22.967564 ip-10-0-131-58 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:26:22.967564 ip-10-0-131-58 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:22.967564 ip-10-0-131-58 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:26:22.967564 ip-10-0-131-58 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
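The deprecation warnings above all point at the same fix: move these command-line flags into the KubeletConfiguration file passed via --config. Below is a minimal sketch of the equivalent config-file form for the flags named in this log; field names follow the upstream kubelet.config.k8s.io/v1beta1 schema, and the values shown are illustrative placeholders, not this node's actual settings.

```yaml
# /etc/kubernetes/kubelet.conf (the file named by the kubelet's --config flag)
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (placeholder path)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (placeholder values)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# per the --minimum-container-ttl-duration warning, use eviction
# thresholds instead (placeholder value)
evictionHard:
  memory.available: "100Mi"
```

Note that --pod-infra-container-image has no config-file equivalent; as its warning states, the image garbage collector obtains the sandbox image from the CRI runtime's own configuration.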
Apr 24 21:26:22.969199 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.969111 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:26:22.971555 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971538 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:22.971555 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971554 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971558 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971562 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971565 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971568 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971571 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971574 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971576 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971579 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971582 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971584 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971594 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971597 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971600 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971603 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971605 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971608 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971611 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971613 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971616 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:22.971619 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971619 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971622 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971624 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971627 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971631 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971635 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971638 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971641 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971644 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971647 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971650 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971652 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971655 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971658 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971661 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971664 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971666 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971669 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971671 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:22.972102 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971674 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971676 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971679 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971681 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971684 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971686 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971689 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971692 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971694 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971697 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971700 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971702 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971705 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971708 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971711 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971714 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971716 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971719 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971722 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971725 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:22.972633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971727 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971730 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971732 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971735 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971737 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971740 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971742 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971747 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971750 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971753 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971756 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971758 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971761 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971764 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971766 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971769 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971772 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971775 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971777 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971780 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:22.973109 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971782 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:22.973624 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971786 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:22.973624 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971788 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:22.973624 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971791 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:22.973624 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971793 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:22.973624 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.971797 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:22.974405 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974394 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:22.974405 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974404 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974409 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974412 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974414 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974417 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974420 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974423 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974425 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974428 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974431 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974433 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974436 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974439 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974442 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974444 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974447 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974449 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974452 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974455 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974457 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:22.974466 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974460 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974462 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974465 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974468 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974471 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974474 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974477 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974479 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974482 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974485 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974487 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974490 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974492 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974495 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974497 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974500 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974502 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974505 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974507 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974510 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:22.974951 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974512 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974515 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974519 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974523 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974525 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974528 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974531 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974533 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974536 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974538 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974541 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974543 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974545 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974548 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974552 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974554 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974557 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974559 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974562 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974565 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:22.975538 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974567 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974570 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974572 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974575 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974578 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974580 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974583 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974586 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974588 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974591 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974593 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974595 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974601 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974604 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974607 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974610 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974613 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974616 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974619 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:22.976099 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974622 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974625 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974627 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974630 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974633 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.974635 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974709 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974719 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974729 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974736 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974742 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974746 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974751 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974755 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974759 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974761 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974765 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974768 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974771 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974774 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974777 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974780 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974783 2575 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:26:22.976730 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974786 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974789 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974793 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974796 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974799 2575 flags.go:64] FLAG: --config-dir=""
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974802 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974806 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974810 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974813 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974816 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974820 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974823 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974826 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974828 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974832 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974835 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974840 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974843 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974846 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974849 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974852 2575 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974855 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974859 2575 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974862 2575 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974865 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:26:22.977412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974869 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974872 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974875 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974878 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974881 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974884 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974887 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974891 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974894 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974897 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974901 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:26:22.978100 ip-10-0-131-58
kubenswrapper[2575]: I0424 21:26:22.974904 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974907 2575 flags.go:64] FLAG: --feature-gates="" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974910 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974913 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974917 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974920 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974924 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974927 2575 flags.go:64] FLAG: --help="false" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974930 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-131-58.ec2.internal" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974933 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974936 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974939 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974943 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:26:22.978100 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974946 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 
21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974949 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974952 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974955 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974958 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974961 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974964 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974967 2575 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974970 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974973 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974976 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974979 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974982 2575 flags.go:64] FLAG: --lock-file="" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974985 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974988 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974991 2575 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.974997 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975000 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975003 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975006 2575 flags.go:64] FLAG: --logging-format="text" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975009 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975012 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975015 2575 flags.go:64] FLAG: --manifest-url="" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975018 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975022 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:26:22.978792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975025 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975029 2575 flags.go:64] FLAG: --max-pods="110" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975033 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975036 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975038 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:26:22.979526 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:26:22.975041 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975045 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975048 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975051 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975064 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975067 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975070 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975073 2575 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975076 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975082 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975084 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975088 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975090 2575 flags.go:64] FLAG: --port="10250" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975094 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:26:22.979526 
ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975097 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0028b1c2f76c03b2a" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975100 2575 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975103 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975107 2575 flags.go:64] FLAG: --register-node="true" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975110 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:26:22.979526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975112 2575 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975116 2575 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975119 2575 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975122 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975125 2575 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975129 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975132 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975136 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975139 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975142 2575 flags.go:64] FLAG: --runonce="false" Apr 24 21:26:22.980149 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:26:22.975145 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975149 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975152 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975155 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975157 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975160 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975163 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975166 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975169 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975172 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975175 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975178 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975181 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975184 2575 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975188 2575 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:26:22.980149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975193 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975196 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975199 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975203 2575 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975206 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975211 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975214 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975217 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975220 2575 flags.go:64] FLAG: --v="2" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975224 2575 flags.go:64] FLAG: --version="false" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975228 2575 flags.go:64] FLAG: --vmodule="" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975233 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.975236 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975360 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:26:22.980811 ip-10-0-131-58 
kubenswrapper[2575]: W0424 21:26:22.975364 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975367 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975370 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975373 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975377 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975379 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975382 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975385 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:26:22.980811 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975387 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975390 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975392 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975396 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975399 2575 feature_gate.go:328] unrecognized 
feature gate: RouteAdvertisements Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975401 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975404 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975407 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975409 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975412 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975415 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975417 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975420 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975422 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975425 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975430 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975432 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975435 2575 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975438 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975441 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:26:22.981403 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975443 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975446 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975448 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975452 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975455 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975458 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975461 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975464 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975467 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975470 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975473 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975475 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975478 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975480 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975483 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975486 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975488 2575 feature_gate.go:328] unrecognized feature gate: 
MetricsCollectionProfiles Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975491 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975493 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975496 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:26:22.981919 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975505 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975508 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975511 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975513 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975516 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975519 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975525 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975529 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975532 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975535 2575 
feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975537 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975540 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975544 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975548 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975551 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975554 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975556 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975559 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975561 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:26:22.982432 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975564 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975566 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975569 2575 feature_gate.go:328] unrecognized feature gate: 
BuildCSIVolumes
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975573 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975575 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975578 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975580 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975583 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975585 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975588 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975591 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975593 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975596 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975599 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975601 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975604 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975606 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:22.983117 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.975609 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:22.983757 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.976348 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:22.986647 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.986627 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:26:22.986647 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.986647 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986699 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986704 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986708 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986712 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986715 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986717 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986720 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986723 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986726 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986729 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986731 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986734 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986736 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986739 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986742 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986744 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986747 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986750 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986752 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:22.986748 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986757 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986762 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986766 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986769 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986772 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986775 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986778 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986781 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986783 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986786 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986789 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986791 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986794 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986796 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986799 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986803 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986807 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986809 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986812 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986814 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:22.987241 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986817 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986819 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986822 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986824 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986827 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986829 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986832 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986834 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986837 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986839 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986842 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986844 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986847 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986849 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986853 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986857 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986860 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986862 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986865 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986867 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:22.987799 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986870 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986872 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986875 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986878 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986880 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986883 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986885 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986888 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986890 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986893 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986896 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986898 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986901 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986904 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986906 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986909 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986911 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986915 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986918 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:22.988319 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986920 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986923 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986925 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986928 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986931 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986933 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986936 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.986939 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.986944 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987038 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987044 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987046 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987050 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987052 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987055 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987058 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:22.988791 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987061 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987063 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987067 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987071 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987074 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987077 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987080 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987082 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987085 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987088 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987090 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987093 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987095 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987098 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987100 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987104 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987107 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987109 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987112 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:22.989209 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987115 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987117 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987120 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987122 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987125 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987127 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987130 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987133 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987135 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987138 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987140 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987142 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987145 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987148 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987150 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987153 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987155 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987159 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987161 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987164 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:22.989689 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987166 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987169 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987171 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987174 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987176 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987179 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987181 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987184 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987186 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987189 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987192 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987194 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987197 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987199 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987202 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987204 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987207 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987209 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987212 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987216 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:22.990195 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987219 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987222 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987225 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987227 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987230 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987232 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987235 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987237 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987240 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987243 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987245 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987248 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987251 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987266 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987270 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987273 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987276 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987278 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987281 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:22.990709 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:22.987283 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:22.991164 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.987289 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:22.991164 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.988089 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:26:22.991223 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.991205 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:26:22.992252 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.992240 2575 server.go:1019] "Starting client certificate rotation"
Apr 24 21:26:22.992367 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.992348 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:22.992397 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:22.992392 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:23.019783 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.019761 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:23.023473 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.023290 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:23.040425 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.040399 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:26:23.046367 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.046352 2575 log.go:25] "Validated CRI v1 image API"
Apr 24 21:26:23.050033 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.050018 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:26:23.052311 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.052294 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:26:23.055303 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.055284 2575 fs.go:135] Filesystem UUIDs: map[67ff7f5f-7e0f-420f-a014-a57e81b6744a:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 bef6c630-bb09-4a56-81dd-95a4bf00912e:/dev/nvme0n1p4]
Apr 24 21:26:23.055363 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.055303 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:26:23.062546 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.062429 2575 manager.go:217] Machine: {Timestamp:2026-04-24 21:26:23.060341788 +0000 UTC m=+0.463604777 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098238 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2891d1dac29a9293f395b137df4e8b SystemUUID:ec2891d1-dac2-9a92-93f3-95b137df4e8b BootID:99e29c8b-0a62-49a9-9f5a-e25e53265abb Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1a:74:37:7d:05 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1a:74:37:7d:05 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ae:6c:f0:6e:74:a6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:26:23.062546 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.062534 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:26:23.062682 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.062625 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:26:23.063757 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.063733 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:26:23.063894 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.063759 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-58.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percent
age":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:26:23.063940 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.063902 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:26:23.063940 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.063911 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:26:23.063940 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.063924 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:23.064776 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.064766 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:23.066579 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.066568 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:23.066686 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.066677 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:26:23.070827 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.070817 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:26:23.070868 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.070832 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:26:23.070868 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.070844 2575 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 24 21:26:23.070868 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.070853 2575 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:26:23.070868 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.070862 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:26:23.072058 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.072046 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:23.072105 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.072065 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:23.075629 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.075608 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:26:23.077912 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.077894 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:26:23.079342 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079328 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:26:23.079423 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079348 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:26:23.079423 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079356 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:26:23.079423 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079364 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:26:23.079423 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079372 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:26:23.079423 
ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079380 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:26:23.079423 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079389 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:26:23.079423 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079397 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:26:23.079423 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079408 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:26:23.079423 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079418 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:26:23.079738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079441 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:26:23.079738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.079456 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:26:23.081538 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.081524 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:26:23.081597 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.081542 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:26:23.083592 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.083568 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-58.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:26:23.083675 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.083610 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User 
\"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:26:23.085491 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.085477 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:26:23.085576 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.085518 2575 server.go:1295] "Started kubelet" Apr 24 21:26:23.085643 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.085605 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:26:23.085711 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.085669 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:26:23.085750 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.085735 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:26:23.086502 ip-10-0-131-58 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:26:23.088927 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.088915 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:26:23.089679 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.089660 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:26:23.094791 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.094770 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-58.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:26:23.095399 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.094394 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-58.ec2.internal.18a96818a83da9dc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-58.ec2.internal,UID:ip-10-0-131-58.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-58.ec2.internal,},FirstTimestamp:2026-04-24 21:26:23.085488604 +0000 UTC m=+0.488751594,LastTimestamp:2026-04-24 21:26:23.085488604 +0000 UTC m=+0.488751594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-58.ec2.internal,}" Apr 24 21:26:23.096294 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.096276 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:26:23.096914 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.096899 2575 fs_resource_analyzer.go:67] "Starting FS 
ResourceAnalyzer" Apr 24 21:26:23.098176 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.097777 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:26:23.098176 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.098023 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:26:23.098176 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.098037 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:26:23.098176 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.098127 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:26:23.098416 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.098186 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:26:23.098416 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.098194 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:26:23.099161 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.099145 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:26:23.099248 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.099239 2575 factory.go:55] Registering systemd factory Apr 24 21:26:23.099337 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.099328 2575 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:26:23.099453 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.099423 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-58.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" 
interval="200ms" Apr 24 21:26:23.099453 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.099307 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:26:23.099633 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.099338 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wfn2p" Apr 24 21:26:23.099681 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.099660 2575 factory.go:153] Registering CRI-O factory Apr 24 21:26:23.099681 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.099674 2575 factory.go:223] Registration of the crio container factory successfully Apr 24 21:26:23.099748 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.099696 2575 factory.go:103] Registering Raw factory Apr 24 21:26:23.099748 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.099705 2575 manager.go:1196] Started watching for new ooms in manager Apr 24 21:26:23.099828 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.099815 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-58.ec2.internal\" not found" Apr 24 21:26:23.100063 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.100048 2575 manager.go:319] Starting recovery of all containers Apr 24 21:26:23.107621 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.107598 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wfn2p" Apr 24 21:26:23.110936 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.110913 2575 manager.go:324] Recovery completed Apr 24 21:26:23.115160 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.115147 2575 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:23.117813 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.117799 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:23.117883 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.117825 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:23.117883 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.117836 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:23.118398 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.118376 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:26:23.118398 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.118390 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:26:23.118483 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.118407 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:23.119954 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.119891 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-58.ec2.internal.18a96818aa2ae6cc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-58.ec2.internal,UID:ip-10-0-131-58.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-58.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-58.ec2.internal,},FirstTimestamp:2026-04-24 21:26:23.117813452 +0000 UTC 
m=+0.521076436,LastTimestamp:2026-04-24 21:26:23.117813452 +0000 UTC m=+0.521076436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-58.ec2.internal,}" Apr 24 21:26:23.122459 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.122448 2575 policy_none.go:49] "None policy: Start" Apr 24 21:26:23.122513 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.122463 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:26:23.122513 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.122473 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:26:23.164829 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.164807 2575 manager.go:341] "Starting Device Plugin manager" Apr 24 21:26:23.191662 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.164837 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:26:23.191662 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.164847 2575 server.go:85] "Starting device plugin registration server" Apr 24 21:26:23.191662 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.165110 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:26:23.191662 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.165120 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:26:23.191662 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.165220 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:26:23.191662 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.165346 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:26:23.191662 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.165355 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 
21:26:23.191662 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.165858 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:26:23.191662 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.165894 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-58.ec2.internal\" not found" Apr 24 21:26:23.213212 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.213169 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:26:23.214393 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.214371 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:26:23.214393 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.214398 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:26:23.214534 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.214421 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:26:23.214534 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.214432 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:26:23.214638 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.214532 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:26:23.216921 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.216904 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:23.266014 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.265935 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:23.267197 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.267181 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:23.267322 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.267216 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:23.267322 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.267230 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:23.267322 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.267277 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.274438 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.274422 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.274514 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.274445 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-58.ec2.internal\": node \"ip-10-0-131-58.ec2.internal\" not found" Apr 24 21:26:23.288018 
ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.287995 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-58.ec2.internal\" not found" Apr 24 21:26:23.315211 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.315175 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-58.ec2.internal"] Apr 24 21:26:23.315324 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.315267 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:23.316169 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.316154 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:23.316228 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.316181 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:23.316228 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.316197 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:23.318425 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.318413 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:23.318571 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.318557 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.318607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.318585 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:23.319168 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.319149 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:23.319168 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.319166 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:23.319308 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.319176 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:23.319308 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.319186 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:23.319308 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.319186 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:23.319308 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.319287 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:23.321428 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.321414 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.321487 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.321438 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:23.322120 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.322106 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:23.322186 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.322130 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:23.322186 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.322143 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:23.348893 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.348873 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-58.ec2.internal\" not found" node="ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.353333 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.353313 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-58.ec2.internal\" not found" node="ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.388928 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.388908 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-58.ec2.internal\" not found" Apr 24 21:26:23.400158 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.400135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a3f552fe84c3a4f714667d8da3419094-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal\" (UID: \"a3f552fe84c3a4f714667d8da3419094\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.400224 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.400169 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3f552fe84c3a4f714667d8da3419094-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal\" (UID: \"a3f552fe84c3a4f714667d8da3419094\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.400224 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.400189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9ec5360dd3ff0d724086a47bd35d554c-config\") pod \"kube-apiserver-proxy-ip-10-0-131-58.ec2.internal\" (UID: \"9ec5360dd3ff0d724086a47bd35d554c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.489982 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.489949 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-58.ec2.internal\" not found" Apr 24 21:26:23.500327 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.500305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a3f552fe84c3a4f714667d8da3419094-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal\" (UID: \"a3f552fe84c3a4f714667d8da3419094\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.500371 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.500334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a3f552fe84c3a4f714667d8da3419094-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal\" (UID: \"a3f552fe84c3a4f714667d8da3419094\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.500371 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.500350 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9ec5360dd3ff0d724086a47bd35d554c-config\") pod \"kube-apiserver-proxy-ip-10-0-131-58.ec2.internal\" (UID: \"9ec5360dd3ff0d724086a47bd35d554c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.500434 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.500414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9ec5360dd3ff0d724086a47bd35d554c-config\") pod \"kube-apiserver-proxy-ip-10-0-131-58.ec2.internal\" (UID: \"9ec5360dd3ff0d724086a47bd35d554c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.500477 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.500463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a3f552fe84c3a4f714667d8da3419094-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal\" (UID: \"a3f552fe84c3a4f714667d8da3419094\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.500512 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.500486 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3f552fe84c3a4f714667d8da3419094-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal\" (UID: \"a3f552fe84c3a4f714667d8da3419094\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.590739 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.590668 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-58.ec2.internal\" not found" Apr 24 21:26:23.651224 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.651188 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.655688 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.655669 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-58.ec2.internal" Apr 24 21:26:23.691101 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.691070 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-58.ec2.internal\" not found" Apr 24 21:26:23.791690 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.791653 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-58.ec2.internal\" not found" Apr 24 21:26:23.892303 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:23.892221 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-58.ec2.internal\" not found" Apr 24 21:26:23.949878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.949851 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:23.987645 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.987616 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:23.993372 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.993348 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" 
Apr 24 21:26:23.993483 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.993467 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:26:23.993539 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.993495 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:26:23.993539 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.993495 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:26:23.999300 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:23.999284 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" Apr 24 21:26:24.020851 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.020829 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:26:24.021655 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.021641 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-58.ec2.internal" Apr 24 21:26:24.030427 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.030413 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is 
recommended: [must not contain dots]" Apr 24 21:26:24.071570 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.071553 2575 apiserver.go:52] "Watching apiserver" Apr 24 21:26:24.081023 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.080999 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:26:24.084075 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.084052 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-g7w5g","openshift-network-diagnostics/network-check-target-8cswd","openshift-network-operator/iptables-alerter-zs4vw","kube-system/konnectivity-agent-58m8w","kube-system/kube-apiserver-proxy-ip-10-0-131-58.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct","openshift-multus/multus-additional-cni-plugins-fhmlv","openshift-multus/network-metrics-daemon-nz6dq","openshift-ovn-kubernetes/ovnkube-node-78cw4","openshift-cluster-node-tuning-operator/tuned-54pzv","openshift-image-registry/node-ca-j8p2r","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal"] Apr 24 21:26:24.088602 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.088582 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.090780 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.090759 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:24.090886 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.090861 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a" Apr 24 21:26:24.091079 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.091059 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:26:24.091144 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.091108 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:26:24.091144 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.091118 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:26:24.091239 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.091199 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:26:24.091239 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.091210 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-98czg\"" Apr 24 21:26:24.093109 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.092885 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zs4vw" Apr 24 21:26:24.093109 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.092968 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:24.094666 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.094650 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:26:24.095279 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.095062 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:26:24.095279 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.095074 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zthhh\"" Apr 24 21:26:24.095279 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.095100 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.095279 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.095121 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qjzxr\"" Apr 24 21:26:24.095512 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.095304 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:26:24.095512 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.095328 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:26:24.095609 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.095561 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:26:24.096365 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.096350 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:26:24.097088 
ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.097067 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:26:24.097540 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.097519 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:26:24.097642 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.097592 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rr9p4\"" Apr 24 21:26:24.097740 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.097726 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:26:24.099500 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.099482 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.099577 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.099564 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:24.099641 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.099623 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33" Apr 24 21:26:24.101699 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.101679 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gntnh\"" Apr 24 21:26:24.101788 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.101777 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.101856 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.101790 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:26:24.101856 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.101794 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:26:24.103456 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-var-lib-kubelet\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.103558 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-hostroot\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.103558 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103488 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7kv\" (UniqueName: 
\"kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv\") pod \"network-check-target-8cswd\" (UID: \"42ee90c2-09f5-4464-a75c-62352a375c5a\") " pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:24.103558 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103524 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-socket-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.103558 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8de213de-0f31-4d5f-9d53-9ba716ac7760-cni-binary-copy\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-system-cni-dir\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-daemon-config\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: 
I0424 21:26:24.103601 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103614 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf45q\" (UniqueName: \"kubernetes.io/projected/d0286bfd-3ba9-4b8c-839e-f415766385d0-kube-api-access-rf45q\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103659 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-device-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-sys-fs\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8de213de-0f31-4d5f-9d53-9ba716ac7760-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103753 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-cni-dir\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103794 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9cd5c45c-4069-4996-a64a-d57b60694538-host-slash\") pod \"iptables-alerter-zs4vw\" (UID: \"9cd5c45c-4069-4996-a64a-d57b60694538\") " pod="openshift-network-operator/iptables-alerter-zs4vw" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-system-cni-dir\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.103847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103830 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-var-lib-cni-multus\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " 
pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxg2k\" (UniqueName: \"kubernetes.io/projected/f757e1fe-afce-409d-b272-48af3aef88c8-kube-api-access-dxg2k\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103917 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9cd5c45c-4069-4996-a64a-d57b60694538-iptables-alerter-script\") pod \"iptables-alerter-zs4vw\" (UID: \"9cd5c45c-4069-4996-a64a-d57b60694538\") " pod="openshift-network-operator/iptables-alerter-zs4vw" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-cnibin\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103971 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-cnibin\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103977 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.103995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-conf-dir\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-etc-kubernetes\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104040 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-registration-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9x78\" (UniqueName: \"kubernetes.io/projected/9cd5c45c-4069-4996-a64a-d57b60694538-kube-api-access-c9x78\") pod \"iptables-alerter-zs4vw\" (UID: \"9cd5c45c-4069-4996-a64a-d57b60694538\") " pod="openshift-network-operator/iptables-alerter-zs4vw" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104371 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104403 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-os-release\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-run-netns\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.104461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-run-multus-certs\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.105124 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5259dcbe-abac-4ee3-bd35-66dab5614ebd-konnectivity-ca\") pod \"konnectivity-agent-58m8w\" (UID: \"5259dcbe-abac-4ee3-bd35-66dab5614ebd\") " pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:24.105124 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104537 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-etc-selinux\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.105124 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104590 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.105124 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8de213de-0f31-4d5f-9d53-9ba716ac7760-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.105124 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-run-k8s-cni-cncf-io\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.105124 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-var-lib-cni-bin\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.105124 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.104949 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5259dcbe-abac-4ee3-bd35-66dab5614ebd-agent-certs\") pod \"konnectivity-agent-58m8w\" (UID: \"5259dcbe-abac-4ee3-bd35-66dab5614ebd\") " pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:24.105124 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.105076 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:26:24.105554 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.105144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn5zq\" (UniqueName: \"kubernetes.io/projected/8de213de-0f31-4d5f-9d53-9ba716ac7760-kube-api-access-sn5zq\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.105554 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.105173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr74v\" (UniqueName: \"kubernetes.io/projected/3ad3b358-912b-477a-8fc3-6f2910580c33-kube-api-access-mr74v\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:24.105554 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.105202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-os-release\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.105554 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.105228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0286bfd-3ba9-4b8c-839e-f415766385d0-cni-binary-copy\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.105771 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.105604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-socket-dir-parent\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.105771 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.105649 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-lkgpw\"" Apr 24 21:26:24.105771 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.105663 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:26:24.105934 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.105925 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:26:24.106388 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.106118 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:26:24.106661 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.106539 2575 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:26:24.106661 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.106570 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2bfqq\"" Apr 24 21:26:24.107638 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.107009 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:26:24.107780 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.107762 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:26:24.108342 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.108324 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j8p2r" Apr 24 21:26:24.108892 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.108875 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:26:24.109727 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.109628 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:21:23 +0000 UTC" deadline="2027-12-11 12:45:41.880867992 +0000 UTC" Apr 24 21:26:24.109727 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.109689 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14295h19m17.771183288s" Apr 24 21:26:24.110678 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.110658 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:26:24.110678 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.110677 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:26:24.110803 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.110723 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rb7nx\"" Apr 24 21:26:24.111296 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.111282 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:26:24.129001 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.128979 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bdvsc" Apr 24 21:26:24.135079 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.135060 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bdvsc" Apr 24 21:26:24.197763 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:24.197594 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3f552fe84c3a4f714667d8da3419094.slice/crio-37f1adf58cd9d6b02bcb342cc95f3665a17dd22c56849a07923d39df84597927 WatchSource:0}: Error finding container 37f1adf58cd9d6b02bcb342cc95f3665a17dd22c56849a07923d39df84597927: Status 404 returned error can't find the container with id 37f1adf58cd9d6b02bcb342cc95f3665a17dd22c56849a07923d39df84597927 Apr 24 21:26:24.197949 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:24.197934 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec5360dd3ff0d724086a47bd35d554c.slice/crio-388a4bb5fac585c47930034bf2d7ba66d16b1151b771fe0434a980bef8238417 WatchSource:0}: Error finding container 388a4bb5fac585c47930034bf2d7ba66d16b1151b771fe0434a980bef8238417: Status 404 returned error can't find the container with id 388a4bb5fac585c47930034bf2d7ba66d16b1151b771fe0434a980bef8238417 Apr 24 21:26:24.199516 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.199496 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:26:24.202135 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.202120 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:26:24.207372 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207340 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-systemd-units\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.207475 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-log-socket\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.207475 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-run-ovn-kubernetes\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.207475 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-run-netns\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.207636 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8de213de-0f31-4d5f-9d53-9ba716ac7760-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.207636 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-run-systemd\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.207636 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207531 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.207636 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-run-k8s-cni-cncf-io\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.207636 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5259dcbe-abac-4ee3-bd35-66dab5614ebd-agent-certs\") pod \"konnectivity-agent-58m8w\" (UID: \"5259dcbe-abac-4ee3-bd35-66dab5614ebd\") " pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:24.207636 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn5zq\" (UniqueName: \"kubernetes.io/projected/8de213de-0f31-4d5f-9d53-9ba716ac7760-kube-api-access-sn5zq\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.207910 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-run-netns\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.207910 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr74v\" (UniqueName: \"kubernetes.io/projected/3ad3b358-912b-477a-8fc3-6f2910580c33-kube-api-access-mr74v\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:24.207910 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207673 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-run-k8s-cni-cncf-io\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.207910 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-etc-openvswitch\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.207910 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0286bfd-3ba9-4b8c-839e-f415766385d0-cni-binary-copy\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.207910 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-var-lib-kubelet\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207919 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-daemon-config\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rf45q\" (UniqueName: \"kubernetes.io/projected/d0286bfd-3ba9-4b8c-839e-f415766385d0-kube-api-access-rf45q\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.207999 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-var-lib-kubelet\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-device-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208042 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/8d10bb71-bb18-4fc8-8721-420e294ce6ab-ovnkube-config\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-device-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8de213de-0f31-4d5f-9d53-9ba716ac7760-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-modprobe-d\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-sysconfig\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208295 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-lib-modules\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.208313 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-var-lib-kubelet\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208363 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0286bfd-3ba9-4b8c-839e-f415766385d0-cni-binary-copy\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-cni-dir\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-system-cni-dir\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: 
I0424 21:26:24.208469 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl7d6\" (UniqueName: \"kubernetes.io/projected/8d10bb71-bb18-4fc8-8721-420e294ce6ab-kube-api-access-kl7d6\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208556 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-kubernetes\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208569 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-system-cni-dir\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x84jt\" (UniqueName: \"kubernetes.io/projected/9062ddad-735b-4bad-80c5-03e7de6d3add-kube-api-access-x84jt\") pod \"node-ca-j8p2r\" (UID: \"9062ddad-735b-4bad-80c5-03e7de6d3add\") " pod="openshift-image-registry/node-ca-j8p2r" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-cni-dir\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " 
pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxg2k\" (UniqueName: \"kubernetes.io/projected/f757e1fe-afce-409d-b272-48af3aef88c8-kube-api-access-dxg2k\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208674 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-cni-bin\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208696 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-daemon-config\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-sysctl-d\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-cnibin\") pod \"multus-g7w5g\" (UID: 
\"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-conf-dir\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.208858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-etc-kubernetes\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-registration-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208897 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-conf-dir\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-kubelet\") pod \"ovnkube-node-78cw4\" (UID: 
\"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-cnibin\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.208988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-registration-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-etc-kubernetes\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-cni-netd\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: 
\"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-os-release\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209213 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8d10bb71-bb18-4fc8-8721-420e294ce6ab-ovnkube-script-lib\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209235 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-run-ovn\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-run-multus-certs\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/5259dcbe-abac-4ee3-bd35-66dab5614ebd-konnectivity-ca\") pod \"konnectivity-agent-58m8w\" (UID: \"5259dcbe-abac-4ee3-bd35-66dab5614ebd\") " pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209324 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-etc-selinux\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-run-netns\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.209526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209413 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-sysctl-conf\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209443 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftwh6\" (UniqueName: \"kubernetes.io/projected/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-kube-api-access-ftwh6\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-run-multus-certs\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-var-lib-cni-bin\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-slash\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8d10bb71-bb18-4fc8-8721-420e294ce6ab-env-overrides\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.210281 
ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209617 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-os-release\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-systemd\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209653 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-sys\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9062ddad-735b-4bad-80c5-03e7de6d3add-host\") pod \"node-ca-j8p2r\" (UID: \"9062ddad-735b-4bad-80c5-03e7de6d3add\") " pod="openshift-image-registry/node-ca-j8p2r" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209701 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-os-release\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.210281 
ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209702 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-socket-dir-parent\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-os-release\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-hostroot\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7kv\" (UniqueName: \"kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv\") pod \"network-check-target-8cswd\" (UID: \"42ee90c2-09f5-4464-a75c-62352a375c5a\") " pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:24.210281 
ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-var-lib-cni-bin\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.210281 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-socket-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.209945 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5259dcbe-abac-4ee3-bd35-66dab5614ebd-konnectivity-ca\") pod \"konnectivity-agent-58m8w\" (UID: \"5259dcbe-abac-4ee3-bd35-66dab5614ebd\") " pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8de213de-0f31-4d5f-9d53-9ba716ac7760-cni-binary-copy\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " 
pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-etc-selinux\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8de213de-0f31-4d5f-9d53-9ba716ac7760-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-hostroot\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210116 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-host\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-socket-dir\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: 
\"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-multus-socket-dir-parent\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8de213de-0f31-4d5f-9d53-9ba716ac7760-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210791 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8de213de-0f31-4d5f-9d53-9ba716ac7760-cni-binary-copy\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210836 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-system-cni-dir\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-system-cni-dir\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-sys-fs\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.210998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-var-lib-openvswitch\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.211035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-node-log\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211040 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-tuned\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211061 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9cd5c45c-4069-4996-a64a-d57b60694538-host-slash\") pod \"iptables-alerter-zs4vw\" (UID: \"9cd5c45c-4069-4996-a64a-d57b60694538\") " pod="openshift-network-operator/iptables-alerter-zs4vw" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211098 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f757e1fe-afce-409d-b272-48af3aef88c8-sys-fs\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-run-openvswitch\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-run\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 
21:26:24.211214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-var-lib-cni-multus\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211237 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9cd5c45c-4069-4996-a64a-d57b60694538-host-slash\") pod \"iptables-alerter-zs4vw\" (UID: \"9cd5c45c-4069-4996-a64a-d57b60694538\") " pod="openshift-network-operator/iptables-alerter-zs4vw" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5259dcbe-abac-4ee3-bd35-66dab5614ebd-agent-certs\") pod \"konnectivity-agent-58m8w\" (UID: \"5259dcbe-abac-4ee3-bd35-66dab5614ebd\") " pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9cd5c45c-4069-4996-a64a-d57b60694538-iptables-alerter-script\") pod \"iptables-alerter-zs4vw\" (UID: \"9cd5c45c-4069-4996-a64a-d57b60694538\") " pod="openshift-network-operator/iptables-alerter-zs4vw" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.211270 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-cnibin\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8d10bb71-bb18-4fc8-8721-420e294ce6ab-ovn-node-metrics-cert\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.211379 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs podName:3ad3b358-912b-477a-8fc3-6f2910580c33 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:24.711337411 +0000 UTC m=+2.114600405 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs") pod "network-metrics-daemon-nz6dq" (UID: "3ad3b358-912b-477a-8fc3-6f2910580c33") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d0286bfd-3ba9-4b8c-839e-f415766385d0-host-var-lib-cni-multus\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9062ddad-735b-4bad-80c5-03e7de6d3add-serviceca\") pod \"node-ca-j8p2r\" (UID: \"9062ddad-735b-4bad-80c5-03e7de6d3add\") " pod="openshift-image-registry/node-ca-j8p2r" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9x78\" (UniqueName: \"kubernetes.io/projected/9cd5c45c-4069-4996-a64a-d57b60694538-kube-api-access-c9x78\") pod \"iptables-alerter-zs4vw\" (UID: \"9cd5c45c-4069-4996-a64a-d57b60694538\") " pod="openshift-network-operator/iptables-alerter-zs4vw" Apr 24 21:26:24.211738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-tmp\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.212280 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211762 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9cd5c45c-4069-4996-a64a-d57b60694538-iptables-alerter-script\") pod \"iptables-alerter-zs4vw\" (UID: \"9cd5c45c-4069-4996-a64a-d57b60694538\") " pod="openshift-network-operator/iptables-alerter-zs4vw" Apr 24 21:26:24.212280 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.211800 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8de213de-0f31-4d5f-9d53-9ba716ac7760-cnibin\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.217209 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.217166 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" event={"ID":"a3f552fe84c3a4f714667d8da3419094","Type":"ContainerStarted","Data":"37f1adf58cd9d6b02bcb342cc95f3665a17dd22c56849a07923d39df84597927"} Apr 24 21:26:24.220965 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.218364 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:24.220965 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.218400 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:24.220965 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.218443 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xc7kv for pod openshift-network-diagnostics/network-check-target-8cswd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered] Apr 24 21:26:24.220965 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.218514 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv podName:42ee90c2-09f5-4464-a75c-62352a375c5a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:24.71849026 +0000 UTC m=+2.121753234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xc7kv" (UniqueName: "kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv") pod "network-check-target-8cswd" (UID: "42ee90c2-09f5-4464-a75c-62352a375c5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:24.220965 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.219073 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-58.ec2.internal" event={"ID":"9ec5360dd3ff0d724086a47bd35d554c","Type":"ContainerStarted","Data":"388a4bb5fac585c47930034bf2d7ba66d16b1151b771fe0434a980bef8238417"} Apr 24 21:26:24.222011 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.221979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf45q\" (UniqueName: \"kubernetes.io/projected/d0286bfd-3ba9-4b8c-839e-f415766385d0-kube-api-access-rf45q\") pod \"multus-g7w5g\" (UID: \"d0286bfd-3ba9-4b8c-839e-f415766385d0\") " pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.223239 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.223034 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn5zq\" (UniqueName: \"kubernetes.io/projected/8de213de-0f31-4d5f-9d53-9ba716ac7760-kube-api-access-sn5zq\") pod \"multus-additional-cni-plugins-fhmlv\" (UID: \"8de213de-0f31-4d5f-9d53-9ba716ac7760\") " pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 
24 21:26:24.223357 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.223295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr74v\" (UniqueName: \"kubernetes.io/projected/3ad3b358-912b-477a-8fc3-6f2910580c33-kube-api-access-mr74v\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:24.223466 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.223448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9x78\" (UniqueName: \"kubernetes.io/projected/9cd5c45c-4069-4996-a64a-d57b60694538-kube-api-access-c9x78\") pod \"iptables-alerter-zs4vw\" (UID: \"9cd5c45c-4069-4996-a64a-d57b60694538\") " pod="openshift-network-operator/iptables-alerter-zs4vw" Apr 24 21:26:24.223521 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.223480 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxg2k\" (UniqueName: \"kubernetes.io/projected/f757e1fe-afce-409d-b272-48af3aef88c8-kube-api-access-dxg2k\") pod \"aws-ebs-csi-driver-node-wx9ct\" (UID: \"f757e1fe-afce-409d-b272-48af3aef88c8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.231201 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.231184 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:24.312118 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8d10bb71-bb18-4fc8-8721-420e294ce6ab-ovn-node-metrics-cert\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312118 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312117 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9062ddad-735b-4bad-80c5-03e7de6d3add-serviceca\") pod \"node-ca-j8p2r\" (UID: \"9062ddad-735b-4bad-80c5-03e7de6d3add\") " pod="openshift-image-registry/node-ca-j8p2r" Apr 24 21:26:24.312390 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-tmp\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.312390 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-systemd-units\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312390 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-log-socket\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312390 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312217 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-systemd-units\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312390 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-run-ovn-kubernetes\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312390 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312312 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-run-ovn-kubernetes\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312390 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-run-systemd\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312390 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-log-socket\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312390 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312390 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312383 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312390 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-etc-openvswitch\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-etc-openvswitch\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8d10bb71-bb18-4fc8-8721-420e294ce6ab-ovnkube-config\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-run-systemd\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312878 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:26:24.312453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-modprobe-d\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-sysconfig\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312559 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-modprobe-d\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-sysconfig\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-lib-modules\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.312878 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:26:24.312637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9062ddad-735b-4bad-80c5-03e7de6d3add-serviceca\") pod \"node-ca-j8p2r\" (UID: \"9062ddad-735b-4bad-80c5-03e7de6d3add\") " pod="openshift-image-registry/node-ca-j8p2r" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-var-lib-kubelet\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kl7d6\" (UniqueName: \"kubernetes.io/projected/8d10bb71-bb18-4fc8-8721-420e294ce6ab-kube-api-access-kl7d6\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-lib-modules\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-var-lib-kubelet\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: 
I0424 21:26:24.312694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-kubernetes\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x84jt\" (UniqueName: \"kubernetes.io/projected/9062ddad-735b-4bad-80c5-03e7de6d3add-kube-api-access-x84jt\") pod \"node-ca-j8p2r\" (UID: \"9062ddad-735b-4bad-80c5-03e7de6d3add\") " pod="openshift-image-registry/node-ca-j8p2r" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-cni-bin\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-sysctl-d\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.312878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-kubernetes\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: 
I0424 21:26:24.312793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-kubelet\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-kubelet\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-cni-netd\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-cni-bin\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312860 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8d10bb71-bb18-4fc8-8721-420e294ce6ab-ovnkube-script-lib\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 
21:26:24.312880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-cni-netd\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312888 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-run-ovn\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-run-netns\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312923 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-sysctl-d\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-sysctl-conf\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312966 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftwh6\" (UniqueName: \"kubernetes.io/projected/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-kube-api-access-ftwh6\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8d10bb71-bb18-4fc8-8721-420e294ce6ab-ovnkube-config\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-run-ovn\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-slash\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8d10bb71-bb18-4fc8-8721-420e294ce6ab-env-overrides\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313032 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-slash\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.312985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-host-run-netns\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.313787 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-systemd\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313088 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-sysctl-conf\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-sys\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313099 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-systemd\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9062ddad-735b-4bad-80c5-03e7de6d3add-host\") pod \"node-ca-j8p2r\" (UID: \"9062ddad-735b-4bad-80c5-03e7de6d3add\") " pod="openshift-image-registry/node-ca-j8p2r" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313168 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-host\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313172 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-sys\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313208 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9062ddad-735b-4bad-80c5-03e7de6d3add-host\") pod \"node-ca-j8p2r\" (UID: \"9062ddad-735b-4bad-80c5-03e7de6d3add\") " pod="openshift-image-registry/node-ca-j8p2r" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313217 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-host\") pod 
\"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313235 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-var-lib-openvswitch\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313300 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-var-lib-openvswitch\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-node-log\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8d10bb71-bb18-4fc8-8721-420e294ce6ab-ovnkube-script-lib\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-tuned\") pod 
\"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-run-openvswitch\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-run\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313442 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8d10bb71-bb18-4fc8-8721-420e294ce6ab-env-overrides\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313455 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-node-log\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.314607 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8d10bb71-bb18-4fc8-8721-420e294ce6ab-run-openvswitch\") pod \"ovnkube-node-78cw4\" (UID: 
\"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.315123 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.313500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-run\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.315123 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.314785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-tmp\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.315123 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.314790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8d10bb71-bb18-4fc8-8721-420e294ce6ab-ovn-node-metrics-cert\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.315211 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.315125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-etc-tuned\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.320663 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.320635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftwh6\" (UniqueName: \"kubernetes.io/projected/e9ea7b9c-f49c-4584-aabf-ed26a2c488b9-kube-api-access-ftwh6\") pod \"tuned-54pzv\" (UID: \"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9\") " 
pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.320850 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.320829 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x84jt\" (UniqueName: \"kubernetes.io/projected/9062ddad-735b-4bad-80c5-03e7de6d3add-kube-api-access-x84jt\") pod \"node-ca-j8p2r\" (UID: \"9062ddad-735b-4bad-80c5-03e7de6d3add\") " pod="openshift-image-registry/node-ca-j8p2r" Apr 24 21:26:24.321445 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.321429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl7d6\" (UniqueName: \"kubernetes.io/projected/8d10bb71-bb18-4fc8-8721-420e294ce6ab-kube-api-access-kl7d6\") pod \"ovnkube-node-78cw4\" (UID: \"8d10bb71-bb18-4fc8-8721-420e294ce6ab\") " pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.419702 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.419613 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g7w5g" Apr 24 21:26:24.426618 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:24.426589 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0286bfd_3ba9_4b8c_839e_f415766385d0.slice/crio-072e3852ba9eb832c88a7ada5f9f1468cbaaf3756d11c6e064eedac0a49e8745 WatchSource:0}: Error finding container 072e3852ba9eb832c88a7ada5f9f1468cbaaf3756d11c6e064eedac0a49e8745: Status 404 returned error can't find the container with id 072e3852ba9eb832c88a7ada5f9f1468cbaaf3756d11c6e064eedac0a49e8745 Apr 24 21:26:24.440086 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.440067 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-zs4vw" Apr 24 21:26:24.445368 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:24.445341 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cd5c45c_4069_4996_a64a_d57b60694538.slice/crio-972c1887aa1ac9760af9ebfb61e29a3fcbc18502c9381d2d3c969aa55425fccc WatchSource:0}: Error finding container 972c1887aa1ac9760af9ebfb61e29a3fcbc18502c9381d2d3c969aa55425fccc: Status 404 returned error can't find the container with id 972c1887aa1ac9760af9ebfb61e29a3fcbc18502c9381d2d3c969aa55425fccc Apr 24 21:26:24.453666 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.453649 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:24.458175 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.458157 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" Apr 24 21:26:24.459817 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:24.459797 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5259dcbe_abac_4ee3_bd35_66dab5614ebd.slice/crio-ab58f905a5d6b282c33336b9c2062bb92938ca11fd305e78175a0c352c129773 WatchSource:0}: Error finding container ab58f905a5d6b282c33336b9c2062bb92938ca11fd305e78175a0c352c129773: Status 404 returned error can't find the container with id ab58f905a5d6b282c33336b9c2062bb92938ca11fd305e78175a0c352c129773 Apr 24 21:26:24.464988 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:24.464970 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf757e1fe_afce_409d_b272_48af3aef88c8.slice/crio-a3ad0f8282c418e71ac64fcf57f119ad763ef3c0abd6b249ebc42d21e3d0f3c0 WatchSource:0}: Error finding container 
a3ad0f8282c418e71ac64fcf57f119ad763ef3c0abd6b249ebc42d21e3d0f3c0: Status 404 returned error can't find the container with id a3ad0f8282c418e71ac64fcf57f119ad763ef3c0abd6b249ebc42d21e3d0f3c0 Apr 24 21:26:24.475077 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.475061 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fhmlv" Apr 24 21:26:24.480455 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:24.480429 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8de213de_0f31_4d5f_9d53_9ba716ac7760.slice/crio-7ad6e1b1b84cb99c1e2cb195571a49322f2420fd891845c4263075375ca18969 WatchSource:0}: Error finding container 7ad6e1b1b84cb99c1e2cb195571a49322f2420fd891845c4263075375ca18969: Status 404 returned error can't find the container with id 7ad6e1b1b84cb99c1e2cb195571a49322f2420fd891845c4263075375ca18969 Apr 24 21:26:24.481468 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.481453 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:24.487287 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:24.487251 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d10bb71_bb18_4fc8_8721_420e294ce6ab.slice/crio-d83d02fad6ad36e20363c39a87ca919218dd8ab5bca7f14a1d3679eee4384a3b WatchSource:0}: Error finding container d83d02fad6ad36e20363c39a87ca919218dd8ab5bca7f14a1d3679eee4384a3b: Status 404 returned error can't find the container with id d83d02fad6ad36e20363c39a87ca919218dd8ab5bca7f14a1d3679eee4384a3b Apr 24 21:26:24.505379 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.505359 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-54pzv" Apr 24 21:26:24.510754 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:24.510735 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9ea7b9c_f49c_4584_aabf_ed26a2c488b9.slice/crio-5e6e553e0b5865aaadcb623791740a45cc439f59df809700a7803f0afa3f7f3d WatchSource:0}: Error finding container 5e6e553e0b5865aaadcb623791740a45cc439f59df809700a7803f0afa3f7f3d: Status 404 returned error can't find the container with id 5e6e553e0b5865aaadcb623791740a45cc439f59df809700a7803f0afa3f7f3d Apr 24 21:26:24.511242 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.511226 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j8p2r" Apr 24 21:26:24.517484 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:24.517467 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9062ddad_735b_4bad_80c5_03e7de6d3add.slice/crio-c94606cbad5c75264e9cd288dfe194da693774a4decd6244886e229f862eaf1c WatchSource:0}: Error finding container c94606cbad5c75264e9cd288dfe194da693774a4decd6244886e229f862eaf1c: Status 404 returned error can't find the container with id c94606cbad5c75264e9cd288dfe194da693774a4decd6244886e229f862eaf1c Apr 24 21:26:24.717681 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.717602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:24.717870 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.717772 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:24.717870 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.717845 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs podName:3ad3b358-912b-477a-8fc3-6f2910580c33 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:25.717824371 +0000 UTC m=+3.121087348 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs") pod "network-metrics-daemon-nz6dq" (UID: "3ad3b358-912b-477a-8fc3-6f2910580c33") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:24.818410 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:24.818285 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7kv\" (UniqueName: \"kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv\") pod \"network-check-target-8cswd\" (UID: \"42ee90c2-09f5-4464-a75c-62352a375c5a\") " pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:24.818593 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.818458 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:24.818593 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.818480 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:24.818593 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.818492 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xc7kv for pod openshift-network-diagnostics/network-check-target-8cswd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:24.818593 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:24.818551 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv podName:42ee90c2-09f5-4464-a75c-62352a375c5a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:25.818532252 +0000 UTC m=+3.221795238 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xc7kv" (UniqueName: "kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv") pod "network-check-target-8cswd" (UID: "42ee90c2-09f5-4464-a75c-62352a375c5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:25.047527 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.047454 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:25.136183 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.136101 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:24 +0000 UTC" deadline="2027-12-27 04:52:09.721037069 +0000 UTC" Apr 24 21:26:25.136183 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.136132 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14671h25m44.584908714s" Apr 24 21:26:25.244463 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.244424 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" event={"ID":"8d10bb71-bb18-4fc8-8721-420e294ce6ab","Type":"ContainerStarted","Data":"d83d02fad6ad36e20363c39a87ca919218dd8ab5bca7f14a1d3679eee4384a3b"} Apr 24 21:26:25.258976 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:26:25.258942 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-58m8w" event={"ID":"5259dcbe-abac-4ee3-bd35-66dab5614ebd","Type":"ContainerStarted","Data":"ab58f905a5d6b282c33336b9c2062bb92938ca11fd305e78175a0c352c129773"} Apr 24 21:26:25.264115 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.264089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g7w5g" event={"ID":"d0286bfd-3ba9-4b8c-839e-f415766385d0","Type":"ContainerStarted","Data":"072e3852ba9eb832c88a7ada5f9f1468cbaaf3756d11c6e064eedac0a49e8745"} Apr 24 21:26:25.275479 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.275448 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j8p2r" event={"ID":"9062ddad-735b-4bad-80c5-03e7de6d3add","Type":"ContainerStarted","Data":"c94606cbad5c75264e9cd288dfe194da693774a4decd6244886e229f862eaf1c"} Apr 24 21:26:25.291936 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.291905 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-54pzv" event={"ID":"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9","Type":"ContainerStarted","Data":"5e6e553e0b5865aaadcb623791740a45cc439f59df809700a7803f0afa3f7f3d"} Apr 24 21:26:25.301811 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.301747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhmlv" event={"ID":"8de213de-0f31-4d5f-9d53-9ba716ac7760","Type":"ContainerStarted","Data":"7ad6e1b1b84cb99c1e2cb195571a49322f2420fd891845c4263075375ca18969"} Apr 24 21:26:25.317343 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.317314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" event={"ID":"f757e1fe-afce-409d-b272-48af3aef88c8","Type":"ContainerStarted","Data":"a3ad0f8282c418e71ac64fcf57f119ad763ef3c0abd6b249ebc42d21e3d0f3c0"} Apr 
24 21:26:25.338169 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.338135 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zs4vw" event={"ID":"9cd5c45c-4069-4996-a64a-d57b60694538","Type":"ContainerStarted","Data":"972c1887aa1ac9760af9ebfb61e29a3fcbc18502c9381d2d3c969aa55425fccc"} Apr 24 21:26:25.727578 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.725873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:25.727578 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:25.726076 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:25.727578 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:25.726140 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs podName:3ad3b358-912b-477a-8fc3-6f2910580c33 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:27.726119378 +0000 UTC m=+5.129382351 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs") pod "network-metrics-daemon-nz6dq" (UID: "3ad3b358-912b-477a-8fc3-6f2910580c33") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:25.829123 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:25.829085 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7kv\" (UniqueName: \"kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv\") pod \"network-check-target-8cswd\" (UID: \"42ee90c2-09f5-4464-a75c-62352a375c5a\") " pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:26:25.829345 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:25.829279 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:25.829345 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:25.829301 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:25.829345 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:25.829327 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xc7kv for pod openshift-network-diagnostics/network-check-target-8cswd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:25.829535 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:25.829388 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv podName:42ee90c2-09f5-4464-a75c-62352a375c5a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:27.829366529 +0000 UTC m=+5.232629502 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xc7kv" (UniqueName: "kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv") pod "network-check-target-8cswd" (UID: "42ee90c2-09f5-4464-a75c-62352a375c5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:26.136684 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:26.136594 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:24 +0000 UTC" deadline="2027-10-02 10:56:23.711414771 +0000 UTC"
Apr 24 21:26:26.136684 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:26.136631 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12613h29m57.574787214s"
Apr 24 21:26:26.215268 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:26.215228 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:26:26.215445 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:26.215366 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33"
Apr 24 21:26:26.215529 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:26.215480 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:26:26.215583 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:26.215556 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a"
Apr 24 21:26:26.658141 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:26.658092 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:26.748351 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:26.748322 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:27.744184 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:27.744145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:26:27.744716 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:27.744300 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:27.744716 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:27.744365 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs podName:3ad3b358-912b-477a-8fc3-6f2910580c33 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:31.74434429 +0000 UTC m=+9.147607264 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs") pod "network-metrics-daemon-nz6dq" (UID: "3ad3b358-912b-477a-8fc3-6f2910580c33") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:27.844794 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:27.844755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7kv\" (UniqueName: \"kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv\") pod \"network-check-target-8cswd\" (UID: \"42ee90c2-09f5-4464-a75c-62352a375c5a\") " pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:26:27.844981 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:27.844923 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:27.844981 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:27.844941 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:27.844981 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:27.844954 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xc7kv for pod openshift-network-diagnostics/network-check-target-8cswd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:27.845138 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:27.845010 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv podName:42ee90c2-09f5-4464-a75c-62352a375c5a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:31.844991945 +0000 UTC m=+9.248254924 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xc7kv" (UniqueName: "kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv") pod "network-check-target-8cswd" (UID: "42ee90c2-09f5-4464-a75c-62352a375c5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:28.038460 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.037711 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dlzdh"]
Apr 24 21:26:28.042770 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.042350 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:28.042770 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:28.042430 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7"
Apr 24 21:26:28.147465 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.147280 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d9049978-94b6-422d-bab2-7c826163ffc7-kubelet-config\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:28.147465 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.147330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d9049978-94b6-422d-bab2-7c826163ffc7-dbus\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:28.147465 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.147380 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:28.215453 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.215386 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:26:28.215649 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:28.215525 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33"
Apr 24 21:26:28.215878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.215798 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:26:28.216031 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:28.215910 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a"
Apr 24 21:26:28.247931 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.247897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d9049978-94b6-422d-bab2-7c826163ffc7-dbus\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:28.248094 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.247971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:28.248094 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.248010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d9049978-94b6-422d-bab2-7c826163ffc7-kubelet-config\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:28.248230 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.248113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d9049978-94b6-422d-bab2-7c826163ffc7-kubelet-config\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:28.248333 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.248250 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d9049978-94b6-422d-bab2-7c826163ffc7-dbus\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:28.248394 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:28.248360 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:28.248445 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:28.248415 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret podName:d9049978-94b6-422d-bab2-7c826163ffc7 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:28.748393063 +0000 UTC m=+6.151656037 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret") pod "global-pull-secret-syncer-dlzdh" (UID: "d9049978-94b6-422d-bab2-7c826163ffc7") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:28.753398 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:28.753360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:28.753858 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:28.753502 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:28.753858 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:28.753563 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret podName:d9049978-94b6-422d-bab2-7c826163ffc7 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:29.753545038 +0000 UTC m=+7.156808015 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret") pod "global-pull-secret-syncer-dlzdh" (UID: "d9049978-94b6-422d-bab2-7c826163ffc7") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:29.760605 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:29.760568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:29.761042 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:29.760732 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:29.761042 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:29.760806 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret podName:d9049978-94b6-422d-bab2-7c826163ffc7 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:31.760785624 +0000 UTC m=+9.164048610 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret") pod "global-pull-secret-syncer-dlzdh" (UID: "d9049978-94b6-422d-bab2-7c826163ffc7") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:30.215581 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:30.215497 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:26:30.215727 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:30.215663 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a"
Apr 24 21:26:30.215795 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:30.215759 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:26:30.215875 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:30.215845 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33"
Apr 24 21:26:30.215931 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:30.215906 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:30.216067 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:30.216048 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7"
Apr 24 21:26:31.777887 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:31.777639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:26:31.777887 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:31.777694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:31.777887 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:31.777803 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:31.777887 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:31.777829 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:31.777887 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:31.777872 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs podName:3ad3b358-912b-477a-8fc3-6f2910580c33 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:39.777852342 +0000 UTC m=+17.181115325 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs") pod "network-metrics-daemon-nz6dq" (UID: "3ad3b358-912b-477a-8fc3-6f2910580c33") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:31.777887 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:31.777891 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret podName:d9049978-94b6-422d-bab2-7c826163ffc7 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:35.777882013 +0000 UTC m=+13.181144988 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret") pod "global-pull-secret-syncer-dlzdh" (UID: "d9049978-94b6-422d-bab2-7c826163ffc7") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:31.878432 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:31.878388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7kv\" (UniqueName: \"kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv\") pod \"network-check-target-8cswd\" (UID: \"42ee90c2-09f5-4464-a75c-62352a375c5a\") " pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:26:31.878591 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:31.878572 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:31.878646 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:31.878600 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:31.878646 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:31.878613 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xc7kv for pod openshift-network-diagnostics/network-check-target-8cswd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:31.878739 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:31.878676 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv podName:42ee90c2-09f5-4464-a75c-62352a375c5a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:39.878659113 +0000 UTC m=+17.281922107 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xc7kv" (UniqueName: "kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv") pod "network-check-target-8cswd" (UID: "42ee90c2-09f5-4464-a75c-62352a375c5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:32.215570 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:32.215016 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:32.215570 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:32.215035 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:26:32.215570 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:32.215016 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:26:32.215570 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:32.215150 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7"
Apr 24 21:26:32.215570 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:32.215228 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a"
Apr 24 21:26:32.215570 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:32.215330 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33"
Apr 24 21:26:33.635243 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.635213 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-r2v97"]
Apr 24 21:26:33.638251 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.638219 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r2v97"
Apr 24 21:26:33.641677 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.641657 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:26:33.642197 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.642173 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-84r2g\""
Apr 24 21:26:33.642781 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.642766 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:26:33.692190 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.692157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92515838-c368-40ef-9e3e-40f753dd0308-hosts-file\") pod \"node-resolver-r2v97\" (UID: \"92515838-c368-40ef-9e3e-40f753dd0308\") " pod="openshift-dns/node-resolver-r2v97"
Apr 24 21:26:33.692361 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.692207 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndfl9\" (UniqueName: \"kubernetes.io/projected/92515838-c368-40ef-9e3e-40f753dd0308-kube-api-access-ndfl9\") pod \"node-resolver-r2v97\" (UID: \"92515838-c368-40ef-9e3e-40f753dd0308\") " pod="openshift-dns/node-resolver-r2v97"
Apr 24 21:26:33.692361 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.692326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92515838-c368-40ef-9e3e-40f753dd0308-tmp-dir\") pod \"node-resolver-r2v97\" (UID: \"92515838-c368-40ef-9e3e-40f753dd0308\") " pod="openshift-dns/node-resolver-r2v97"
Apr 24 21:26:33.793480 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.793449 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndfl9\" (UniqueName: \"kubernetes.io/projected/92515838-c368-40ef-9e3e-40f753dd0308-kube-api-access-ndfl9\") pod \"node-resolver-r2v97\" (UID: \"92515838-c368-40ef-9e3e-40f753dd0308\") " pod="openshift-dns/node-resolver-r2v97"
Apr 24 21:26:33.793646 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.793557 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92515838-c368-40ef-9e3e-40f753dd0308-tmp-dir\") pod \"node-resolver-r2v97\" (UID: \"92515838-c368-40ef-9e3e-40f753dd0308\") " pod="openshift-dns/node-resolver-r2v97"
Apr 24 21:26:33.793646 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.793584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92515838-c368-40ef-9e3e-40f753dd0308-hosts-file\") pod \"node-resolver-r2v97\" (UID: \"92515838-c368-40ef-9e3e-40f753dd0308\") " pod="openshift-dns/node-resolver-r2v97"
Apr 24 21:26:33.793747 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.793661 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92515838-c368-40ef-9e3e-40f753dd0308-hosts-file\") pod \"node-resolver-r2v97\" (UID: \"92515838-c368-40ef-9e3e-40f753dd0308\") " pod="openshift-dns/node-resolver-r2v97"
Apr 24 21:26:33.793893 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.793876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92515838-c368-40ef-9e3e-40f753dd0308-tmp-dir\") pod \"node-resolver-r2v97\" (UID: \"92515838-c368-40ef-9e3e-40f753dd0308\") " pod="openshift-dns/node-resolver-r2v97"
Apr 24 21:26:33.803439 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.803400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndfl9\" (UniqueName: \"kubernetes.io/projected/92515838-c368-40ef-9e3e-40f753dd0308-kube-api-access-ndfl9\") pod \"node-resolver-r2v97\" (UID: \"92515838-c368-40ef-9e3e-40f753dd0308\") " pod="openshift-dns/node-resolver-r2v97"
Apr 24 21:26:33.949697 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:33.949577 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r2v97"
Apr 24 21:26:34.215652 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:34.215560 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:26:34.215652 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:34.215595 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:26:34.215652 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:34.215603 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:34.215899 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:34.215682 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a"
Apr 24 21:26:34.215899 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:34.215795 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33"
Apr 24 21:26:34.215899 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:34.215880 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7"
Apr 24 21:26:35.806143 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:35.806106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:35.806566 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:35.806245 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:35.806566 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:35.806323 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret podName:d9049978-94b6-422d-bab2-7c826163ffc7 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:43.806308197 +0000 UTC m=+21.209571181 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret") pod "global-pull-secret-syncer-dlzdh" (UID: "d9049978-94b6-422d-bab2-7c826163ffc7") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:36.214648 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:36.214569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:26:36.214815 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:36.214569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:36.214815 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:36.214713 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33"
Apr 24 21:26:36.214815 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:36.214569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:26:36.214815 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:36.214778 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7"
Apr 24 21:26:36.215023 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:36.214836 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a"
Apr 24 21:26:38.214915 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:38.214876 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:26:38.215349 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:38.214876 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:26:38.215349 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:38.214984 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33"
Apr 24 21:26:38.215349 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:38.214885 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:38.215349 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:38.215078 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a"
Apr 24 21:26:38.215349 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:38.215182 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7"
Apr 24 21:26:39.837956 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:39.837919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:26:39.838476 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:39.838105 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:39.838476 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:39.838189 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs podName:3ad3b358-912b-477a-8fc3-6f2910580c33 nodeName:}" failed.
No retries permitted until 2026-04-24 21:26:55.838167953 +0000 UTC m=+33.241430939 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs") pod "network-metrics-daemon-nz6dq" (UID: "3ad3b358-912b-477a-8fc3-6f2910580c33") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:39.938458 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:39.938419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7kv\" (UniqueName: \"kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv\") pod \"network-check-target-8cswd\" (UID: \"42ee90c2-09f5-4464-a75c-62352a375c5a\") " pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:39.938643 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:39.938560 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:39.938643 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:39.938576 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:39.938643 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:39.938586 2575 projected.go:194] Error preparing data for projected volume kube-api-access-xc7kv for pod openshift-network-diagnostics/network-check-target-8cswd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:39.938643 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:39.938640 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv 
podName:42ee90c2-09f5-4464-a75c-62352a375c5a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:55.938622466 +0000 UTC m=+33.341885453 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xc7kv" (UniqueName: "kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv") pod "network-check-target-8cswd" (UID: "42ee90c2-09f5-4464-a75c-62352a375c5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:40.214680 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:40.214596 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh" Apr 24 21:26:40.214680 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:40.214619 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:40.214998 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:40.214606 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:40.214998 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:40.214740 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7" Apr 24 21:26:40.214998 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:40.214808 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a" Apr 24 21:26:40.214998 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:40.214900 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33" Apr 24 21:26:42.215274 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.215107 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:42.215742 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.215106 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh" Apr 24 21:26:42.215742 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.215107 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:42.215742 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:42.215474 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7" Apr 24 21:26:42.215742 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:42.215362 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33" Apr 24 21:26:42.215742 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:42.215513 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a" Apr 24 21:26:42.374748 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.374316 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-54pzv" event={"ID":"e9ea7b9c-f49c-4584-aabf-ed26a2c488b9","Type":"ContainerStarted","Data":"c1364e1fdce8068d943d64073b2880c3eb78823c5cdf6638442518658359fe3d"} Apr 24 21:26:42.377709 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.377679 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r2v97" event={"ID":"92515838-c368-40ef-9e3e-40f753dd0308","Type":"ContainerStarted","Data":"5553bbfc886823dab57d9a2fa6c1b735e9b104b7d363d65cd2091bdf643c2c31"} Apr 24 21:26:42.379682 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.379653 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-58.ec2.internal" event={"ID":"9ec5360dd3ff0d724086a47bd35d554c","Type":"ContainerStarted","Data":"e3e60a32b0243bae709f2734bbfa75834e1dab09c9aa444d0696168bb72f5792"} Apr 24 21:26:42.384012 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.383943 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log" Apr 24 21:26:42.384437 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.384355 2575 generic.go:358] "Generic (PLEG): container finished" podID="8d10bb71-bb18-4fc8-8721-420e294ce6ab" containerID="92603f9e19db0ec5b8b1f4494ecbbacb934e9d72123f0f731bfda5b46eeda03b" exitCode=1 Apr 24 21:26:42.384527 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.384434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" event={"ID":"8d10bb71-bb18-4fc8-8721-420e294ce6ab","Type":"ContainerStarted","Data":"f2ad62551a84da82af7c5ac9e2687f317cc38afe3e0a75b305a6cdbf259c9ee8"} Apr 
24 21:26:42.384527 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.384461 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" event={"ID":"8d10bb71-bb18-4fc8-8721-420e294ce6ab","Type":"ContainerStarted","Data":"5a3db706cbf16addcc1af1dd4f44f84cc66d173fadbcf10516ba9ec8d7776d8b"} Apr 24 21:26:42.384527 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.384484 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" event={"ID":"8d10bb71-bb18-4fc8-8721-420e294ce6ab","Type":"ContainerDied","Data":"92603f9e19db0ec5b8b1f4494ecbbacb934e9d72123f0f731bfda5b46eeda03b"} Apr 24 21:26:42.384527 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.384501 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" event={"ID":"8d10bb71-bb18-4fc8-8721-420e294ce6ab","Type":"ContainerStarted","Data":"3b31529c61395791b9ff68b0f03a3294e43ea06175f139654224aa54d5217b3f"} Apr 24 21:26:42.386279 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.386232 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g7w5g" event={"ID":"d0286bfd-3ba9-4b8c-839e-f415766385d0","Type":"ContainerStarted","Data":"79bea940e7247a515a6af75080abbc91226f79c76ad22e8d4e9aad1bab74857f"} Apr 24 21:26:42.389732 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.389682 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-54pzv" podStartSLOduration=1.9740107770000002 podStartE2EDuration="19.389668541s" podCreationTimestamp="2026-04-24 21:26:23 +0000 UTC" firstStartedPulling="2026-04-24 21:26:24.51216772 +0000 UTC m=+1.915430694" lastFinishedPulling="2026-04-24 21:26:41.927825482 +0000 UTC m=+19.331088458" observedRunningTime="2026-04-24 21:26:42.389353137 +0000 UTC m=+19.792616131" watchObservedRunningTime="2026-04-24 21:26:42.389668541 +0000 UTC m=+19.792931535" Apr 24 
21:26:42.402146 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.402087 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-58.ec2.internal" podStartSLOduration=18.402072341 podStartE2EDuration="18.402072341s" podCreationTimestamp="2026-04-24 21:26:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:42.401499887 +0000 UTC m=+19.804762880" watchObservedRunningTime="2026-04-24 21:26:42.402072341 +0000 UTC m=+19.805335335" Apr 24 21:26:42.420335 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:42.420064 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g7w5g" podStartSLOduration=1.895937607 podStartE2EDuration="19.420046705s" podCreationTimestamp="2026-04-24 21:26:23 +0000 UTC" firstStartedPulling="2026-04-24 21:26:24.428113722 +0000 UTC m=+1.831376696" lastFinishedPulling="2026-04-24 21:26:41.952222816 +0000 UTC m=+19.355485794" observedRunningTime="2026-04-24 21:26:42.41923807 +0000 UTC m=+19.822501062" watchObservedRunningTime="2026-04-24 21:26:42.420046705 +0000 UTC m=+19.823309699" Apr 24 21:26:43.389293 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.389231 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zs4vw" event={"ID":"9cd5c45c-4069-4996-a64a-d57b60694538","Type":"ContainerStarted","Data":"d81ce233af538942b597695e1dae6ab26c5a55fde1ec8b0c25ae18f25a5e7c05"} Apr 24 21:26:43.390436 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.390413 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r2v97" event={"ID":"92515838-c368-40ef-9e3e-40f753dd0308","Type":"ContainerStarted","Data":"6cdf16c60965835e1a9248f400072968e1d39d159f76dd32d662cf684384bbcc"} Apr 24 21:26:43.391705 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.391684 2575 
generic.go:358] "Generic (PLEG): container finished" podID="a3f552fe84c3a4f714667d8da3419094" containerID="872ba84bbf3f16f8c95aa4b9ef496e1d7bad70f5f4e23436a6f83205446e6d3d" exitCode=0 Apr 24 21:26:43.391800 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.391749 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" event={"ID":"a3f552fe84c3a4f714667d8da3419094","Type":"ContainerDied","Data":"872ba84bbf3f16f8c95aa4b9ef496e1d7bad70f5f4e23436a6f83205446e6d3d"} Apr 24 21:26:43.394213 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.394182 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log" Apr 24 21:26:43.394597 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.394572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" event={"ID":"8d10bb71-bb18-4fc8-8721-420e294ce6ab","Type":"ContainerStarted","Data":"25871b90438f49afeb99453e5d5e013da3c8e27b5196cd0c6e986a417599a1c2"} Apr 24 21:26:43.394696 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.394603 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" event={"ID":"8d10bb71-bb18-4fc8-8721-420e294ce6ab","Type":"ContainerStarted","Data":"d49b728fe33162e28bbebb4e0e37bea57201c0a1b4cf77f4e6a4ac92be7377bf"} Apr 24 21:26:43.395879 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.395850 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-58m8w" event={"ID":"5259dcbe-abac-4ee3-bd35-66dab5614ebd","Type":"ContainerStarted","Data":"19ae679bf72e7addf36dd1ac113812b2b0c500c7755011c00fa02953121a962c"} Apr 24 21:26:43.397082 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.397055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j8p2r" 
event={"ID":"9062ddad-735b-4bad-80c5-03e7de6d3add","Type":"ContainerStarted","Data":"6d7d1636dd15fa680ae39c74dcabd776d0f02137fce16143520a1fd35da0fbbc"} Apr 24 21:26:43.398334 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.398315 2575 generic.go:358] "Generic (PLEG): container finished" podID="8de213de-0f31-4d5f-9d53-9ba716ac7760" containerID="63466b93c0f821e82abf5d4701538582bcb08dc5a84f6600ee340f8f8f7cf358" exitCode=0 Apr 24 21:26:43.398419 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.398369 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhmlv" event={"ID":"8de213de-0f31-4d5f-9d53-9ba716ac7760","Type":"ContainerDied","Data":"63466b93c0f821e82abf5d4701538582bcb08dc5a84f6600ee340f8f8f7cf358"} Apr 24 21:26:43.399838 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.399818 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" event={"ID":"f757e1fe-afce-409d-b272-48af3aef88c8","Type":"ContainerStarted","Data":"fd74c0b7b8e7d72490f24ae20d83b2dad50ab43839cc2b895e00f6e045a2bce9"} Apr 24 21:26:43.402359 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.402324 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zs4vw" podStartSLOduration=2.92136249 podStartE2EDuration="20.402313518s" podCreationTimestamp="2026-04-24 21:26:23 +0000 UTC" firstStartedPulling="2026-04-24 21:26:24.446730608 +0000 UTC m=+1.849993582" lastFinishedPulling="2026-04-24 21:26:41.927681626 +0000 UTC m=+19.330944610" observedRunningTime="2026-04-24 21:26:43.40151241 +0000 UTC m=+20.804775405" watchObservedRunningTime="2026-04-24 21:26:43.402313518 +0000 UTC m=+20.805576510" Apr 24 21:26:43.430892 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.430851 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-58m8w" podStartSLOduration=3.015270297 
podStartE2EDuration="20.430837357s" podCreationTimestamp="2026-04-24 21:26:23 +0000 UTC" firstStartedPulling="2026-04-24 21:26:24.462027173 +0000 UTC m=+1.865290148" lastFinishedPulling="2026-04-24 21:26:41.877594227 +0000 UTC m=+19.280857208" observedRunningTime="2026-04-24 21:26:43.430727205 +0000 UTC m=+20.833990200" watchObservedRunningTime="2026-04-24 21:26:43.430837357 +0000 UTC m=+20.834100360" Apr 24 21:26:43.443779 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.443718 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j8p2r" podStartSLOduration=3.085243882 podStartE2EDuration="20.443706128s" podCreationTimestamp="2026-04-24 21:26:23 +0000 UTC" firstStartedPulling="2026-04-24 21:26:24.519128736 +0000 UTC m=+1.922391710" lastFinishedPulling="2026-04-24 21:26:41.877590977 +0000 UTC m=+19.280853956" observedRunningTime="2026-04-24 21:26:43.443369416 +0000 UTC m=+20.846632423" watchObservedRunningTime="2026-04-24 21:26:43.443706128 +0000 UTC m=+20.846969122" Apr 24 21:26:43.456913 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.456866 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r2v97" podStartSLOduration=10.456848752 podStartE2EDuration="10.456848752s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:43.456283456 +0000 UTC m=+20.859546444" watchObservedRunningTime="2026-04-24 21:26:43.456848752 +0000 UTC m=+20.860111746" Apr 24 21:26:43.539410 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.539198 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:26:43.868115 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:43.868033 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh" Apr 24 21:26:43.868377 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:43.868191 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:43.868377 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:43.868280 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret podName:d9049978-94b6-422d-bab2-7c826163ffc7 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:59.868245085 +0000 UTC m=+37.271508056 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret") pod "global-pull-secret-syncer-dlzdh" (UID: "d9049978-94b6-422d-bab2-7c826163ffc7") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:44.176741 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:44.176619 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:26:43.539405107Z","UUID":"aa980f81-f4e0-4792-a500-373ade7eab6f","Handler":null,"Name":"","Endpoint":""} Apr 24 21:26:44.178481 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:44.178451 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:26:44.178481 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:44.178480 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin 
with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:26:44.215414 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:44.215388 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:44.215559 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:44.215392 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:44.215559 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:44.215493 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a" Apr 24 21:26:44.215678 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:44.215602 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33" Apr 24 21:26:44.215678 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:44.215393 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh" Apr 24 21:26:44.215780 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:44.215700 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7" Apr 24 21:26:44.403867 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:44.403824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" event={"ID":"f757e1fe-afce-409d-b272-48af3aef88c8","Type":"ContainerStarted","Data":"291fd74b71b5cdfde0021fbb4357c3d5153dde91686aff14645d6e996c80bf9f"} Apr 24 21:26:44.405752 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:44.405540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" event={"ID":"a3f552fe84c3a4f714667d8da3419094","Type":"ContainerStarted","Data":"bf1a332121e9fe858fb8df9f876d0a4bc038b1c8acb0c750119f006d6dfa3e41"} Apr 24 21:26:44.418419 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:44.418368 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-58.ec2.internal" podStartSLOduration=20.418352992 podStartE2EDuration="20.418352992s" podCreationTimestamp="2026-04-24 21:26:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:44.418280978 +0000 UTC m=+21.821543974" watchObservedRunningTime="2026-04-24 21:26:44.418352992 +0000 UTC m=+21.821615987" Apr 24 21:26:45.408850 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:45.408821 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" event={"ID":"f757e1fe-afce-409d-b272-48af3aef88c8","Type":"ContainerStarted","Data":"cfad3e618fc6089a50ecee477559ab0ab43f31a4757e9ffacf05596b00d01fdf"} Apr 24 21:26:45.411424 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:45.411404 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log" Apr 24 21:26:45.411759 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:45.411738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" event={"ID":"8d10bb71-bb18-4fc8-8721-420e294ce6ab","Type":"ContainerStarted","Data":"887e394649e8270f22d5700d4161ef0a4d0fd468475a6e3c273930a3edd7c189"} Apr 24 21:26:45.437984 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:45.437908 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wx9ct" podStartSLOduration=2.604138863 podStartE2EDuration="22.437895754s" podCreationTimestamp="2026-04-24 21:26:23 +0000 UTC" firstStartedPulling="2026-04-24 21:26:24.466813699 +0000 UTC m=+1.870076672" lastFinishedPulling="2026-04-24 21:26:44.300570577 +0000 UTC m=+21.703833563" observedRunningTime="2026-04-24 21:26:45.437682915 +0000 UTC m=+22.840945921" watchObservedRunningTime="2026-04-24 21:26:45.437895754 +0000 UTC m=+22.841158747" Apr 24 21:26:46.214798 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:46.214755 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh" Apr 24 21:26:46.214968 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:46.214831 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:46.214968 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:46.214860 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:46.215055 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:46.214973 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33" Apr 24 21:26:46.215055 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:46.215002 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a" Apr 24 21:26:46.215127 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:46.215071 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7" Apr 24 21:26:47.203974 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:47.203942 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:47.204963 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:47.204941 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:47.994434 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:47.994175 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:47.994700 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:47.994685 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-58m8w" Apr 24 21:26:48.215614 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.215594 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:48.215974 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.215629 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh" Apr 24 21:26:48.215974 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.215627 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:48.215974 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:48.215735 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33" Apr 24 21:26:48.215974 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:48.215825 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7" Apr 24 21:26:48.215974 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:48.215908 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a" Apr 24 21:26:48.418453 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.418426 2575 generic.go:358] "Generic (PLEG): container finished" podID="8de213de-0f31-4d5f-9d53-9ba716ac7760" containerID="00805ab16618a5ec60fc37c796b08398dfd4bf64f28585ad82575bdba3819d2c" exitCode=0 Apr 24 21:26:48.418554 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.418518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhmlv" event={"ID":"8de213de-0f31-4d5f-9d53-9ba716ac7760","Type":"ContainerDied","Data":"00805ab16618a5ec60fc37c796b08398dfd4bf64f28585ad82575bdba3819d2c"} Apr 24 21:26:48.421657 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.421637 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log" Apr 24 21:26:48.421989 ip-10-0-131-58 kubenswrapper[2575]: I0424 
21:26:48.421964 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" event={"ID":"8d10bb71-bb18-4fc8-8721-420e294ce6ab","Type":"ContainerStarted","Data":"0f752e4ebced88ba54416f335dd707dbd9fd915e9daa88895884578407b1d4f5"} Apr 24 21:26:48.422241 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.422219 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:48.422334 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.422253 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:48.422334 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.422282 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:48.422334 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.422326 2575 scope.go:117] "RemoveContainer" containerID="92603f9e19db0ec5b8b1f4494ecbbacb934e9d72123f0f731bfda5b46eeda03b" Apr 24 21:26:48.436690 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.436668 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:48.436858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:48.436843 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" Apr 24 21:26:49.426600 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:49.426577 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log" Apr 24 21:26:49.426990 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:49.426926 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" 
event={"ID":"8d10bb71-bb18-4fc8-8721-420e294ce6ab","Type":"ContainerStarted","Data":"3237f8d22d74485a82e1fab0e2dc8f0d466a5b1d8f3f88b289a75824070398e6"} Apr 24 21:26:49.457670 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:49.457626 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4" podStartSLOduration=8.891111787 podStartE2EDuration="26.457611529s" podCreationTimestamp="2026-04-24 21:26:23 +0000 UTC" firstStartedPulling="2026-04-24 21:26:24.488906915 +0000 UTC m=+1.892169889" lastFinishedPulling="2026-04-24 21:26:42.055406643 +0000 UTC m=+19.458669631" observedRunningTime="2026-04-24 21:26:49.457268556 +0000 UTC m=+26.860531575" watchObservedRunningTime="2026-04-24 21:26:49.457611529 +0000 UTC m=+26.860874522" Apr 24 21:26:49.777107 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:49.777080 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dlzdh"] Apr 24 21:26:49.777228 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:49.777195 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh" Apr 24 21:26:49.777329 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:49.777303 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7" Apr 24 21:26:49.780020 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:49.779997 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8cswd"] Apr 24 21:26:49.780149 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:49.780114 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:49.780230 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:49.780207 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a" Apr 24 21:26:49.782724 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:49.782703 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nz6dq"] Apr 24 21:26:49.782826 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:49.782813 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:49.782958 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:49.782923 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33" Apr 24 21:26:50.430740 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:50.430706 2575 generic.go:358] "Generic (PLEG): container finished" podID="8de213de-0f31-4d5f-9d53-9ba716ac7760" containerID="de69708a8a34b91bae685aa4a6094a766b49e8786e11f04a5f2259f983365a8c" exitCode=0 Apr 24 21:26:50.431115 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:50.430795 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhmlv" event={"ID":"8de213de-0f31-4d5f-9d53-9ba716ac7760","Type":"ContainerDied","Data":"de69708a8a34b91bae685aa4a6094a766b49e8786e11f04a5f2259f983365a8c"} Apr 24 21:26:51.215113 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:51.215059 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh" Apr 24 21:26:51.215300 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:51.215176 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:51.215300 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:51.215195 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:51.215300 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:51.215193 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7" Apr 24 21:26:51.215300 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:51.215272 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a" Apr 24 21:26:51.215515 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:51.215376 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33" Apr 24 21:26:51.434428 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:51.434388 2575 generic.go:358] "Generic (PLEG): container finished" podID="8de213de-0f31-4d5f-9d53-9ba716ac7760" containerID="84a07faa41b089a9f462b83bcc779929914cfe3a97efe66ceb6bc65fb9edca77" exitCode=0 Apr 24 21:26:51.435092 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:51.434452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhmlv" event={"ID":"8de213de-0f31-4d5f-9d53-9ba716ac7760","Type":"ContainerDied","Data":"84a07faa41b089a9f462b83bcc779929914cfe3a97efe66ceb6bc65fb9edca77"} Apr 24 21:26:53.216129 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:53.216100 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:53.216735 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:53.216214 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33" Apr 24 21:26:53.216735 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:53.216239 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:53.216735 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:53.216347 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8cswd" podUID="42ee90c2-09f5-4464-a75c-62352a375c5a" Apr 24 21:26:53.216735 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:53.216373 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh" Apr 24 21:26:53.216735 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:53.216463 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dlzdh" podUID="d9049978-94b6-422d-bab2-7c826163ffc7" Apr 24 21:26:54.905147 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:54.904969 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-58.ec2.internal" event="NodeReady" Apr 24 21:26:54.905521 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:54.905282 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:26:54.954014 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:54.953946 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7df64cf987-gz6g4"] Apr 24 21:26:54.971694 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:54.971669 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4nlnm"] Apr 24 21:26:54.971819 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:54.971704 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:54.975384 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:54.975133 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:26:54.975384 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:54.975206 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:26:54.976845 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:54.976821 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:26:54.977188 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:54.977166 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qvff9\"" Apr 24 21:26:54.981804 
ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:54.981782 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:26:55.001754 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.001732 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g"] Apr 24 21:26:55.001876 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.001861 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4nlnm" Apr 24 21:26:55.004194 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.004175 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:26:55.004454 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.004436 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:26:55.004547 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.004500 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-qzvnc\"" Apr 24 21:26:55.029466 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.029433 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7df64cf987-gz6g4"] Apr 24 21:26:55.029466 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.029463 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g"] Apr 24 21:26:55.029466 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.029472 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4nlnm"] Apr 24 21:26:55.029684 ip-10-0-131-58 kubenswrapper[2575]: I0424 
21:26:55.029487 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6hc5g"] Apr 24 21:26:55.029684 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.029575 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" Apr 24 21:26:55.032511 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.032494 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 21:26:55.033016 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.032997 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 21:26:55.033273 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.033241 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6p9hq\"" Apr 24 21:26:55.056799 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.056777 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lrxx8"] Apr 24 21:26:55.056982 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.056964 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6hc5g" Apr 24 21:26:55.059151 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.059133 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqjgq\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-kube-api-access-bqjgq\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.059252 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.059159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-installation-pull-secrets\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.059252 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.059179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63a472ed-761c-4bea-b845-8b5b620d07ab-ca-trust-extracted\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.059252 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.059199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-bound-sa-token\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.059252 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.059222 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-image-registry-private-configuration\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.059252 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.059246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-trusted-ca\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.059504 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.059322 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.059504 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.059371 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-certificates\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.060484 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.060464 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:26:55.061203 
ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.061152 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bv4ts\"" Apr 24 21:26:55.061203 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.061181 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:26:55.061371 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.061247 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:26:55.080006 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.079985 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6hc5g"] Apr 24 21:26:55.080006 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.080012 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lrxx8"] Apr 24 21:26:55.080177 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.080154 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.082496 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.082447 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:26:55.082966 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.082946 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:26:55.083067 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.082970 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zkwd8\"" Apr 24 21:26:55.160530 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.160498 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.160700 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.160537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-certificates\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.160700 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.160566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq2r5\" (UniqueName: \"kubernetes.io/projected/c877e9e0-0af6-4ded-a68a-810fa0ab4f8e-kube-api-access-jq2r5\") pod \"network-check-source-8894fc9bd-4nlnm\" (UID: \"c877e9e0-0af6-4ded-a68a-810fa0ab4f8e\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4nlnm" Apr 24 21:26:55.160700 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.160595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s5lw\" (UniqueName: \"kubernetes.io/projected/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-kube-api-access-7s5lw\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g" Apr 24 21:26:55.160700 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.160678 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" Apr 24 21:26:55.160868 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.160711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g" Apr 24 21:26:55.160868 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.160744 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:26:55.160868 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.160763 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df64cf987-gz6g4: secret "image-registry-tls" not found Apr 24 21:26:55.160868 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.160801 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bqjgq\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-kube-api-access-bqjgq\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.161067 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.160881 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls podName:63a472ed-761c-4bea-b845-8b5b620d07ab nodeName:}" failed. No retries permitted until 2026-04-24 21:26:55.660815935 +0000 UTC m=+33.064078908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls") pod "image-registry-7df64cf987-gz6g4" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab") : secret "image-registry-tls" not found Apr 24 21:26:55.161067 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.160924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-installation-pull-secrets\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.161067 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.160952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63a472ed-761c-4bea-b845-8b5b620d07ab-ca-trust-extracted\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.161067 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.160978 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-bound-sa-token\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.161067 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.161006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c652c05d-6547-4c43-a295-52b3275ef5e0-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" Apr 24 21:26:55.161067 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.161059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-image-registry-private-configuration\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.161385 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.161086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-trusted-ca\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.161385 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.161345 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-certificates\") pod \"image-registry-7df64cf987-gz6g4\" (UID: 
\"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.161385 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.161350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63a472ed-761c-4bea-b845-8b5b620d07ab-ca-trust-extracted\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.161901 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.161878 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-trusted-ca\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.175531 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.175478 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-installation-pull-secrets\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.175531 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.175478 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-image-registry-private-configuration\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.177461 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.177441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-bound-sa-token\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.177566 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.177527 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqjgq\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-kube-api-access-bqjgq\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.215647 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.215577 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh" Apr 24 21:26:55.215772 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.215580 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:55.215838 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.215580 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:55.230528 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.230508 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:26:55.230776 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.230757 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vlmsd\"" Apr 24 21:26:55.230870 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.230804 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:26:55.230928 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.230868 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c9whs\"" Apr 24 21:26:55.261866 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.261826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g" Apr 24 21:26:55.262004 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.261939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-config-volume\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.262004 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.261964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.262004 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.261971 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:26:55.262004 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.261988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-tmp-dir\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.262199 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.262025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c652c05d-6547-4c43-a295-52b3275ef5e0-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" Apr 24 21:26:55.262199 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.262038 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert podName:7eb5a6c5-93dc-4c4b-a6ab-b457966b4540 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:55.762019223 +0000 UTC m=+33.165282205 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert") pod "ingress-canary-6hc5g" (UID: "7eb5a6c5-93dc-4c4b-a6ab-b457966b4540") : secret "canary-serving-cert" not found Apr 24 21:26:55.262199 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.262089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq2r5\" (UniqueName: \"kubernetes.io/projected/c877e9e0-0af6-4ded-a68a-810fa0ab4f8e-kube-api-access-jq2r5\") pod \"network-check-source-8894fc9bd-4nlnm\" (UID: \"c877e9e0-0af6-4ded-a68a-810fa0ab4f8e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4nlnm" Apr 24 21:26:55.262199 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.262117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7s5lw\" (UniqueName: \"kubernetes.io/projected/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-kube-api-access-7s5lw\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g" Apr 24 21:26:55.262199 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.262161 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqgz\" (UniqueName: \"kubernetes.io/projected/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-kube-api-access-fjqgz\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.262199 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.262192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" Apr 24 21:26:55.262493 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.262317 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:26:55.262493 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.262374 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert podName:c652c05d-6547-4c43-a295-52b3275ef5e0 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:55.762356613 +0000 UTC m=+33.165619587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mdm8g" (UID: "c652c05d-6547-4c43-a295-52b3275ef5e0") : secret "networking-console-plugin-cert" not found Apr 24 21:26:55.262773 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.262722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c652c05d-6547-4c43-a295-52b3275ef5e0-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" Apr 24 21:26:55.278738 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.278710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq2r5\" (UniqueName: \"kubernetes.io/projected/c877e9e0-0af6-4ded-a68a-810fa0ab4f8e-kube-api-access-jq2r5\") pod \"network-check-source-8894fc9bd-4nlnm\" (UID: \"c877e9e0-0af6-4ded-a68a-810fa0ab4f8e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4nlnm" Apr 24 21:26:55.278856 ip-10-0-131-58 kubenswrapper[2575]: I0424 
21:26:55.278710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s5lw\" (UniqueName: \"kubernetes.io/projected/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-kube-api-access-7s5lw\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g" Apr 24 21:26:55.310714 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.310683 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4nlnm" Apr 24 21:26:55.362934 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.362898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-config-volume\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.363090 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.363040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.363090 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.363075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-tmp-dir\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.363215 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.363140 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:26:55.363215 ip-10-0-131-58 kubenswrapper[2575]: E0424 
21:26:55.363212 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls podName:5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:55.863189323 +0000 UTC m=+33.266452297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls") pod "dns-default-lrxx8" (UID: "5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33") : secret "dns-default-metrics-tls" not found Apr 24 21:26:55.363721 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.363455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjqgz\" (UniqueName: \"kubernetes.io/projected/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-kube-api-access-fjqgz\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.363721 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.363529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-config-volume\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.375469 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.375444 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-tmp-dir\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.377846 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.377826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjqgz\" (UniqueName: 
\"kubernetes.io/projected/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-kube-api-access-fjqgz\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.666083 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.666039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:55.666281 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.666212 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:26:55.666281 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.666235 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df64cf987-gz6g4: secret "image-registry-tls" not found Apr 24 21:26:55.666361 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.666324 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls podName:63a472ed-761c-4bea-b845-8b5b620d07ab nodeName:}" failed. No retries permitted until 2026-04-24 21:26:56.666307841 +0000 UTC m=+34.069570817 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls") pod "image-registry-7df64cf987-gz6g4" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab") : secret "image-registry-tls" not found Apr 24 21:26:55.767252 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.767219 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" Apr 24 21:26:55.767252 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.767276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g" Apr 24 21:26:55.767493 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.767391 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:26:55.767493 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.767431 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:26:55.767493 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.767470 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert podName:c652c05d-6547-4c43-a295-52b3275ef5e0 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:56.76745033 +0000 UTC m=+34.170713312 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mdm8g" (UID: "c652c05d-6547-4c43-a295-52b3275ef5e0") : secret "networking-console-plugin-cert" not found Apr 24 21:26:55.767493 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.767488 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert podName:7eb5a6c5-93dc-4c4b-a6ab-b457966b4540 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:56.767479009 +0000 UTC m=+34.170741981 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert") pod "ingress-canary-6hc5g" (UID: "7eb5a6c5-93dc-4c4b-a6ab-b457966b4540") : secret "canary-serving-cert" not found Apr 24 21:26:55.868520 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.868484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:26:55.868671 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.868545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:55.868671 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.868653 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:26:55.868759 ip-10-0-131-58 
kubenswrapper[2575]: E0424 21:26:55.868677 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:26:55.868759 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.868717 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs podName:3ad3b358-912b-477a-8fc3-6f2910580c33 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:27.868701284 +0000 UTC m=+65.271964259 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs") pod "network-metrics-daemon-nz6dq" (UID: "3ad3b358-912b-477a-8fc3-6f2910580c33") : secret "metrics-daemon-secret" not found Apr 24 21:26:55.868759 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:55.868731 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls podName:5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:56.8687262 +0000 UTC m=+34.271989172 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls") pod "dns-default-lrxx8" (UID: "5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33") : secret "dns-default-metrics-tls" not found Apr 24 21:26:55.969735 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.969660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7kv\" (UniqueName: \"kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv\") pod \"network-check-target-8cswd\" (UID: \"42ee90c2-09f5-4464-a75c-62352a375c5a\") " pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:55.972294 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:55.972274 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7kv\" (UniqueName: \"kubernetes.io/projected/42ee90c2-09f5-4464-a75c-62352a375c5a-kube-api-access-xc7kv\") pod \"network-check-target-8cswd\" (UID: \"42ee90c2-09f5-4464-a75c-62352a375c5a\") " pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:56.139534 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:56.139504 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8cswd" Apr 24 21:26:56.675856 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:56.675809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:56.676056 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:56.675929 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:26:56.676056 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:56.675941 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df64cf987-gz6g4: secret "image-registry-tls" not found Apr 24 21:26:56.676056 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:56.676004 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls podName:63a472ed-761c-4bea-b845-8b5b620d07ab nodeName:}" failed. No retries permitted until 2026-04-24 21:26:58.675984942 +0000 UTC m=+36.079247935 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls") pod "image-registry-7df64cf987-gz6g4" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab") : secret "image-registry-tls" not found Apr 24 21:26:56.776150 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:56.776111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" Apr 24 21:26:56.776150 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:56.776156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g" Apr 24 21:26:56.776405 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:56.776298 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:26:56.776405 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:56.776351 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:26:56.776405 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:56.776376 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert podName:c652c05d-6547-4c43-a295-52b3275ef5e0 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:58.776355797 +0000 UTC m=+36.179618788 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mdm8g" (UID: "c652c05d-6547-4c43-a295-52b3275ef5e0") : secret "networking-console-plugin-cert" not found Apr 24 21:26:56.776576 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:56.776413 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert podName:7eb5a6c5-93dc-4c4b-a6ab-b457966b4540 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:58.776398686 +0000 UTC m=+36.179661656 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert") pod "ingress-canary-6hc5g" (UID: "7eb5a6c5-93dc-4c4b-a6ab-b457966b4540") : secret "canary-serving-cert" not found Apr 24 21:26:56.877577 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:56.877544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:26:56.877739 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:56.877719 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:26:56.877807 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:56.877795 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls podName:5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:58.877775724 +0000 UTC m=+36.281038699 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls") pod "dns-default-lrxx8" (UID: "5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33") : secret "dns-default-metrics-tls" not found Apr 24 21:26:57.371661 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:57.371632 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8cswd"] Apr 24 21:26:57.375136 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:57.375108 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4nlnm"] Apr 24 21:26:57.462478 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:57.462409 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ee90c2_09f5_4464_a75c_62352a375c5a.slice/crio-5e238f4380b3ec00bfd39c8a9bb144bd43d2e5a3348fcc428a51389424fab1c2 WatchSource:0}: Error finding container 5e238f4380b3ec00bfd39c8a9bb144bd43d2e5a3348fcc428a51389424fab1c2: Status 404 returned error can't find the container with id 5e238f4380b3ec00bfd39c8a9bb144bd43d2e5a3348fcc428a51389424fab1c2 Apr 24 21:26:57.462921 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:26:57.462883 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc877e9e0_0af6_4ded_a68a_810fa0ab4f8e.slice/crio-90cdaa9273ed3023d72e319e738e7366e58622cd252313bf37bdeda2a99945b1 WatchSource:0}: Error finding container 90cdaa9273ed3023d72e319e738e7366e58622cd252313bf37bdeda2a99945b1: Status 404 returned error can't find the container with id 90cdaa9273ed3023d72e319e738e7366e58622cd252313bf37bdeda2a99945b1 Apr 24 21:26:58.450831 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:58.450529 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8cswd" 
event={"ID":"42ee90c2-09f5-4464-a75c-62352a375c5a","Type":"ContainerStarted","Data":"5e238f4380b3ec00bfd39c8a9bb144bd43d2e5a3348fcc428a51389424fab1c2"} Apr 24 21:26:58.451918 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:58.451878 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4nlnm" event={"ID":"c877e9e0-0af6-4ded-a68a-810fa0ab4f8e","Type":"ContainerStarted","Data":"90cdaa9273ed3023d72e319e738e7366e58622cd252313bf37bdeda2a99945b1"} Apr 24 21:26:58.454994 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:58.454965 2575 generic.go:358] "Generic (PLEG): container finished" podID="8de213de-0f31-4d5f-9d53-9ba716ac7760" containerID="fbbddf5f02e73a4b8f394b2a0909eb0faa5d2945d71f17c18ef814588e7694f9" exitCode=0 Apr 24 21:26:58.455109 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:58.455016 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhmlv" event={"ID":"8de213de-0f31-4d5f-9d53-9ba716ac7760","Type":"ContainerDied","Data":"fbbddf5f02e73a4b8f394b2a0909eb0faa5d2945d71f17c18ef814588e7694f9"} Apr 24 21:26:58.693485 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:58.693452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" Apr 24 21:26:58.693637 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:58.693604 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:26:58.693637 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:58.693624 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df64cf987-gz6g4: secret 
"image-registry-tls" not found
Apr 24 21:26:58.693760 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:58.693676 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls podName:63a472ed-761c-4bea-b845-8b5b620d07ab nodeName:}" failed. No retries permitted until 2026-04-24 21:27:02.693662285 +0000 UTC m=+40.096925256 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls") pod "image-registry-7df64cf987-gz6g4" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab") : secret "image-registry-tls" not found
Apr 24 21:26:58.794047 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:58.793966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g"
Apr 24 21:26:58.794047 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:58.794021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g"
Apr 24 21:26:58.794287 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:58.794062 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:26:58.794287 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:58.794121 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:26:58.794287 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:58.794126 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert podName:c652c05d-6547-4c43-a295-52b3275ef5e0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:02.794108783 +0000 UTC m=+40.197371758 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mdm8g" (UID: "c652c05d-6547-4c43-a295-52b3275ef5e0") : secret "networking-console-plugin-cert" not found
Apr 24 21:26:58.794287 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:58.794192 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert podName:7eb5a6c5-93dc-4c4b-a6ab-b457966b4540 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:02.794175336 +0000 UTC m=+40.197438321 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert") pod "ingress-canary-6hc5g" (UID: "7eb5a6c5-93dc-4c4b-a6ab-b457966b4540") : secret "canary-serving-cert" not found
Apr 24 21:26:58.895038 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:58.894998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8"
Apr 24 21:26:58.895204 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:58.895182 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:26:58.895288 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:26:58.895236 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls podName:5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:02.895222125 +0000 UTC m=+40.298485096 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls") pod "dns-default-lrxx8" (UID: "5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33") : secret "dns-default-metrics-tls" not found
Apr 24 21:26:59.459962 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:59.459930 2575 generic.go:358] "Generic (PLEG): container finished" podID="8de213de-0f31-4d5f-9d53-9ba716ac7760" containerID="376f19acd7b29ecdf8adc9d40ddba9da716eac101706c08693b6a7e2dd651709" exitCode=0
Apr 24 21:26:59.460631 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:59.459981 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhmlv" event={"ID":"8de213de-0f31-4d5f-9d53-9ba716ac7760","Type":"ContainerDied","Data":"376f19acd7b29ecdf8adc9d40ddba9da716eac101706c08693b6a7e2dd651709"}
Apr 24 21:26:59.904447 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:59.904409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:26:59.908422 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:26:59.908397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d9049978-94b6-422d-bab2-7c826163ffc7-original-pull-secret\") pod \"global-pull-secret-syncer-dlzdh\" (UID: \"d9049978-94b6-422d-bab2-7c826163ffc7\") " pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:27:00.026392 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:00.026345 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dlzdh"
Apr 24 21:27:00.583500 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:00.583470 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dlzdh"]
Apr 24 21:27:00.647515 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:27:00.647473 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9049978_94b6_422d_bab2_7c826163ffc7.slice/crio-2aa8be0d0af64af8b555109ae588aefee1473acfd175104f23d526cdf28e9fd3 WatchSource:0}: Error finding container 2aa8be0d0af64af8b555109ae588aefee1473acfd175104f23d526cdf28e9fd3: Status 404 returned error can't find the container with id 2aa8be0d0af64af8b555109ae588aefee1473acfd175104f23d526cdf28e9fd3
Apr 24 21:27:01.466662 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:01.466621 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8cswd" event={"ID":"42ee90c2-09f5-4464-a75c-62352a375c5a","Type":"ContainerStarted","Data":"a2e2e82fd7fb0d246c6059e43b1e5045558aa251e154b6a88302981fc9f45eb7"}
Apr 24 21:27:01.466853 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:01.466744 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:27:01.468142 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:01.468109 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4nlnm" event={"ID":"c877e9e0-0af6-4ded-a68a-810fa0ab4f8e","Type":"ContainerStarted","Data":"1ffe686debfa243419beeb369dba8f9957ee32cbfe05c6668e68e6a567cc2fd5"}
Apr 24 21:27:01.471735 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:01.471704 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fhmlv" event={"ID":"8de213de-0f31-4d5f-9d53-9ba716ac7760","Type":"ContainerStarted","Data":"37c266805fe93a5d4ca4721712d8ca08f85cbe36fc2a8312df91977a167e2966"}
Apr 24 21:27:01.472770 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:01.472737 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dlzdh" event={"ID":"d9049978-94b6-422d-bab2-7c826163ffc7","Type":"ContainerStarted","Data":"2aa8be0d0af64af8b555109ae588aefee1473acfd175104f23d526cdf28e9fd3"}
Apr 24 21:27:01.483496 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:01.483456 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-8cswd" podStartSLOduration=35.103847138 podStartE2EDuration="38.483443513s" podCreationTimestamp="2026-04-24 21:26:23 +0000 UTC" firstStartedPulling="2026-04-24 21:26:57.473124316 +0000 UTC m=+34.876387292" lastFinishedPulling="2026-04-24 21:27:00.852720695 +0000 UTC m=+38.255983667" observedRunningTime="2026-04-24 21:27:01.48241761 +0000 UTC m=+38.885680643" watchObservedRunningTime="2026-04-24 21:27:01.483443513 +0000 UTC m=+38.886706505"
Apr 24 21:27:01.499822 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:01.499768 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4nlnm" podStartSLOduration=31.302200652 podStartE2EDuration="34.49974866s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="2026-04-24 21:26:57.473148065 +0000 UTC m=+34.876411039" lastFinishedPulling="2026-04-24 21:27:00.670696077 +0000 UTC m=+38.073959047" observedRunningTime="2026-04-24 21:27:01.498395143 +0000 UTC m=+38.901658163" watchObservedRunningTime="2026-04-24 21:27:01.49974866 +0000 UTC m=+38.903011655"
Apr 24 21:27:01.523394 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:01.523352 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fhmlv" podStartSLOduration=5.508733715 podStartE2EDuration="38.52333856s" podCreationTimestamp="2026-04-24 21:26:23 +0000 UTC" firstStartedPulling="2026-04-24 21:26:24.482437475 +0000 UTC m=+1.885700450" lastFinishedPulling="2026-04-24 21:26:57.497042325 +0000 UTC m=+34.900305295" observedRunningTime="2026-04-24 21:27:01.522284224 +0000 UTC m=+38.925547211" watchObservedRunningTime="2026-04-24 21:27:01.52333856 +0000 UTC m=+38.926601553"
Apr 24 21:27:02.727644 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:02.727609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4"
Apr 24 21:27:02.728019 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:02.727754 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:02.728019 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:02.727774 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df64cf987-gz6g4: secret "image-registry-tls" not found
Apr 24 21:27:02.728019 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:02.727838 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls podName:63a472ed-761c-4bea-b845-8b5b620d07ab nodeName:}" failed. No retries permitted until 2026-04-24 21:27:10.72782302 +0000 UTC m=+48.131085991 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls") pod "image-registry-7df64cf987-gz6g4" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab") : secret "image-registry-tls" not found
Apr 24 21:27:02.828155 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:02.828115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g"
Apr 24 21:27:02.828340 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:02.828163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g"
Apr 24 21:27:02.828340 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:02.828269 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:27:02.828433 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:02.828339 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:02.828433 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:02.828350 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert podName:c652c05d-6547-4c43-a295-52b3275ef5e0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:10.828330144 +0000 UTC m=+48.231593132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mdm8g" (UID: "c652c05d-6547-4c43-a295-52b3275ef5e0") : secret "networking-console-plugin-cert" not found
Apr 24 21:27:02.828433 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:02.828416 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert podName:7eb5a6c5-93dc-4c4b-a6ab-b457966b4540 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:10.828399015 +0000 UTC m=+48.231661987 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert") pod "ingress-canary-6hc5g" (UID: "7eb5a6c5-93dc-4c4b-a6ab-b457966b4540") : secret "canary-serving-cert" not found
Apr 24 21:27:02.928823 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:02.928787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8"
Apr 24 21:27:02.928980 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:02.928909 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:02.929018 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:02.928984 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls podName:5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:10.928966867 +0000 UTC m=+48.332229851 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls") pod "dns-default-lrxx8" (UID: "5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:04.479821 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:04.479788 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dlzdh" event={"ID":"d9049978-94b6-422d-bab2-7c826163ffc7","Type":"ContainerStarted","Data":"274c313f23267ef80f58e4f9ea5c61c98e75ed3f9498d8d5d3b8dda666464e44"}
Apr 24 21:27:04.496931 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:04.496864 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dlzdh" podStartSLOduration=32.804985091 podStartE2EDuration="36.496848994s" podCreationTimestamp="2026-04-24 21:26:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:00.659421759 +0000 UTC m=+38.062684735" lastFinishedPulling="2026-04-24 21:27:04.351285652 +0000 UTC m=+41.754548638" observedRunningTime="2026-04-24 21:27:04.496324407 +0000 UTC m=+41.899587405" watchObservedRunningTime="2026-04-24 21:27:04.496848994 +0000 UTC m=+41.900112017"
Apr 24 21:27:10.783474 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:10.783428 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4"
Apr 24 21:27:10.783878 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:10.783580 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:10.783878 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:10.783601 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df64cf987-gz6g4: secret "image-registry-tls" not found
Apr 24 21:27:10.783878 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:10.783657 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls podName:63a472ed-761c-4bea-b845-8b5b620d07ab nodeName:}" failed. No retries permitted until 2026-04-24 21:27:26.783641639 +0000 UTC m=+64.186904610 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls") pod "image-registry-7df64cf987-gz6g4" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab") : secret "image-registry-tls" not found
Apr 24 21:27:10.884513 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:10.884469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g"
Apr 24 21:27:10.884513 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:10.884515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g"
Apr 24 21:27:10.884661 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:10.884618 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:27:10.884691 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:10.884679 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:10.884723 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:10.884681 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert podName:c652c05d-6547-4c43-a295-52b3275ef5e0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:26.884662288 +0000 UTC m=+64.287925260 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mdm8g" (UID: "c652c05d-6547-4c43-a295-52b3275ef5e0") : secret "networking-console-plugin-cert" not found
Apr 24 21:27:10.884764 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:10.884733 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert podName:7eb5a6c5-93dc-4c4b-a6ab-b457966b4540 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:26.884722026 +0000 UTC m=+64.287985000 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert") pod "ingress-canary-6hc5g" (UID: "7eb5a6c5-93dc-4c4b-a6ab-b457966b4540") : secret "canary-serving-cert" not found
Apr 24 21:27:10.985864 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:10.985831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8"
Apr 24 21:27:10.986007 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:10.985988 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:10.986058 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:10.986049 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls podName:5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:26.98603385 +0000 UTC m=+64.389296824 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls") pod "dns-default-lrxx8" (UID: "5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:20.445482 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:20.445452 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-78cw4"
Apr 24 21:27:26.795790 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:26.795750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4"
Apr 24 21:27:26.796184 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:26.795898 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:26.796184 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:26.795916 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df64cf987-gz6g4: secret "image-registry-tls" not found
Apr 24 21:27:26.796184 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:26.795973 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls podName:63a472ed-761c-4bea-b845-8b5b620d07ab nodeName:}" failed. No retries permitted until 2026-04-24 21:27:58.795954484 +0000 UTC m=+96.199217455 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls") pod "image-registry-7df64cf987-gz6g4" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab") : secret "image-registry-tls" not found
Apr 24 21:27:26.896208 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:26.896170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g"
Apr 24 21:27:26.896208 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:26.896211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g"
Apr 24 21:27:26.896461 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:26.896324 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:27:26.896461 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:26.896360 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:26.896461 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:26.896390 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert podName:c652c05d-6547-4c43-a295-52b3275ef5e0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:58.896376242 +0000 UTC m=+96.299639213 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mdm8g" (UID: "c652c05d-6547-4c43-a295-52b3275ef5e0") : secret "networking-console-plugin-cert" not found
Apr 24 21:27:26.896461 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:26.896405 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert podName:7eb5a6c5-93dc-4c4b-a6ab-b457966b4540 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:58.896398555 +0000 UTC m=+96.299661526 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert") pod "ingress-canary-6hc5g" (UID: "7eb5a6c5-93dc-4c4b-a6ab-b457966b4540") : secret "canary-serving-cert" not found
Apr 24 21:27:26.997211 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:26.997181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8"
Apr 24 21:27:26.997374 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:26.997293 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:26.997374 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:26.997342 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls podName:5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:58.997329462 +0000 UTC m=+96.400592437 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls") pod "dns-default-lrxx8" (UID: "5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:27.905769 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:27.905726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:27:27.906138 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:27.905865 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:27:27.906138 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:27.905929 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs podName:3ad3b358-912b-477a-8fc3-6f2910580c33 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:31.905912938 +0000 UTC m=+129.309175909 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs") pod "network-metrics-daemon-nz6dq" (UID: "3ad3b358-912b-477a-8fc3-6f2910580c33") : secret "metrics-daemon-secret" not found
Apr 24 21:27:32.477521 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:32.477488 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-8cswd"
Apr 24 21:27:58.834030 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:58.833991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4"
Apr 24 21:27:58.834522 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:58.834099 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:58.834522 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:58.834111 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df64cf987-gz6g4: secret "image-registry-tls" not found
Apr 24 21:27:58.834522 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:58.834167 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls podName:63a472ed-761c-4bea-b845-8b5b620d07ab nodeName:}" failed. No retries permitted until 2026-04-24 21:29:02.834153114 +0000 UTC m=+160.237416090 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls") pod "image-registry-7df64cf987-gz6g4" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab") : secret "image-registry-tls" not found
Apr 24 21:27:58.935016 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:58.934980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g"
Apr 24 21:27:58.935016 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:58.935017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g"
Apr 24 21:27:58.935178 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:58.935102 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:58.935178 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:58.935142 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:27:58.935248 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:58.935148 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert podName:7eb5a6c5-93dc-4c4b-a6ab-b457966b4540 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:02.935135216 +0000 UTC m=+160.338398187 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert") pod "ingress-canary-6hc5g" (UID: "7eb5a6c5-93dc-4c4b-a6ab-b457966b4540") : secret "canary-serving-cert" not found
Apr 24 21:27:58.935248 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:58.935237 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert podName:c652c05d-6547-4c43-a295-52b3275ef5e0 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:02.935220971 +0000 UTC m=+160.338483948 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mdm8g" (UID: "c652c05d-6547-4c43-a295-52b3275ef5e0") : secret "networking-console-plugin-cert" not found
Apr 24 21:27:59.035776 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:27:59.035742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8"
Apr 24 21:27:59.035896 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:59.035845 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:59.035939 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:27:59.035896 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls podName:5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:03.035882015 +0000 UTC m=+160.439144990 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls") pod "dns-default-lrxx8" (UID: "5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:31.981612 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:31.981574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:28:31.982076 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:31.981685 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:28:31.982076 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:31.981741 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs podName:3ad3b358-912b-477a-8fc3-6f2910580c33 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:33.981728868 +0000 UTC m=+251.384991843 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs") pod "network-metrics-daemon-nz6dq" (UID: "3ad3b358-912b-477a-8fc3-6f2910580c33") : secret "metrics-daemon-secret" not found
Apr 24 21:28:53.965300 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:53.965246 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx"]
Apr 24 21:28:53.968064 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:53.968041 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx"
Apr 24 21:28:53.972213 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:53.972191 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 21:28:53.972437 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:53.972422 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 21:28:53.972774 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:53.972747 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:28:53.977108 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:53.977088 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-zffmk\""
Apr 24 21:28:53.996545 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:53.996522 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx"]
Apr 24 21:28:54.041696 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.041667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llbmb\" (UniqueName: \"kubernetes.io/projected/42fc120d-c79e-43df-9fb2-d09911667b69-kube-api-access-llbmb\") pod \"cluster-samples-operator-6dc5bdb6b4-kkdhx\" (UID: \"42fc120d-c79e-43df-9fb2-d09911667b69\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx"
Apr 24 21:28:54.041843 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.041719 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kkdhx\" (UID: \"42fc120d-c79e-43df-9fb2-d09911667b69\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx"
Apr 24 21:28:54.068018 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.067993 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mksv7"]
Apr 24 21:28:54.070713 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.070693 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7"
Apr 24 21:28:54.073002 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.072980 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 21:28:54.073002 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.072991 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 21:28:54.073239 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.073078 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 21:28:54.073239 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.073095 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-rlkm6\""
Apr 24 21:28:54.073364 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.073350 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:28:54.078919 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.078902 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr
24 21:28:54.079121 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.079099 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mksv7"] Apr 24 21:28:54.142817 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.142784 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llbmb\" (UniqueName: \"kubernetes.io/projected/42fc120d-c79e-43df-9fb2-d09911667b69-kube-api-access-llbmb\") pod \"cluster-samples-operator-6dc5bdb6b4-kkdhx\" (UID: \"42fc120d-c79e-43df-9fb2-d09911667b69\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx" Apr 24 21:28:54.142951 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.142819 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/057fa386-6c87-478a-91d9-c2293ba0617c-config\") pod \"console-operator-9d4b6777b-mksv7\" (UID: \"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.142951 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.142851 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/057fa386-6c87-478a-91d9-c2293ba0617c-trusted-ca\") pod \"console-operator-9d4b6777b-mksv7\" (UID: \"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.142951 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.142913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kkdhx\" (UID: \"42fc120d-c79e-43df-9fb2-d09911667b69\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx" Apr 24 21:28:54.142951 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.142937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/057fa386-6c87-478a-91d9-c2293ba0617c-serving-cert\") pod \"console-operator-9d4b6777b-mksv7\" (UID: \"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.143089 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.142979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xvph\" (UniqueName: \"kubernetes.io/projected/057fa386-6c87-478a-91d9-c2293ba0617c-kube-api-access-9xvph\") pod \"console-operator-9d4b6777b-mksv7\" (UID: \"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.143089 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:54.143017 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:28:54.143089 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:54.143074 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls podName:42fc120d-c79e-43df-9fb2-d09911667b69 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:54.643061068 +0000 UTC m=+152.046324039 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kkdhx" (UID: "42fc120d-c79e-43df-9fb2-d09911667b69") : secret "samples-operator-tls" not found Apr 24 21:28:54.151716 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.151695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llbmb\" (UniqueName: \"kubernetes.io/projected/42fc120d-c79e-43df-9fb2-d09911667b69-kube-api-access-llbmb\") pod \"cluster-samples-operator-6dc5bdb6b4-kkdhx\" (UID: \"42fc120d-c79e-43df-9fb2-d09911667b69\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx" Apr 24 21:28:54.184073 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.184046 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml"] Apr 24 21:28:54.186875 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.186858 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" Apr 24 21:28:54.189974 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.189957 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 21:28:54.190068 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.190035 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 21:28:54.190068 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.190041 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:28:54.190158 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.190106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-sns7m\"" Apr 24 21:28:54.191185 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.191172 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 21:28:54.198649 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.198633 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml"] Apr 24 21:28:54.244074 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.244012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l52dr\" (UniqueName: \"kubernetes.io/projected/1ce69927-aa5a-4504-8d79-b16c4a102d54-kube-api-access-l52dr\") pod \"service-ca-operator-d6fc45fc5-zw7ml\" (UID: \"1ce69927-aa5a-4504-8d79-b16c4a102d54\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" Apr 24 21:28:54.244074 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.244041 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/057fa386-6c87-478a-91d9-c2293ba0617c-config\") pod \"console-operator-9d4b6777b-mksv7\" (UID: \"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.244074 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.244060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/057fa386-6c87-478a-91d9-c2293ba0617c-trusted-ca\") pod \"console-operator-9d4b6777b-mksv7\" (UID: \"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.244369 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.244102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/057fa386-6c87-478a-91d9-c2293ba0617c-serving-cert\") pod \"console-operator-9d4b6777b-mksv7\" (UID: \"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.244369 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.244139 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce69927-aa5a-4504-8d79-b16c4a102d54-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zw7ml\" (UID: \"1ce69927-aa5a-4504-8d79-b16c4a102d54\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" Apr 24 21:28:54.244369 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.244166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xvph\" (UniqueName: \"kubernetes.io/projected/057fa386-6c87-478a-91d9-c2293ba0617c-kube-api-access-9xvph\") pod \"console-operator-9d4b6777b-mksv7\" (UID: 
\"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.245040 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.244467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ce69927-aa5a-4504-8d79-b16c4a102d54-config\") pod \"service-ca-operator-d6fc45fc5-zw7ml\" (UID: \"1ce69927-aa5a-4504-8d79-b16c4a102d54\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" Apr 24 21:28:54.245226 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.245201 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/057fa386-6c87-478a-91d9-c2293ba0617c-config\") pod \"console-operator-9d4b6777b-mksv7\" (UID: \"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.247837 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.247063 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/057fa386-6c87-478a-91d9-c2293ba0617c-trusted-ca\") pod \"console-operator-9d4b6777b-mksv7\" (UID: \"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.249599 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.249580 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/057fa386-6c87-478a-91d9-c2293ba0617c-serving-cert\") pod \"console-operator-9d4b6777b-mksv7\" (UID: \"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.253831 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.253808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xvph\" (UniqueName: 
\"kubernetes.io/projected/057fa386-6c87-478a-91d9-c2293ba0617c-kube-api-access-9xvph\") pod \"console-operator-9d4b6777b-mksv7\" (UID: \"057fa386-6c87-478a-91d9-c2293ba0617c\") " pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.345183 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.345151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l52dr\" (UniqueName: \"kubernetes.io/projected/1ce69927-aa5a-4504-8d79-b16c4a102d54-kube-api-access-l52dr\") pod \"service-ca-operator-d6fc45fc5-zw7ml\" (UID: \"1ce69927-aa5a-4504-8d79-b16c4a102d54\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" Apr 24 21:28:54.345355 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.345207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce69927-aa5a-4504-8d79-b16c4a102d54-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zw7ml\" (UID: \"1ce69927-aa5a-4504-8d79-b16c4a102d54\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" Apr 24 21:28:54.345355 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.345225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ce69927-aa5a-4504-8d79-b16c4a102d54-config\") pod \"service-ca-operator-d6fc45fc5-zw7ml\" (UID: \"1ce69927-aa5a-4504-8d79-b16c4a102d54\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" Apr 24 21:28:54.345689 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.345671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ce69927-aa5a-4504-8d79-b16c4a102d54-config\") pod \"service-ca-operator-d6fc45fc5-zw7ml\" (UID: \"1ce69927-aa5a-4504-8d79-b16c4a102d54\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" Apr 24 21:28:54.347335 
ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.347313 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce69927-aa5a-4504-8d79-b16c4a102d54-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zw7ml\" (UID: \"1ce69927-aa5a-4504-8d79-b16c4a102d54\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" Apr 24 21:28:54.355320 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.355300 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l52dr\" (UniqueName: \"kubernetes.io/projected/1ce69927-aa5a-4504-8d79-b16c4a102d54-kube-api-access-l52dr\") pod \"service-ca-operator-d6fc45fc5-zw7ml\" (UID: \"1ce69927-aa5a-4504-8d79-b16c4a102d54\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" Apr 24 21:28:54.381231 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.381213 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" Apr 24 21:28:54.494017 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.493986 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mksv7"] Apr 24 21:28:54.495881 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.495862 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" Apr 24 21:28:54.497271 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:28:54.497221 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod057fa386_6c87_478a_91d9_c2293ba0617c.slice/crio-db3286f1c0e199958b593bda5e5e91a3229f292dcab4f5af1a917a7afd2a02b9 WatchSource:0}: Error finding container db3286f1c0e199958b593bda5e5e91a3229f292dcab4f5af1a917a7afd2a02b9: Status 404 returned error can't find the container with id db3286f1c0e199958b593bda5e5e91a3229f292dcab4f5af1a917a7afd2a02b9 Apr 24 21:28:54.609539 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.609509 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml"] Apr 24 21:28:54.612637 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:28:54.612606 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ce69927_aa5a_4504_8d79_b16c4a102d54.slice/crio-db93d46ce722c75c2f4c6616437b83a4defbb18feaaf4006ef028fe7b59a4b24 WatchSource:0}: Error finding container db93d46ce722c75c2f4c6616437b83a4defbb18feaaf4006ef028fe7b59a4b24: Status 404 returned error can't find the container with id db93d46ce722c75c2f4c6616437b83a4defbb18feaaf4006ef028fe7b59a4b24 Apr 24 21:28:54.648366 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.648336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kkdhx\" (UID: \"42fc120d-c79e-43df-9fb2-d09911667b69\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx" Apr 24 21:28:54.648507 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:54.648478 2575 secret.go:189] Couldn't 
get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:28:54.648558 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:54.648551 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls podName:42fc120d-c79e-43df-9fb2-d09911667b69 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:55.648534256 +0000 UTC m=+153.051797231 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kkdhx" (UID: "42fc120d-c79e-43df-9fb2-d09911667b69") : secret "samples-operator-tls" not found Apr 24 21:28:54.677520 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.677483 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" event={"ID":"1ce69927-aa5a-4504-8d79-b16c4a102d54","Type":"ContainerStarted","Data":"db93d46ce722c75c2f4c6616437b83a4defbb18feaaf4006ef028fe7b59a4b24"} Apr 24 21:28:54.678374 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:54.678355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" event={"ID":"057fa386-6c87-478a-91d9-c2293ba0617c","Type":"ContainerStarted","Data":"db3286f1c0e199958b593bda5e5e91a3229f292dcab4f5af1a917a7afd2a02b9"} Apr 24 21:28:55.655660 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:55.655620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kkdhx\" (UID: \"42fc120d-c79e-43df-9fb2-d09911667b69\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx" Apr 24 
21:28:55.656148 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:55.655775 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:28:55.656148 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:55.655856 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls podName:42fc120d-c79e-43df-9fb2-d09911667b69 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:57.655831847 +0000 UTC m=+155.059094832 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kkdhx" (UID: "42fc120d-c79e-43df-9fb2-d09911667b69") : secret "samples-operator-tls" not found Apr 24 21:28:57.671835 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:57.671803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kkdhx\" (UID: \"42fc120d-c79e-43df-9fb2-d09911667b69\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx" Apr 24 21:28:57.672216 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:57.671947 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:28:57.672216 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:57.672006 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls podName:42fc120d-c79e-43df-9fb2-d09911667b69 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:01.671991259 +0000 UTC m=+159.075254236 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kkdhx" (UID: "42fc120d-c79e-43df-9fb2-d09911667b69") : secret "samples-operator-tls" not found Apr 24 21:28:57.686158 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:57.686127 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" event={"ID":"1ce69927-aa5a-4504-8d79-b16c4a102d54","Type":"ContainerStarted","Data":"bb2ac1e927420dbfd8a5a2ead6e974a23ac3b86faaeb94468f1235ddb12f27b1"} Apr 24 21:28:57.687550 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:57.687528 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/0.log" Apr 24 21:28:57.687673 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:57.687570 2575 generic.go:358] "Generic (PLEG): container finished" podID="057fa386-6c87-478a-91d9-c2293ba0617c" containerID="9056a0a8a7ea2d833e5353874895cd48146b322c813f9dbca58ff997bdd0b1fe" exitCode=255 Apr 24 21:28:57.687673 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:57.687615 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" event={"ID":"057fa386-6c87-478a-91d9-c2293ba0617c","Type":"ContainerDied","Data":"9056a0a8a7ea2d833e5353874895cd48146b322c813f9dbca58ff997bdd0b1fe"} Apr 24 21:28:57.687789 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:57.687775 2575 scope.go:117] "RemoveContainer" containerID="9056a0a8a7ea2d833e5353874895cd48146b322c813f9dbca58ff997bdd0b1fe" Apr 24 21:28:57.704154 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:57.704116 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" 
podStartSLOduration=1.574131252 podStartE2EDuration="3.704105217s" podCreationTimestamp="2026-04-24 21:28:54 +0000 UTC" firstStartedPulling="2026-04-24 21:28:54.614364316 +0000 UTC m=+152.017627290" lastFinishedPulling="2026-04-24 21:28:56.744338271 +0000 UTC m=+154.147601255" observedRunningTime="2026-04-24 21:28:57.703752424 +0000 UTC m=+155.107015417" watchObservedRunningTime="2026-04-24 21:28:57.704105217 +0000 UTC m=+155.107368240" Apr 24 21:28:57.983041 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:57.982938 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" podUID="63a472ed-761c-4bea-b845-8b5b620d07ab" Apr 24 21:28:58.039476 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:58.039439 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" podUID="c652c05d-6547-4c43-a295-52b3275ef5e0" Apr 24 21:28:58.066352 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:58.066318 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6hc5g" podUID="7eb5a6c5-93dc-4c4b-a6ab-b457966b4540" Apr 24 21:28:58.088627 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:58.088600 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lrxx8" podUID="5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33" Apr 24 21:28:58.233278 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:58.233174 2575 pod_workers.go:1301] "Error 
syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-nz6dq" podUID="3ad3b358-912b-477a-8fc3-6f2910580c33" Apr 24 21:28:58.560964 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.560885 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597"] Apr 24 21:28:58.563834 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.563818 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597" Apr 24 21:28:58.566190 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.566172 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 21:28:58.566332 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.566208 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-qx8c2\"" Apr 24 21:28:58.570158 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.567303 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 21:28:58.574703 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.574680 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597"] Apr 24 21:28:58.680946 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.680912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mnsc\" (UniqueName: \"kubernetes.io/projected/4a70399d-9352-4aae-bc56-8a355a0872ff-kube-api-access-7mnsc\") pod \"migrator-74bb7799d9-8t597\" (UID: \"4a70399d-9352-4aae-bc56-8a355a0872ff\") " 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597"
Apr 24 21:28:58.691000 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.690978 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/1.log"
Apr 24 21:28:58.691344 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.691330 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/0.log"
Apr 24 21:28:58.691420 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.691363 2575 generic.go:358] "Generic (PLEG): container finished" podID="057fa386-6c87-478a-91d9-c2293ba0617c" containerID="127c0f7fef252207c59a7f92a87b1e591231ad506dafd24edd49503941427ce9" exitCode=255
Apr 24 21:28:58.691482 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.691471 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6hc5g"
Apr 24 21:28:58.691539 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.691479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" event={"ID":"057fa386-6c87-478a-91d9-c2293ba0617c","Type":"ContainerDied","Data":"127c0f7fef252207c59a7f92a87b1e591231ad506dafd24edd49503941427ce9"}
Apr 24 21:28:58.691539 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.691532 2575 scope.go:117] "RemoveContainer" containerID="9056a0a8a7ea2d833e5353874895cd48146b322c813f9dbca58ff997bdd0b1fe"
Apr 24 21:28:58.691771 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.691744 2575 scope.go:117] "RemoveContainer" containerID="127c0f7fef252207c59a7f92a87b1e591231ad506dafd24edd49503941427ce9"
Apr 24 21:28:58.691949 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.691933 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7df64cf987-gz6g4"
Apr 24 21:28:58.692053 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:58.691932 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mksv7_openshift-console-operator(057fa386-6c87-478a-91d9-c2293ba0617c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" podUID="057fa386-6c87-478a-91d9-c2293ba0617c"
Apr 24 21:28:58.781548 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.781505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mnsc\" (UniqueName: \"kubernetes.io/projected/4a70399d-9352-4aae-bc56-8a355a0872ff-kube-api-access-7mnsc\") pod \"migrator-74bb7799d9-8t597\" (UID: \"4a70399d-9352-4aae-bc56-8a355a0872ff\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597"
Apr 24 21:28:58.790534 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.790507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mnsc\" (UniqueName: \"kubernetes.io/projected/4a70399d-9352-4aae-bc56-8a355a0872ff-kube-api-access-7mnsc\") pod \"migrator-74bb7799d9-8t597\" (UID: \"4a70399d-9352-4aae-bc56-8a355a0872ff\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597"
Apr 24 21:28:58.876476 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.876402 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597"
Apr 24 21:28:58.989157 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:58.989130 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597"]
Apr 24 21:28:58.992181 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:28:58.992155 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a70399d_9352_4aae_bc56_8a355a0872ff.slice/crio-2f6307e275b6782f1ac5c65daba8aeaefee168054bbb3471da6f0cf1026ba579 WatchSource:0}: Error finding container 2f6307e275b6782f1ac5c65daba8aeaefee168054bbb3471da6f0cf1026ba579: Status 404 returned error can't find the container with id 2f6307e275b6782f1ac5c65daba8aeaefee168054bbb3471da6f0cf1026ba579
Apr 24 21:28:59.695466 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:59.695436 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/1.log"
Apr 24 21:28:59.695884 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:59.695812 2575 scope.go:117] "RemoveContainer" containerID="127c0f7fef252207c59a7f92a87b1e591231ad506dafd24edd49503941427ce9"
Apr 24 21:28:59.696021 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:28:59.695999 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mksv7_openshift-console-operator(057fa386-6c87-478a-91d9-c2293ba0617c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" podUID="057fa386-6c87-478a-91d9-c2293ba0617c"
Apr 24 21:28:59.696484 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:28:59.696460 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597" event={"ID":"4a70399d-9352-4aae-bc56-8a355a0872ff","Type":"ContainerStarted","Data":"2f6307e275b6782f1ac5c65daba8aeaefee168054bbb3471da6f0cf1026ba579"}
Apr 24 21:29:00.261120 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:00.261047 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r2v97_92515838-c368-40ef-9e3e-40f753dd0308/dns-node-resolver/0.log"
Apr 24 21:29:00.701213 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:00.701182 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597" event={"ID":"4a70399d-9352-4aae-bc56-8a355a0872ff","Type":"ContainerStarted","Data":"3d0374a588a2e74536bfd15e0d930b58909b551e50d55b6161eb56f87b6ad0de"}
Apr 24 21:29:00.701213 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:00.701216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597" event={"ID":"4a70399d-9352-4aae-bc56-8a355a0872ff","Type":"ContainerStarted","Data":"6939c3941ec27d4e0bb9a79ea4eb6c79e7c0cb16557a03c9d2372325ccaef250"}
Apr 24 21:29:00.717172 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:00.717127 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8t597" podStartSLOduration=1.7009661980000002 podStartE2EDuration="2.71711293s" podCreationTimestamp="2026-04-24 21:28:58 +0000 UTC" firstStartedPulling="2026-04-24 21:28:58.993939708 +0000 UTC m=+156.397202683" lastFinishedPulling="2026-04-24 21:29:00.010086441 +0000 UTC m=+157.413349415" observedRunningTime="2026-04-24 21:29:00.716482969 +0000 UTC m=+158.119745964" watchObservedRunningTime="2026-04-24 21:29:00.71711293 +0000 UTC m=+158.120375923"
Apr 24 21:29:00.860028 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:00.860005 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j8p2r_9062ddad-735b-4bad-80c5-03e7de6d3add/node-ca/0.log"
Apr 24 21:29:01.702385 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:01.702349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kkdhx\" (UID: \"42fc120d-c79e-43df-9fb2-d09911667b69\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx"
Apr 24 21:29:01.702774 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:01.702483 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:29:01.702774 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:01.702555 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls podName:42fc120d-c79e-43df-9fb2-d09911667b69 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:09.7025391 +0000 UTC m=+167.105802076 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kkdhx" (UID: "42fc120d-c79e-43df-9fb2-d09911667b69") : secret "samples-operator-tls" not found
Apr 24 21:29:02.912649 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:02.912619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls\") pod \"image-registry-7df64cf987-gz6g4\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") " pod="openshift-image-registry/image-registry-7df64cf987-gz6g4"
Apr 24 21:29:02.913042 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:02.912790 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:29:02.913042 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:02.912810 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7df64cf987-gz6g4: secret "image-registry-tls" not found
Apr 24 21:29:02.913042 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:02.912869 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls podName:63a472ed-761c-4bea-b845-8b5b620d07ab nodeName:}" failed. No retries permitted until 2026-04-24 21:31:04.912853615 +0000 UTC m=+282.316116586 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls") pod "image-registry-7df64cf987-gz6g4" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab") : secret "image-registry-tls" not found
Apr 24 21:29:03.013885 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:03.013851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g"
Apr 24 21:29:03.014114 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:03.013994 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:29:03.014114 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:03.014006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g"
Apr 24 21:29:03.014114 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:03.014058 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert podName:c652c05d-6547-4c43-a295-52b3275ef5e0 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:05.014043243 +0000 UTC m=+282.417306215 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mdm8g" (UID: "c652c05d-6547-4c43-a295-52b3275ef5e0") : secret "networking-console-plugin-cert" not found
Apr 24 21:29:03.014114 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:03.014097 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:29:03.014307 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:03.014133 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert podName:7eb5a6c5-93dc-4c4b-a6ab-b457966b4540 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:05.014122569 +0000 UTC m=+282.417385540 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert") pod "ingress-canary-6hc5g" (UID: "7eb5a6c5-93dc-4c4b-a6ab-b457966b4540") : secret "canary-serving-cert" not found
Apr 24 21:29:03.115377 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:03.115343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8"
Apr 24 21:29:03.115522 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:03.115449 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:29:03.115522 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:03.115496 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls podName:5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:05.115482906 +0000 UTC m=+282.518745876 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls") pod "dns-default-lrxx8" (UID: "5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33") : secret "dns-default-metrics-tls" not found
Apr 24 21:29:04.382198 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:04.382157 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7"
Apr 24 21:29:04.382198 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:04.382197 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7"
Apr 24 21:29:04.382645 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:04.382602 2575 scope.go:117] "RemoveContainer" containerID="127c0f7fef252207c59a7f92a87b1e591231ad506dafd24edd49503941427ce9"
Apr 24 21:29:04.382783 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:04.382763 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mksv7_openshift-console-operator(057fa386-6c87-478a-91d9-c2293ba0617c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" podUID="057fa386-6c87-478a-91d9-c2293ba0617c"
Apr 24 21:29:09.767725 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:09.767638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kkdhx\" (UID: \"42fc120d-c79e-43df-9fb2-d09911667b69\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx"
Apr 24 21:29:09.769981 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:09.769959 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/42fc120d-c79e-43df-9fb2-d09911667b69-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kkdhx\" (UID: \"42fc120d-c79e-43df-9fb2-d09911667b69\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx"
Apr 24 21:29:09.876543 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:09.876508 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx"
Apr 24 21:29:09.992236 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:09.992203 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx"]
Apr 24 21:29:10.722934 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:10.722898 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx" event={"ID":"42fc120d-c79e-43df-9fb2-d09911667b69","Type":"ContainerStarted","Data":"6a7213291eea4f3b7322dc517763251777bd14b3d4e11ed7b79da92d2b6330c8"}
Apr 24 21:29:11.214874 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:11.214836 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g"
Apr 24 21:29:11.726736 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:11.726707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx" event={"ID":"42fc120d-c79e-43df-9fb2-d09911667b69","Type":"ContainerStarted","Data":"220425f62455ede1c6f5569a3d878b8b954bf864d65aaf38e60d524b6bf63921"}
Apr 24 21:29:11.726736 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:11.726741 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx" event={"ID":"42fc120d-c79e-43df-9fb2-d09911667b69","Type":"ContainerStarted","Data":"e940bba3785131834c82f0973021559b403d686b052c62da2b856b26de6197c9"}
Apr 24 21:29:11.747690 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:11.747634 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kkdhx" podStartSLOduration=17.279279187 podStartE2EDuration="18.747621146s" podCreationTimestamp="2026-04-24 21:28:53 +0000 UTC" firstStartedPulling="2026-04-24 21:29:10.036080497 +0000 UTC m=+167.439343472" lastFinishedPulling="2026-04-24 21:29:11.504422448 +0000 UTC m=+168.907685431" observedRunningTime="2026-04-24 21:29:11.746151876 +0000 UTC m=+169.149414871" watchObservedRunningTime="2026-04-24 21:29:11.747621146 +0000 UTC m=+169.150884138"
Apr 24 21:29:13.218187 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:13.218154 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lrxx8"
Apr 24 21:29:13.218629 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:13.218154 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq"
Apr 24 21:29:19.215875 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:19.215842 2575 scope.go:117] "RemoveContainer" containerID="127c0f7fef252207c59a7f92a87b1e591231ad506dafd24edd49503941427ce9"
Apr 24 21:29:19.750750 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:19.750724 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:29:19.751035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:19.751021 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/1.log"
Apr 24 21:29:19.751079 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:19.751054 2575 generic.go:358] "Generic (PLEG): container finished" podID="057fa386-6c87-478a-91d9-c2293ba0617c" containerID="d187b68777c231e0e97639e78d03dd2a1eb5c62bb4723e6d494ba414c8ed7c5d" exitCode=255
Apr 24 21:29:19.751119 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:19.751101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" event={"ID":"057fa386-6c87-478a-91d9-c2293ba0617c","Type":"ContainerDied","Data":"d187b68777c231e0e97639e78d03dd2a1eb5c62bb4723e6d494ba414c8ed7c5d"}
Apr 24 21:29:19.751153 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:19.751130 2575 scope.go:117] "RemoveContainer" containerID="127c0f7fef252207c59a7f92a87b1e591231ad506dafd24edd49503941427ce9"
Apr 24 21:29:19.751490 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:19.751471 2575 scope.go:117] "RemoveContainer" containerID="d187b68777c231e0e97639e78d03dd2a1eb5c62bb4723e6d494ba414c8ed7c5d"
Apr 24 21:29:19.751671 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:19.751646 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mksv7_openshift-console-operator(057fa386-6c87-478a-91d9-c2293ba0617c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" podUID="057fa386-6c87-478a-91d9-c2293ba0617c"
Apr 24 21:29:20.754825 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:20.754796 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:29:21.869803 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:21.869776 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-znrss"]
Apr 24 21:29:21.873355 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:21.873337 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:21.879751 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:21.879732 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 21:29:21.879855 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:21.879785 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:29:21.879855 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:21.879804 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 21:29:21.882042 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:21.882026 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:29:21.887598 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:21.887582 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-sdl4k\""
Apr 24 21:29:21.894082 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:21.894064 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-znrss"]
Apr 24 21:29:22.064437 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.064404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-crio-socket\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.064615 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.064472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.064615 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.064508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.064615 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.064552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-data-volume\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.064783 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.064677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkrqv\" (UniqueName: \"kubernetes.io/projected/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-kube-api-access-fkrqv\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.165584 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.165557 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkrqv\" (UniqueName: \"kubernetes.io/projected/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-kube-api-access-fkrqv\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.165719 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.165609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-crio-socket\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.165719 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.165647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.165719 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.165671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.165719 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.165704 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-data-volume\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.165849 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.165770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-crio-socket\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.166114 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.166098 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-data-volume\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.166307 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.166290 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.167991 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.167973 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.222154 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.222121 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkrqv\" (UniqueName: \"kubernetes.io/projected/f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5-kube-api-access-fkrqv\") pod \"insights-runtime-extractor-znrss\" (UID: \"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5\") " pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.481505 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.481429 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-znrss"
Apr 24 21:29:22.617029 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.616987 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-znrss"]
Apr 24 21:29:22.620590 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:29:22.620557 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4902d02_2d55_4fcb_b3bc_5f3dd8f847a5.slice/crio-ded3aca9d7cb4aae8a0e9fcbce4cb1015676799ef958324f19dd76889d8e6325 WatchSource:0}: Error finding container ded3aca9d7cb4aae8a0e9fcbce4cb1015676799ef958324f19dd76889d8e6325: Status 404 returned error can't find the container with id ded3aca9d7cb4aae8a0e9fcbce4cb1015676799ef958324f19dd76889d8e6325
Apr 24 21:29:22.762825 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.762746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-znrss" event={"ID":"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5","Type":"ContainerStarted","Data":"9581aaf8e17973f2859dc59668d1c2f719accf423958753e3a43a933153d456b"}
Apr 24 21:29:22.762825 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:22.762782 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-znrss" event={"ID":"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5","Type":"ContainerStarted","Data":"ded3aca9d7cb4aae8a0e9fcbce4cb1015676799ef958324f19dd76889d8e6325"}
Apr 24 21:29:23.766951 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:23.766916 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-znrss" event={"ID":"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5","Type":"ContainerStarted","Data":"f09eba02fc1c536ee294dfe3349a1680baf53ce1655a81ee0c6ddad8c34ea0b1"}
Apr 24 21:29:24.381391 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:24.381352 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7"
Apr 24 21:29:24.381391 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:24.381390 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7"
Apr 24 21:29:24.381782 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:24.381763 2575 scope.go:117] "RemoveContainer" containerID="d187b68777c231e0e97639e78d03dd2a1eb5c62bb4723e6d494ba414c8ed7c5d"
Apr 24 21:29:24.382008 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:24.381990 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mksv7_openshift-console-operator(057fa386-6c87-478a-91d9-c2293ba0617c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" podUID="057fa386-6c87-478a-91d9-c2293ba0617c"
Apr 24 21:29:24.770901 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:24.770865 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-znrss" event={"ID":"f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5","Type":"ContainerStarted","Data":"a6c23c8feccecbf4160b50a9083831e70f920da5c450586a107f0ef62e10fdc4"}
Apr 24 21:29:24.790672 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:24.790626 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-znrss" podStartSLOduration=1.931065067 podStartE2EDuration="3.790613497s" podCreationTimestamp="2026-04-24 21:29:21 +0000 UTC" firstStartedPulling="2026-04-24 21:29:22.670843335 +0000 UTC m=+180.074106323" lastFinishedPulling="2026-04-24 21:29:24.530391779 +0000 UTC m=+181.933654753" observedRunningTime="2026-04-24 21:29:24.789907266 +0000 UTC m=+182.193170258" watchObservedRunningTime="2026-04-24 21:29:24.790613497 +0000 UTC m=+182.193876489"
Apr 24 21:29:28.830674 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:28.830641 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh"]
Apr 24 21:29:28.833802 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:28.833787 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh"
Apr 24 21:29:28.835978 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:28.835953 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-8xz54\""
Apr 24 21:29:28.836092 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:28.835954 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 24 21:29:28.840988 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:28.840971 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh"]
Apr 24 21:29:28.916540 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:28.916509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8edc3df6-4a9e-45bf-bfea-3bbff392cd1d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xghjh\" (UID: \"8edc3df6-4a9e-45bf-bfea-3bbff392cd1d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh"
Apr 24 21:29:29.017116 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:29.017089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8edc3df6-4a9e-45bf-bfea-3bbff392cd1d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xghjh\" (UID: \"8edc3df6-4a9e-45bf-bfea-3bbff392cd1d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh"
Apr 24 21:29:29.017233 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:29.017219 2575 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 24 21:29:29.017305 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:29.017295 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8edc3df6-4a9e-45bf-bfea-3bbff392cd1d-tls-certificates podName:8edc3df6-4a9e-45bf-bfea-3bbff392cd1d nodeName:}" failed. No retries permitted until 2026-04-24 21:29:29.517281585 +0000 UTC m=+186.920544556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/8edc3df6-4a9e-45bf-bfea-3bbff392cd1d-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-xghjh" (UID: "8edc3df6-4a9e-45bf-bfea-3bbff392cd1d") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 24 21:29:29.519977 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:29.519943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8edc3df6-4a9e-45bf-bfea-3bbff392cd1d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xghjh\" (UID: \"8edc3df6-4a9e-45bf-bfea-3bbff392cd1d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh"
Apr 24 21:29:29.522353 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:29.522329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8edc3df6-4a9e-45bf-bfea-3bbff392cd1d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xghjh\" (UID: \"8edc3df6-4a9e-45bf-bfea-3bbff392cd1d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh"
Apr 24 21:29:29.742618 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:29.742588 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh"
Apr 24 21:29:29.860171 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:29.860117 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh"]
Apr 24 21:29:29.862366 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:29:29.862339 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8edc3df6_4a9e_45bf_bfea_3bbff392cd1d.slice/crio-4f00133a448155714ab335745b22426fb670a3d9cb282d0e9573a776c5682598 WatchSource:0}: Error finding container 4f00133a448155714ab335745b22426fb670a3d9cb282d0e9573a776c5682598: Status 404 returned error can't find the container with id 4f00133a448155714ab335745b22426fb670a3d9cb282d0e9573a776c5682598
Apr 24 21:29:30.788242 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:30.788210 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh" event={"ID":"8edc3df6-4a9e-45bf-bfea-3bbff392cd1d","Type":"ContainerStarted","Data":"4f00133a448155714ab335745b22426fb670a3d9cb282d0e9573a776c5682598"}
Apr 24 21:29:31.792690 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:31.792651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh" event={"ID":"8edc3df6-4a9e-45bf-bfea-3bbff392cd1d","Type":"ContainerStarted","Data":"272a8e0362885852877dc82dc15c70b59066884d1e1e129aca2301c514242e44"}
Apr 24 21:29:31.793046 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:31.792859 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh"
Apr 24 21:29:31.797270 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:31.797228 2575 kubelet.go:2658] "SyncLoop (probe)"
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh" Apr 24 21:29:31.810341 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:31.810301 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xghjh" podStartSLOduration=2.839318065 podStartE2EDuration="3.810289609s" podCreationTimestamp="2026-04-24 21:29:28 +0000 UTC" firstStartedPulling="2026-04-24 21:29:29.864097038 +0000 UTC m=+187.267360014" lastFinishedPulling="2026-04-24 21:29:30.835068574 +0000 UTC m=+188.238331558" observedRunningTime="2026-04-24 21:29:31.809168149 +0000 UTC m=+189.212431141" watchObservedRunningTime="2026-04-24 21:29:31.810289609 +0000 UTC m=+189.213552603" Apr 24 21:29:37.262105 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.262074 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz"] Apr 24 21:29:37.265112 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.265096 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.267704 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.267684 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-djnts\"" Apr 24 21:29:37.268589 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.268565 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 21:29:37.268589 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.268577 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:29:37.268736 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.268577 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:29:37.269354 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.269333 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:29:37.269546 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.269521 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:29:37.270095 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.270074 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-kkghd"] Apr 24 21:29:37.273053 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.273038 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.276945 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.276923 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:29:37.277113 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.276929 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:29:37.277203 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.277171 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:29:37.277290 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.277239 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k4448\"" Apr 24 21:29:37.281731 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.281708 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz"] Apr 24 21:29:37.281852 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.281836 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66d6x\" (UniqueName: \"kubernetes.io/projected/cf2ad258-2bb0-493a-98f7-41c11632fd79-kube-api-access-66d6x\") pod \"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.281896 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.281868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cf2ad258-2bb0-493a-98f7-41c11632fd79-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.281940 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.281912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf2ad258-2bb0-493a-98f7-41c11632fd79-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.281990 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.281977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf2ad258-2bb0-493a-98f7-41c11632fd79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.287906 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.287883 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jrjgl"] Apr 24 21:29:37.290792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.290778 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.293519 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.293487 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 21:29:37.293711 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.293558 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:29:37.293829 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.293813 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-v9v8c\"" Apr 24 21:29:37.294110 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.294097 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 21:29:37.301586 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.301566 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jrjgl"] Apr 24 21:29:37.382904 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.382869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cf2ad258-2bb0-493a-98f7-41c11632fd79-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.382904 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.382906 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-tls\") pod 
\"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.383129 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.382928 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-wtmp\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.383129 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.382972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14046a77-27b2-4686-90d9-2b6f59d97707-root\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.383129 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.382999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf2ad258-2bb0-493a-98f7-41c11632fd79-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.383129 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-textfile\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.383129 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-accelerators-collector-config\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.383129 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.383129 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.383580 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383149 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf2ad258-2bb0-493a-98f7-41c11632fd79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.383580 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-44lsw\" (UniqueName: \"kubernetes.io/projected/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-api-access-44lsw\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.383580 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383223 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.383580 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.383580 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/6e5f2196-7905-4092-b23c-1f63b39dc528-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.383580 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383346 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e5f2196-7905-4092-b23c-1f63b39dc528-metrics-client-ca\") pod 
\"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.383580 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14046a77-27b2-4686-90d9-2b6f59d97707-metrics-client-ca\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.383580 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383436 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66d6x\" (UniqueName: \"kubernetes.io/projected/cf2ad258-2bb0-493a-98f7-41c11632fd79-kube-api-access-66d6x\") pod \"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.383580 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14046a77-27b2-4686-90d9-2b6f59d97707-sys\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.383580 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb64q\" (UniqueName: \"kubernetes.io/projected/14046a77-27b2-4686-90d9-2b6f59d97707-kube-api-access-nb64q\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.384079 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.383669 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf2ad258-2bb0-493a-98f7-41c11632fd79-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.385492 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.385473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf2ad258-2bb0-493a-98f7-41c11632fd79-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.385584 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.385475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cf2ad258-2bb0-493a-98f7-41c11632fd79-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.395171 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.395145 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66d6x\" (UniqueName: \"kubernetes.io/projected/cf2ad258-2bb0-493a-98f7-41c11632fd79-kube-api-access-66d6x\") pod \"openshift-state-metrics-9d44df66c-t4gjz\" (UID: \"cf2ad258-2bb0-493a-98f7-41c11632fd79\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" Apr 24 21:29:37.484298 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.484298 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.484516 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484323 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44lsw\" (UniqueName: \"kubernetes.io/projected/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-api-access-44lsw\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.484516 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.484516 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.484516 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/6e5f2196-7905-4092-b23c-1f63b39dc528-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.484516 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e5f2196-7905-4092-b23c-1f63b39dc528-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.484516 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14046a77-27b2-4686-90d9-2b6f59d97707-metrics-client-ca\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.484822 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:37.484565 2575 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 24 21:29:37.484822 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:37.484687 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-tls podName:6e5f2196-7905-4092-b23c-1f63b39dc528 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:37.984653237 +0000 UTC m=+195.387916218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-jrjgl" (UID: "6e5f2196-7905-4092-b23c-1f63b39dc528") : secret "kube-state-metrics-tls" not found Apr 24 21:29:37.484822 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14046a77-27b2-4686-90d9-2b6f59d97707-sys\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.484822 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nb64q\" (UniqueName: \"kubernetes.io/projected/14046a77-27b2-4686-90d9-2b6f59d97707-kube-api-access-nb64q\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.484822 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/6e5f2196-7905-4092-b23c-1f63b39dc528-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" Apr 24 21:29:37.484822 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-tls\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " 
pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.485142 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484847 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-wtmp\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.485142 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484853 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14046a77-27b2-4686-90d9-2b6f59d97707-sys\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.485142 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14046a77-27b2-4686-90d9-2b6f59d97707-root\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.485142 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-textfile\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd" Apr 24 21:29:37.485142 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.484991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-accelerators-collector-config\") pod \"node-exporter-kkghd\" (UID: 
\"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd"
Apr 24 21:29:37.485142 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.485091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14046a77-27b2-4686-90d9-2b6f59d97707-metrics-client-ca\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd"
Apr 24 21:29:37.485473 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.485147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl"
Apr 24 21:29:37.485473 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.485191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-wtmp\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd"
Apr 24 21:29:37.485473 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.485214 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14046a77-27b2-4686-90d9-2b6f59d97707-root\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd"
Apr 24 21:29:37.485473 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:37.485322 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 21:29:37.485473 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:37.485382 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-tls podName:14046a77-27b2-4686-90d9-2b6f59d97707 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:37.985363444 +0000 UTC m=+195.388626420 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-tls") pod "node-exporter-kkghd" (UID: "14046a77-27b2-4686-90d9-2b6f59d97707") : secret "node-exporter-tls" not found
Apr 24 21:29:37.485473 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.485392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e5f2196-7905-4092-b23c-1f63b39dc528-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl"
Apr 24 21:29:37.485680 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.485492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-textfile\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd"
Apr 24 21:29:37.485680 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.485547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-accelerators-collector-config\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd"
Apr 24 21:29:37.486742 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.486719 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd"
Apr 24 21:29:37.486790 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.486742 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl"
Apr 24 21:29:37.497038 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.497011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb64q\" (UniqueName: \"kubernetes.io/projected/14046a77-27b2-4686-90d9-2b6f59d97707-kube-api-access-nb64q\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd"
Apr 24 21:29:37.497038 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.497022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44lsw\" (UniqueName: \"kubernetes.io/projected/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-api-access-44lsw\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl"
Apr 24 21:29:37.574487 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.574434 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz"
Apr 24 21:29:37.697228 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.697199 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz"]
Apr 24 21:29:37.700399 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:29:37.700374 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf2ad258_2bb0_493a_98f7_41c11632fd79.slice/crio-f0f43af44536fcdf074320462b04bcaa71c5b11283bd9edcef1a35aaa7dad4ec WatchSource:0}: Error finding container f0f43af44536fcdf074320462b04bcaa71c5b11283bd9edcef1a35aaa7dad4ec: Status 404 returned error can't find the container with id f0f43af44536fcdf074320462b04bcaa71c5b11283bd9edcef1a35aaa7dad4ec
Apr 24 21:29:37.808948 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.808912 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" event={"ID":"cf2ad258-2bb0-493a-98f7-41c11632fd79","Type":"ContainerStarted","Data":"8661074eaf5c4659a1425fb643ff41342c25722e95ab1e9195b5eab0e12e2f18"}
Apr 24 21:29:37.808948 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.808946 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" event={"ID":"cf2ad258-2bb0-493a-98f7-41c11632fd79","Type":"ContainerStarted","Data":"f0f43af44536fcdf074320462b04bcaa71c5b11283bd9edcef1a35aaa7dad4ec"}
Apr 24 21:29:37.989138 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.989098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl"
Apr 24 21:29:37.989356 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.989178 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-tls\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd"
Apr 24 21:29:37.991698 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.991669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14046a77-27b2-4686-90d9-2b6f59d97707-node-exporter-tls\") pod \"node-exporter-kkghd\" (UID: \"14046a77-27b2-4686-90d9-2b6f59d97707\") " pod="openshift-monitoring/node-exporter-kkghd"
Apr 24 21:29:37.992124 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:37.992103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e5f2196-7905-4092-b23c-1f63b39dc528-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jrjgl\" (UID: \"6e5f2196-7905-4092-b23c-1f63b39dc528\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl"
Apr 24 21:29:38.181704 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.181673 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-kkghd"
Apr 24 21:29:38.189310 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:29:38.189282 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14046a77_27b2_4686_90d9_2b6f59d97707.slice/crio-17f201d09b71d6d4178b8b7f332209a4ebe5b0b6df3d63314e9517c3a9f871d4 WatchSource:0}: Error finding container 17f201d09b71d6d4178b8b7f332209a4ebe5b0b6df3d63314e9517c3a9f871d4: Status 404 returned error can't find the container with id 17f201d09b71d6d4178b8b7f332209a4ebe5b0b6df3d63314e9517c3a9f871d4
Apr 24 21:29:38.199254 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.199232 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl"
Apr 24 21:29:38.215542 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.215518 2575 scope.go:117] "RemoveContainer" containerID="d187b68777c231e0e97639e78d03dd2a1eb5c62bb4723e6d494ba414c8ed7c5d"
Apr 24 21:29:38.215736 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:38.215717 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mksv7_openshift-console-operator(057fa386-6c87-478a-91d9-c2293ba0617c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" podUID="057fa386-6c87-478a-91d9-c2293ba0617c"
Apr 24 21:29:38.323930 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.323824 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jrjgl"]
Apr 24 21:29:38.326879 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:29:38.326852 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e5f2196_7905_4092_b23c_1f63b39dc528.slice/crio-befd0ce7c13fe21987590aaa3d3bac4418c24dc354f495d0886441041cebeccb WatchSource:0}: Error finding container befd0ce7c13fe21987590aaa3d3bac4418c24dc354f495d0886441041cebeccb: Status 404 returned error can't find the container with id befd0ce7c13fe21987590aaa3d3bac4418c24dc354f495d0886441041cebeccb
Apr 24 21:29:38.370229 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.370205 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:29:38.375088 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.375069 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.378000 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.377924 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 21:29:38.378000 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.377924 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 21:29:38.378000 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.377930 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 21:29:38.378210 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.377973 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 21:29:38.378472 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.378450 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-l6lqq\""
Apr 24 21:29:38.378568 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.378456 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 21:29:38.378568 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.378491 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 21:29:38.378568 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.378505 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 21:29:38.378568 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.378480 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 21:29:38.384282 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.384248 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 21:29:38.392729 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.392705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.392813 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.392762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-volume\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.392813 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.392786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.392899 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.392815 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-web-config\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.392899 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.392816 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:29:38.392899 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.392875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.393025 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.392901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.393025 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.392935 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.393135 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.393004 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.393197 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.393140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwzrm\" (UniqueName: \"kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-kube-api-access-wwzrm\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.393252 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.393198 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.393252 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.393244 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.393377 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.393329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-out\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.393377 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.393355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.493969 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.493934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494158 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.493985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-volume\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494158 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494158 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-web-config\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494158 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494158 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494158 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494548 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494548 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494202 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwzrm\" (UniqueName: \"kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-kube-api-access-wwzrm\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494548 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494548 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494548 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-out\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494548 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.494851 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.494595 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.495876 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.495543 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.496184 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.496161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.498401 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.498362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.498743 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.498701 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.498928 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.498889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.499366 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.499061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.499366 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.499085 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-out\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.499366 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.499238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.499366 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.499332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.499624 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.499388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-volume\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.500983 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.500962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-web-config\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.503846 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.503822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwzrm\" (UniqueName: \"kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-kube-api-access-wwzrm\") pod \"alertmanager-main-0\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.685526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.685492 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:29:38.813004 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.812910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" event={"ID":"6e5f2196-7905-4092-b23c-1f63b39dc528","Type":"ContainerStarted","Data":"befd0ce7c13fe21987590aaa3d3bac4418c24dc354f495d0886441041cebeccb"}
Apr 24 21:29:38.814133 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.814096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kkghd" event={"ID":"14046a77-27b2-4686-90d9-2b6f59d97707","Type":"ContainerStarted","Data":"17f201d09b71d6d4178b8b7f332209a4ebe5b0b6df3d63314e9517c3a9f871d4"}
Apr 24 21:29:38.815937 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:38.815906 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" event={"ID":"cf2ad258-2bb0-493a-98f7-41c11632fd79","Type":"ContainerStarted","Data":"961b030fdc3ddc28f02c11c1a5a860f204df2d544673fa6fa59854f5d879e689"}
Apr 24 21:29:39.174567 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:39.174541 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:29:39.177335 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:29:39.177298 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73b2681e_5a49_49f4_8f25_35ce1354f6b5.slice/crio-feecabb3b8449c22cb9222db77466e23a3315940d00f70fd5ee77e2bab8b3c3c WatchSource:0}: Error finding container feecabb3b8449c22cb9222db77466e23a3315940d00f70fd5ee77e2bab8b3c3c: Status 404 returned error can't find the container with id feecabb3b8449c22cb9222db77466e23a3315940d00f70fd5ee77e2bab8b3c3c
Apr 24 21:29:39.820006 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:39.819973 2575 generic.go:358] "Generic (PLEG): container finished" podID="14046a77-27b2-4686-90d9-2b6f59d97707" containerID="64deeb2ee4f4c354af3ae12d7536979e7185b1a0cc4129e4047cd7638c13e44a" exitCode=0
Apr 24 21:29:39.820444 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:39.820055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kkghd" event={"ID":"14046a77-27b2-4686-90d9-2b6f59d97707","Type":"ContainerDied","Data":"64deeb2ee4f4c354af3ae12d7536979e7185b1a0cc4129e4047cd7638c13e44a"}
Apr 24 21:29:39.821315 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:39.821283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerStarted","Data":"feecabb3b8449c22cb9222db77466e23a3315940d00f70fd5ee77e2bab8b3c3c"}
Apr 24 21:29:39.823047 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:39.823008 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" event={"ID":"cf2ad258-2bb0-493a-98f7-41c11632fd79","Type":"ContainerStarted","Data":"993e3ef20ef3656ee44cd306e0cd0188ebd8260b6f45a2dad62081065b318c5c"}
Apr 24 21:29:39.858237 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:39.858154 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t4gjz" podStartSLOduration=1.6388674170000002 podStartE2EDuration="2.858135503s" podCreationTimestamp="2026-04-24 21:29:37 +0000 UTC" firstStartedPulling="2026-04-24 21:29:37.81934706 +0000 UTC m=+195.222610034" lastFinishedPulling="2026-04-24 21:29:39.038615144 +0000 UTC m=+196.441878120" observedRunningTime="2026-04-24 21:29:39.856724883 +0000 UTC m=+197.259987877" watchObservedRunningTime="2026-04-24 21:29:39.858135503 +0000 UTC m=+197.261398491"
Apr 24 21:29:40.528080 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.528051 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"]
Apr 24 21:29:40.531676 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.531656 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"
Apr 24 21:29:40.550176 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.550145 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 24 21:29:40.550176 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.550165 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-kkd9v\""
Apr 24 21:29:40.550361 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.550164 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-8dpgrp65hvm4o\""
Apr 24 21:29:40.550361 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.550196 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 24 21:29:40.550361 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.550152 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 24 21:29:40.553979 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.553961 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 24 21:29:40.554106 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.554090 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 24 21:29:40.567188 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.567165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"]
Apr 24 21:29:40.609284 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.609209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf5c2fee-0804-4913-a2a0-ee03634c1f56-metrics-client-ca\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"
Apr 24 21:29:40.609284 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.609245 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-tls\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"
Apr 24 21:29:40.609428 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.609279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"
Apr 24 21:29:40.609428 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.609339 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"
Apr 24 21:29:40.609428 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.609376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"
Apr 24 21:29:40.609530 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.609469 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-grpc-tls\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"
Apr 24 21:29:40.609530 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.609502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm6bk\" (UniqueName: \"kubernetes.io/projected/cf5c2fee-0804-4913-a2a0-ee03634c1f56-kube-api-access-sm6bk\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"
Apr 24 21:29:40.609587 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.609538 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"
Apr 24 21:29:40.710180 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.710144 2575 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.710357 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.710291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf5c2fee-0804-4913-a2a0-ee03634c1f56-metrics-client-ca\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.710357 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.710327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-tls\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.710465 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.710356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.710465 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.710384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.710571 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.710544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.711159 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.710762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-grpc-tls\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.711159 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.710807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sm6bk\" (UniqueName: \"kubernetes.io/projected/cf5c2fee-0804-4913-a2a0-ee03634c1f56-kube-api-access-sm6bk\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.711368 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.711334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf5c2fee-0804-4913-a2a0-ee03634c1f56-metrics-client-ca\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " 
pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.714135 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.713807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.714135 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.713820 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-tls\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.714340 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.714212 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.714394 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.714357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.714587 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.714549 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.714882 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.714866 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cf5c2fee-0804-4913-a2a0-ee03634c1f56-secret-grpc-tls\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.729547 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.729520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm6bk\" (UniqueName: \"kubernetes.io/projected/cf5c2fee-0804-4913-a2a0-ee03634c1f56-kube-api-access-sm6bk\") pod \"thanos-querier-79bfcc9858-2dmtl\" (UID: \"cf5c2fee-0804-4913-a2a0-ee03634c1f56\") " pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.827022 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.826989 2575 generic.go:358] "Generic (PLEG): container finished" podID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerID="022a61cc8f35e22e8ec058280bf4c8c1b1f8bdd42a82c22b3c9822dd6c73adac" exitCode=0 Apr 24 21:29:40.827465 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.827079 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerDied","Data":"022a61cc8f35e22e8ec058280bf4c8c1b1f8bdd42a82c22b3c9822dd6c73adac"} Apr 24 21:29:40.829253 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.829225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" event={"ID":"6e5f2196-7905-4092-b23c-1f63b39dc528","Type":"ContainerStarted","Data":"809a1b4d255c1a89c4f81c2c8f6082d1e23d69bae7341336087a0b41c808375e"} Apr 24 21:29:40.829372 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.829279 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" event={"ID":"6e5f2196-7905-4092-b23c-1f63b39dc528","Type":"ContainerStarted","Data":"e0131dc22dd7df48d5a22be7a7ec19c53fbdef7fce14872b404dcfea64e49571"} Apr 24 21:29:40.829372 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.829293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" event={"ID":"6e5f2196-7905-4092-b23c-1f63b39dc528","Type":"ContainerStarted","Data":"4663c35a1cbfa52c5cf0a61870c71368202b5cf08581aa3b40cddcbbaf19f0bb"} Apr 24 21:29:40.831245 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.831221 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kkghd" event={"ID":"14046a77-27b2-4686-90d9-2b6f59d97707","Type":"ContainerStarted","Data":"e31df59dbbe2417f2a71171e2686770c4c5bb5324646e5012a5c561deabd671e"} Apr 24 21:29:40.831360 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.831252 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-kkghd" event={"ID":"14046a77-27b2-4686-90d9-2b6f59d97707","Type":"ContainerStarted","Data":"d08b795559c4f846607061d192b211d0b84c6271f92c662dcaaa2f36a2b03358"} Apr 24 21:29:40.840999 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.840974 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" Apr 24 21:29:40.895464 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.895419 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-kkghd" podStartSLOduration=3.044636613 podStartE2EDuration="3.895405485s" podCreationTimestamp="2026-04-24 21:29:37 +0000 UTC" firstStartedPulling="2026-04-24 21:29:38.190837803 +0000 UTC m=+195.594100777" lastFinishedPulling="2026-04-24 21:29:39.041606678 +0000 UTC m=+196.444869649" observedRunningTime="2026-04-24 21:29:40.895081426 +0000 UTC m=+198.298344419" watchObservedRunningTime="2026-04-24 21:29:40.895405485 +0000 UTC m=+198.298668471" Apr 24 21:29:40.930271 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.930208 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-jrjgl" podStartSLOduration=2.155575517 podStartE2EDuration="3.930189s" podCreationTimestamp="2026-04-24 21:29:37 +0000 UTC" firstStartedPulling="2026-04-24 21:29:38.328737148 +0000 UTC m=+195.732000122" lastFinishedPulling="2026-04-24 21:29:40.103350617 +0000 UTC m=+197.506613605" observedRunningTime="2026-04-24 21:29:40.928575177 +0000 UTC m=+198.331838171" watchObservedRunningTime="2026-04-24 21:29:40.930189 +0000 UTC m=+198.333451994" Apr 24 21:29:40.982476 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:40.982450 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"] Apr 24 21:29:40.983999 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:29:40.983973 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf5c2fee_0804_4913_a2a0_ee03634c1f56.slice/crio-8d2efd27a9c831b2e6b669b4e9c5b06860857e09c758eef9ec01142913f71f17 WatchSource:0}: Error finding container 
8d2efd27a9c831b2e6b669b4e9c5b06860857e09c758eef9ec01142913f71f17: Status 404 returned error can't find the container with id 8d2efd27a9c831b2e6b669b4e9c5b06860857e09c758eef9ec01142913f71f17 Apr 24 21:29:41.842233 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.842193 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" event={"ID":"cf5c2fee-0804-4913-a2a0-ee03634c1f56","Type":"ContainerStarted","Data":"8d2efd27a9c831b2e6b669b4e9c5b06860857e09c758eef9ec01142913f71f17"} Apr 24 21:29:41.858323 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.858289 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-54b7659cfc-zjvjv"] Apr 24 21:29:41.862151 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.862125 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:41.867611 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.867590 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 21:29:41.867972 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.867952 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:29:41.868745 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.868598 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-prpk9\"" Apr 24 21:29:41.869605 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.869583 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-8ngff9t680g65\"" Apr 24 21:29:41.869728 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.869711 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 21:29:41.869817 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.869753 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 21:29:41.886466 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.886425 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-54b7659cfc-zjvjv"] Apr 24 21:29:41.922884 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.922853 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-secret-metrics-server-tls\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:41.923020 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.922941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-audit-log\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:41.923183 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.923165 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-secret-metrics-server-client-certs\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:41.923243 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.923228 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:41.923298 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.923283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-metrics-server-audit-profiles\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:41.923340 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.923314 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsr2d\" (UniqueName: \"kubernetes.io/projected/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-kube-api-access-nsr2d\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:41.923375 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:41.923348 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-client-ca-bundle\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.024331 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.024289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-secret-metrics-server-client-certs\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.024510 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.024354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.024510 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.024390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-metrics-server-audit-profiles\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.024510 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.024418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsr2d\" (UniqueName: \"kubernetes.io/projected/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-kube-api-access-nsr2d\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.024510 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.024449 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-client-ca-bundle\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " 
pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.024510 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.024500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-secret-metrics-server-tls\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.024758 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.024551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-audit-log\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.025252 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.024953 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-audit-log\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.026063 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.026014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.026199 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.026176 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-metrics-server-audit-profiles\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.027507 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.027463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-secret-metrics-server-client-certs\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.027873 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.027847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-client-ca-bundle\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.029064 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.029033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-secret-metrics-server-tls\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.050961 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.050908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsr2d\" (UniqueName: \"kubernetes.io/projected/80c96c6d-7367-4d83-8102-18bfbb2ad8c8-kube-api-access-nsr2d\") pod \"metrics-server-54b7659cfc-zjvjv\" (UID: \"80c96c6d-7367-4d83-8102-18bfbb2ad8c8\") " pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 
21:29:42.175122 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.175090 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:29:42.318361 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.318335 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-54b7659cfc-zjvjv"] Apr 24 21:29:42.777996 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:29:42.777956 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80c96c6d_7367_4d83_8102_18bfbb2ad8c8.slice/crio-beabaeaece82efd4c05b21edb7d513742ff8ceda85b1b5d4d51d6c7769dca894 WatchSource:0}: Error finding container beabaeaece82efd4c05b21edb7d513742ff8ceda85b1b5d4d51d6c7769dca894: Status 404 returned error can't find the container with id beabaeaece82efd4c05b21edb7d513742ff8ceda85b1b5d4d51d6c7769dca894 Apr 24 21:29:42.849619 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.849583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerStarted","Data":"2008bebba1b96ab6c639f1cc5ea1dabfb5ff694b729a59c2bb17f8fc353be9ed"} Apr 24 21:29:42.850786 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:42.850759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" event={"ID":"80c96c6d-7367-4d83-8102-18bfbb2ad8c8","Type":"ContainerStarted","Data":"beabaeaece82efd4c05b21edb7d513742ff8ceda85b1b5d4d51d6c7769dca894"} Apr 24 21:29:43.859413 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:43.859347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerStarted","Data":"860a7f4dd16aa93fbb15a9b6ed33d30c155fe356ce3f5c877a9995c5e88b0ba2"} Apr 24 
21:29:43.859413 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:43.859393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerStarted","Data":"693c95a2dcde14339d079024271ac97ab91001618eb94916aa9bcc31b738fbde"}
Apr 24 21:29:43.859413 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:43.859418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerStarted","Data":"452aac210787d9695b8fbb32a8d3cd3b4bb40faf18be24d4b09d755d8f31f150"}
Apr 24 21:29:43.859918 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:43.859431 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerStarted","Data":"b49dee828f914c5c4dd4bd5241c80312f45a4b33b8265b619d155b8e32de150b"}
Apr 24 21:29:43.861373 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:43.861344 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" event={"ID":"cf5c2fee-0804-4913-a2a0-ee03634c1f56","Type":"ContainerStarted","Data":"c4e74abce5e5bf70eda274f1a3b4339de9b72e60ea0f3ebdd6a3d628cf61370c"}
Apr 24 21:29:43.861488 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:43.861379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" event={"ID":"cf5c2fee-0804-4913-a2a0-ee03634c1f56","Type":"ContainerStarted","Data":"91379b5b4d3c2270d1947c28463bdee6c705bfdaf17501026fea9e9b57d80e51"}
Apr 24 21:29:43.861488 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:43.861393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" event={"ID":"cf5c2fee-0804-4913-a2a0-ee03634c1f56","Type":"ContainerStarted","Data":"89913ff13b3ed207b23641c449830573f0254f8a3f7ae3712aa9e2e548da0aa8"}
Apr 24 21:29:44.178030 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.178001 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7df64cf987-gz6g4"]
Apr 24 21:29:44.178294 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:29:44.178272 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-7df64cf987-gz6g4" podUID="63a472ed-761c-4bea-b845-8b5b620d07ab"
Apr 24 21:29:44.866555 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.866471 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerStarted","Data":"fe8f3f5b38641384e5cae6e36df1534f45b677f8f7e5936490d0102327aa369c"}
Apr 24 21:29:44.868872 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.868845 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" event={"ID":"cf5c2fee-0804-4913-a2a0-ee03634c1f56","Type":"ContainerStarted","Data":"784c5e427100bc3cd716de106871886fb914f23840b37a31962c16698d7c9c08"}
Apr 24 21:29:44.868967 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.868879 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" event={"ID":"cf5c2fee-0804-4913-a2a0-ee03634c1f56","Type":"ContainerStarted","Data":"bfc2e16c6feba66ff83f919366b2d5c2676ee86da1e819ed9ca1abc1a1306496"}
Apr 24 21:29:44.868967 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.868894 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" event={"ID":"cf5c2fee-0804-4913-a2a0-ee03634c1f56","Type":"ContainerStarted","Data":"52b4e7df3edf8133543f7cd4cb8c10075fe18b2f9e977e9ee0eccdbe5ac3f87b"}
Apr 24 21:29:44.869068 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.869022 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"
Apr 24 21:29:44.870163 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.870143 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7df64cf987-gz6g4"
Apr 24 21:29:44.870163 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.870152 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" event={"ID":"80c96c6d-7367-4d83-8102-18bfbb2ad8c8","Type":"ContainerStarted","Data":"c74337ecafafc85ede1a58c7167096b94dc36dd0fb474a2f8b490cbe16eee7ce"}
Apr 24 21:29:44.874207 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.874190 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7df64cf987-gz6g4"
Apr 24 21:29:44.905642 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.905595 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.848026004 podStartE2EDuration="6.905582299s" podCreationTimestamp="2026-04-24 21:29:38 +0000 UTC" firstStartedPulling="2026-04-24 21:29:39.179427205 +0000 UTC m=+196.582690191" lastFinishedPulling="2026-04-24 21:29:44.236983501 +0000 UTC m=+201.640246486" observedRunningTime="2026-04-24 21:29:44.903476388 +0000 UTC m=+202.306739382" watchObservedRunningTime="2026-04-24 21:29:44.905582299 +0000 UTC m=+202.308845291"
Apr 24 21:29:44.930136 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.930085 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl" podStartSLOduration=1.680816792 podStartE2EDuration="4.930072072s" podCreationTimestamp="2026-04-24 21:29:40 +0000 UTC" firstStartedPulling="2026-04-24 21:29:40.98587691 +0000 UTC m=+198.389139884" lastFinishedPulling="2026-04-24 21:29:44.235132194 +0000 UTC m=+201.638395164" observedRunningTime="2026-04-24 21:29:44.928853898 +0000 UTC m=+202.332116896" watchObservedRunningTime="2026-04-24 21:29:44.930072072 +0000 UTC m=+202.333335062"
Apr 24 21:29:44.950246 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.950190 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" podStartSLOduration=2.49425581 podStartE2EDuration="3.950173977s" podCreationTimestamp="2026-04-24 21:29:41 +0000 UTC" firstStartedPulling="2026-04-24 21:29:42.779774995 +0000 UTC m=+200.183037968" lastFinishedPulling="2026-04-24 21:29:44.235693161 +0000 UTC m=+201.638956135" observedRunningTime="2026-04-24 21:29:44.949333728 +0000 UTC m=+202.352596720" watchObservedRunningTime="2026-04-24 21:29:44.950173977 +0000 UTC m=+202.353436954"
Apr 24 21:29:44.951129 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.951111 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-image-registry-private-configuration\") pod \"63a472ed-761c-4bea-b845-8b5b620d07ab\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") "
Apr 24 21:29:44.951211 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.951150 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-certificates\") pod \"63a472ed-761c-4bea-b845-8b5b620d07ab\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") "
Apr 24 21:29:44.951211 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.951200 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-trusted-ca\") pod \"63a472ed-761c-4bea-b845-8b5b620d07ab\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") "
Apr 24 21:29:44.951330 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.951232 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqjgq\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-kube-api-access-bqjgq\") pod \"63a472ed-761c-4bea-b845-8b5b620d07ab\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") "
Apr 24 21:29:44.951441 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.951424 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-bound-sa-token\") pod \"63a472ed-761c-4bea-b845-8b5b620d07ab\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") "
Apr 24 21:29:44.951561 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.951548 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-installation-pull-secrets\") pod \"63a472ed-761c-4bea-b845-8b5b620d07ab\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") "
Apr 24 21:29:44.951716 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.951701 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63a472ed-761c-4bea-b845-8b5b620d07ab-ca-trust-extracted\") pod \"63a472ed-761c-4bea-b845-8b5b620d07ab\" (UID: \"63a472ed-761c-4bea-b845-8b5b620d07ab\") "
Apr 24 21:29:44.951917 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.951711 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "63a472ed-761c-4bea-b845-8b5b620d07ab" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:44.951917 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.951890 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63a472ed-761c-4bea-b845-8b5b620d07ab-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "63a472ed-761c-4bea-b845-8b5b620d07ab" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:29:44.952082 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.952059 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "63a472ed-761c-4bea-b845-8b5b620d07ab" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:44.953053 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.953035 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-certificates\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:29:44.953209 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.953193 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63a472ed-761c-4bea-b845-8b5b620d07ab-trusted-ca\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:29:44.953413 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.953398 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63a472ed-761c-4bea-b845-8b5b620d07ab-ca-trust-extracted\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:29:44.954472 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.954091 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "63a472ed-761c-4bea-b845-8b5b620d07ab" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:44.954887 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.954864 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "63a472ed-761c-4bea-b845-8b5b620d07ab" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:44.955096 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.955067 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "63a472ed-761c-4bea-b845-8b5b620d07ab" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:44.955578 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:44.955547 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-kube-api-access-bqjgq" (OuterVolumeSpecName: "kube-api-access-bqjgq") pod "63a472ed-761c-4bea-b845-8b5b620d07ab" (UID: "63a472ed-761c-4bea-b845-8b5b620d07ab"). InnerVolumeSpecName "kube-api-access-bqjgq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:45.054324 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:45.054293 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-image-registry-private-configuration\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:29:45.054324 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:45.054321 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bqjgq\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-kube-api-access-bqjgq\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:29:45.054324 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:45.054331 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-bound-sa-token\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:29:45.054547 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:45.054340 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63a472ed-761c-4bea-b845-8b5b620d07ab-installation-pull-secrets\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:29:45.872927 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:45.872892 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7df64cf987-gz6g4"
Apr 24 21:29:45.914838 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:45.914806 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7df64cf987-gz6g4"]
Apr 24 21:29:45.923123 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:45.920700 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7df64cf987-gz6g4"]
Apr 24 21:29:45.961483 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:45.961452 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63a472ed-761c-4bea-b845-8b5b620d07ab-registry-tls\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:29:47.218955 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:47.218920 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63a472ed-761c-4bea-b845-8b5b620d07ab" path="/var/lib/kubelet/pods/63a472ed-761c-4bea-b845-8b5b620d07ab/volumes"
Apr 24 21:29:50.880234 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:50.880208 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-79bfcc9858-2dmtl"
Apr 24 21:29:51.215121 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.215085 2575 scope.go:117] "RemoveContainer" containerID="d187b68777c231e0e97639e78d03dd2a1eb5c62bb4723e6d494ba414c8ed7c5d"
Apr 24 21:29:51.893678 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.893649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:29:51.894051 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.893749 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" event={"ID":"057fa386-6c87-478a-91d9-c2293ba0617c","Type":"ContainerStarted","Data":"025cade8df3b058c0e204b1b5e9f0b127875d8913216b5a9bf6cb8fd1835e761"}
Apr 24 21:29:51.894051 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.894013 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7"
Apr 24 21:29:51.899754 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.899722 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7"
Apr 24 21:29:51.925021 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.924963 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-mksv7" podStartSLOduration=55.682748509 podStartE2EDuration="57.924947472s" podCreationTimestamp="2026-04-24 21:28:54 +0000 UTC" firstStartedPulling="2026-04-24 21:28:54.499539264 +0000 UTC m=+151.902802238" lastFinishedPulling="2026-04-24 21:28:56.741738226 +0000 UTC m=+154.145001201" observedRunningTime="2026-04-24 21:29:51.922851619 +0000 UTC m=+209.326114612" watchObservedRunningTime="2026-04-24 21:29:51.924947472 +0000 UTC m=+209.328210466"
Apr 24 21:29:51.956417 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.956381 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-dp4l7"]
Apr 24 21:29:51.959665 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.959645 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-dp4l7"
Apr 24 21:29:51.962891 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.962872 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 21:29:51.963123 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.963106 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 21:29:51.963123 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.963126 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-vj878\""
Apr 24 21:29:51.971159 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:51.971137 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-dp4l7"]
Apr 24 21:29:52.010583 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:52.010560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsmn6\" (UniqueName: \"kubernetes.io/projected/ea0e6e62-3775-4664-9ee5-50d8d29af322-kube-api-access-xsmn6\") pod \"downloads-6bcc868b7-dp4l7\" (UID: \"ea0e6e62-3775-4664-9ee5-50d8d29af322\") " pod="openshift-console/downloads-6bcc868b7-dp4l7"
Apr 24 21:29:52.111828 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:52.111790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xsmn6\" (UniqueName: \"kubernetes.io/projected/ea0e6e62-3775-4664-9ee5-50d8d29af322-kube-api-access-xsmn6\") pod \"downloads-6bcc868b7-dp4l7\" (UID: \"ea0e6e62-3775-4664-9ee5-50d8d29af322\") " pod="openshift-console/downloads-6bcc868b7-dp4l7"
Apr 24 21:29:52.123228 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:52.123203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsmn6\" (UniqueName: \"kubernetes.io/projected/ea0e6e62-3775-4664-9ee5-50d8d29af322-kube-api-access-xsmn6\") pod \"downloads-6bcc868b7-dp4l7\" (UID: \"ea0e6e62-3775-4664-9ee5-50d8d29af322\") " pod="openshift-console/downloads-6bcc868b7-dp4l7"
Apr 24 21:29:52.270033 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:52.269942 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-dp4l7"
Apr 24 21:29:52.388783 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:52.388628 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-dp4l7"]
Apr 24 21:29:52.391409 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:29:52.391377 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea0e6e62_3775_4664_9ee5_50d8d29af322.slice/crio-a63f25bd36d97c67572a2b1e498408c90f52495f92f2e35b7764539cbd188233 WatchSource:0}: Error finding container a63f25bd36d97c67572a2b1e498408c90f52495f92f2e35b7764539cbd188233: Status 404 returned error can't find the container with id a63f25bd36d97c67572a2b1e498408c90f52495f92f2e35b7764539cbd188233
Apr 24 21:29:52.898223 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:52.898187 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-dp4l7" event={"ID":"ea0e6e62-3775-4664-9ee5-50d8d29af322","Type":"ContainerStarted","Data":"a63f25bd36d97c67572a2b1e498408c90f52495f92f2e35b7764539cbd188233"}
Apr 24 21:29:58.092247 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.092205 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5984f5fbdd-tc262"]
Apr 24 21:29:58.096202 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.096180 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.098838 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.098806 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 21:29:58.099119 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.099087 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 21:29:58.099233 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.099116 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 21:29:58.099233 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.099187 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-xhl4w\""
Apr 24 21:29:58.099375 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.099270 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 21:29:58.099431 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.099391 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 21:29:58.113845 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.113820 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5984f5fbdd-tc262"]
Apr 24 21:29:58.169090 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.169054 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-serving-cert\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.169302 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.169104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-oauth-config\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.169302 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.169228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-console-config\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.169302 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.169287 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-oauth-serving-cert\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.169454 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.169337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlqm2\" (UniqueName: \"kubernetes.io/projected/8444c1b6-0644-4e6b-a984-83820a012a2c-kube-api-access-rlqm2\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.169454 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.169376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-service-ca\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.270720 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.270677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-oauth-serving-cert\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.270890 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.270754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlqm2\" (UniqueName: \"kubernetes.io/projected/8444c1b6-0644-4e6b-a984-83820a012a2c-kube-api-access-rlqm2\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.270890 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.270794 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-service-ca\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.270890 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.270821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-serving-cert\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.270890 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.270862 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-oauth-config\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.271104 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.270948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-console-config\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.271557 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.271528 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-oauth-serving-cert\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.271717 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.271594 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-console-config\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.271790 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.271749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-service-ca\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.273879 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.273852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-oauth-config\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.273983 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.273900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-serving-cert\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.282249 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.282227 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlqm2\" (UniqueName: \"kubernetes.io/projected/8444c1b6-0644-4e6b-a984-83820a012a2c-kube-api-access-rlqm2\") pod \"console-5984f5fbdd-tc262\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.407328 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.407298 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5984f5fbdd-tc262"
Apr 24 21:29:58.560659 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.560627 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5984f5fbdd-tc262"]
Apr 24 21:29:58.570351 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:29:58.570316 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8444c1b6_0644_4e6b_a984_83820a012a2c.slice/crio-033ddc0aba92b6b703a326c116a98adf1517431f4d633fd68b9c17aa8afaaeb9 WatchSource:0}: Error finding container 033ddc0aba92b6b703a326c116a98adf1517431f4d633fd68b9c17aa8afaaeb9: Status 404 returned error can't find the container with id 033ddc0aba92b6b703a326c116a98adf1517431f4d633fd68b9c17aa8afaaeb9
Apr 24 21:29:58.917882 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:29:58.917842 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5984f5fbdd-tc262" event={"ID":"8444c1b6-0644-4e6b-a984-83820a012a2c","Type":"ContainerStarted","Data":"033ddc0aba92b6b703a326c116a98adf1517431f4d633fd68b9c17aa8afaaeb9"}
Apr 24 21:30:02.175979 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:02.175941 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv"
Apr 24 21:30:02.176483 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:02.175992 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv"
Apr 24 21:30:07.344001 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.343960 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-547fdbb86f-s6v5z"]
Apr 24 21:30:07.347367 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.347333 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.360279 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.360236 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 21:30:07.367648 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.367622 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-547fdbb86f-s6v5z"]
Apr 24 21:30:07.455591 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.455559 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-trusted-ca-bundle\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.455591 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.455604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv4d8\" (UniqueName: \"kubernetes.io/projected/3aef274d-51a0-4545-8b71-94a8ff952a0d-kube-api-access-sv4d8\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.455855 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.455715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-service-ca\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.455855 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.455797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-oauth-config\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.455855 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.455850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-oauth-serving-cert\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.456004 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.455878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-serving-cert\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.456004 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.455921 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-config\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.556710 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.556668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-trusted-ca-bundle\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.556893 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.556722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv4d8\" (UniqueName: \"kubernetes.io/projected/3aef274d-51a0-4545-8b71-94a8ff952a0d-kube-api-access-sv4d8\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.556893 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.556790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-service-ca\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.557012 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.556890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-oauth-config\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.557012 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.556949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-oauth-serving-cert\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.557012 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.556980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-serving-cert\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.557164 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.557024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-config\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.557729 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.557689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-trusted-ca-bundle\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.557923 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.557822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-oauth-serving-cert\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.558182 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.558160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-config\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z"
Apr 24 21:30:07.558327 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.558307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName:
\"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z" Apr 24 21:30:07.559673 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.559643 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-oauth-config\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z" Apr 24 21:30:07.559928 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.559907 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-serving-cert\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z" Apr 24 21:30:07.566054 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.566031 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv4d8\" (UniqueName: \"kubernetes.io/projected/3aef274d-51a0-4545-8b71-94a8ff952a0d-kube-api-access-sv4d8\") pod \"console-547fdbb86f-s6v5z\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " pod="openshift-console/console-547fdbb86f-s6v5z" Apr 24 21:30:07.658404 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:07.658302 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-547fdbb86f-s6v5z" Apr 24 21:30:08.417112 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:08.417086 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-547fdbb86f-s6v5z"] Apr 24 21:30:08.420036 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:30:08.420014 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aef274d_51a0_4545_8b71_94a8ff952a0d.slice/crio-edc2567746881b2bdc9e671177666db2f35ec141b25585471329fc585d8889ee WatchSource:0}: Error finding container edc2567746881b2bdc9e671177666db2f35ec141b25585471329fc585d8889ee: Status 404 returned error can't find the container with id edc2567746881b2bdc9e671177666db2f35ec141b25585471329fc585d8889ee Apr 24 21:30:08.951955 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:08.951912 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5984f5fbdd-tc262" event={"ID":"8444c1b6-0644-4e6b-a984-83820a012a2c","Type":"ContainerStarted","Data":"fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f"} Apr 24 21:30:08.953540 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:08.953508 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-dp4l7" event={"ID":"ea0e6e62-3775-4664-9ee5-50d8d29af322","Type":"ContainerStarted","Data":"707c36e4d27e13e53c8a03d0bc659ec4b9739d4773cffb2ed592757f44d32aaa"} Apr 24 21:30:08.953827 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:08.953807 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-dp4l7" Apr 24 21:30:08.955122 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:08.955092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547fdbb86f-s6v5z" 
event={"ID":"3aef274d-51a0-4545-8b71-94a8ff952a0d","Type":"ContainerStarted","Data":"4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372"} Apr 24 21:30:08.955239 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:08.955124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547fdbb86f-s6v5z" event={"ID":"3aef274d-51a0-4545-8b71-94a8ff952a0d","Type":"ContainerStarted","Data":"edc2567746881b2bdc9e671177666db2f35ec141b25585471329fc585d8889ee"} Apr 24 21:30:08.971792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:08.971749 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5984f5fbdd-tc262" podStartSLOduration=1.262595487 podStartE2EDuration="10.971735374s" podCreationTimestamp="2026-04-24 21:29:58 +0000 UTC" firstStartedPulling="2026-04-24 21:29:58.572531338 +0000 UTC m=+215.975794315" lastFinishedPulling="2026-04-24 21:30:08.281671221 +0000 UTC m=+225.684934202" observedRunningTime="2026-04-24 21:30:08.971178175 +0000 UTC m=+226.374441168" watchObservedRunningTime="2026-04-24 21:30:08.971735374 +0000 UTC m=+226.374998367" Apr 24 21:30:08.971904 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:08.971810 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-dp4l7" Apr 24 21:30:08.991988 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:08.991942 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-547fdbb86f-s6v5z" podStartSLOduration=1.991923662 podStartE2EDuration="1.991923662s" podCreationTimestamp="2026-04-24 21:30:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:08.98973871 +0000 UTC m=+226.393001705" watchObservedRunningTime="2026-04-24 21:30:08.991923662 +0000 UTC m=+226.395186687" Apr 24 21:30:09.011182 ip-10-0-131-58 kubenswrapper[2575]: I0424 
21:30:09.011106 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-dp4l7" podStartSLOduration=2.062148819 podStartE2EDuration="18.011088713s" podCreationTimestamp="2026-04-24 21:29:51 +0000 UTC" firstStartedPulling="2026-04-24 21:29:52.393578308 +0000 UTC m=+209.796841282" lastFinishedPulling="2026-04-24 21:30:08.342518203 +0000 UTC m=+225.745781176" observedRunningTime="2026-04-24 21:30:09.008729656 +0000 UTC m=+226.411992662" watchObservedRunningTime="2026-04-24 21:30:09.011088713 +0000 UTC m=+226.414351705" Apr 24 21:30:17.658555 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:17.658522 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-547fdbb86f-s6v5z" Apr 24 21:30:17.658989 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:17.658567 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-547fdbb86f-s6v5z" Apr 24 21:30:17.663201 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:17.663180 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-547fdbb86f-s6v5z" Apr 24 21:30:17.982746 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:17.982652 2575 generic.go:358] "Generic (PLEG): container finished" podID="1ce69927-aa5a-4504-8d79-b16c4a102d54" containerID="bb2ac1e927420dbfd8a5a2ead6e974a23ac3b86faaeb94468f1235ddb12f27b1" exitCode=0 Apr 24 21:30:17.982746 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:17.982730 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" event={"ID":"1ce69927-aa5a-4504-8d79-b16c4a102d54","Type":"ContainerDied","Data":"bb2ac1e927420dbfd8a5a2ead6e974a23ac3b86faaeb94468f1235ddb12f27b1"} Apr 24 21:30:17.983190 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:17.983158 2575 scope.go:117] "RemoveContainer" 
containerID="bb2ac1e927420dbfd8a5a2ead6e974a23ac3b86faaeb94468f1235ddb12f27b1" Apr 24 21:30:17.986889 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:17.986870 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-547fdbb86f-s6v5z" Apr 24 21:30:18.074076 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:18.074052 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5984f5fbdd-tc262"] Apr 24 21:30:18.408205 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:18.408178 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5984f5fbdd-tc262" Apr 24 21:30:18.987063 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:18.987030 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zw7ml" event={"ID":"1ce69927-aa5a-4504-8d79-b16c4a102d54","Type":"ContainerStarted","Data":"408c1c62791d36c3a2901befa5c606b0497dbbeaa630a142b2a53d611e8a2f76"} Apr 24 21:30:19.100189 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:19.100160 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_73b2681e-5a49-49f4-8f25-35ce1354f6b5/init-config-reloader/0.log" Apr 24 21:30:19.105720 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:19.105697 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_73b2681e-5a49-49f4-8f25-35ce1354f6b5/alertmanager/0.log" Apr 24 21:30:19.277173 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:19.277080 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_73b2681e-5a49-49f4-8f25-35ce1354f6b5/config-reloader/0.log" Apr 24 21:30:19.469177 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:19.469143 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_73b2681e-5a49-49f4-8f25-35ce1354f6b5/kube-rbac-proxy-web/0.log" Apr 24 21:30:19.669372 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:19.669328 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_73b2681e-5a49-49f4-8f25-35ce1354f6b5/kube-rbac-proxy/0.log" Apr 24 21:30:19.869072 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:19.869020 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_73b2681e-5a49-49f4-8f25-35ce1354f6b5/kube-rbac-proxy-metric/0.log" Apr 24 21:30:20.069033 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:20.068960 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_73b2681e-5a49-49f4-8f25-35ce1354f6b5/prom-label-proxy/0.log" Apr 24 21:30:20.469621 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:20.469588 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jrjgl_6e5f2196-7905-4092-b23c-1f63b39dc528/kube-state-metrics/0.log" Apr 24 21:30:20.668618 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:20.668561 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jrjgl_6e5f2196-7905-4092-b23c-1f63b39dc528/kube-rbac-proxy-main/0.log" Apr 24 21:30:20.868924 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:20.868851 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jrjgl_6e5f2196-7905-4092-b23c-1f63b39dc528/kube-rbac-proxy-self/0.log" Apr 24 21:30:21.069523 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:21.069496 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-54b7659cfc-zjvjv_80c96c6d-7367-4d83-8102-18bfbb2ad8c8/metrics-server/0.log" Apr 24 21:30:22.068625 ip-10-0-131-58 kubenswrapper[2575]: I0424 
21:30:22.068597 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kkghd_14046a77-27b2-4686-90d9-2b6f59d97707/init-textfile/0.log" Apr 24 21:30:22.181553 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:22.181526 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:30:22.185305 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:22.185281 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-54b7659cfc-zjvjv" Apr 24 21:30:22.269342 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:22.269303 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kkghd_14046a77-27b2-4686-90d9-2b6f59d97707/node-exporter/0.log" Apr 24 21:30:22.468677 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:22.468637 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kkghd_14046a77-27b2-4686-90d9-2b6f59d97707/kube-rbac-proxy/0.log" Apr 24 21:30:23.269007 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:23.268957 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t4gjz_cf2ad258-2bb0-493a-98f7-41c11632fd79/kube-rbac-proxy-main/0.log" Apr 24 21:30:23.468586 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:23.468555 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t4gjz_cf2ad258-2bb0-493a-98f7-41c11632fd79/kube-rbac-proxy-self/0.log" Apr 24 21:30:23.670305 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:23.670270 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t4gjz_cf2ad258-2bb0-493a-98f7-41c11632fd79/openshift-state-metrics/0.log" Apr 24 21:30:25.672074 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:25.672016 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-xghjh_8edc3df6-4a9e-45bf-bfea-3bbff392cd1d/prometheus-operator-admission-webhook/0.log" Apr 24 21:30:26.469244 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:26.469217 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/thanos-query/0.log" Apr 24 21:30:26.669848 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:26.669811 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/kube-rbac-proxy-web/0.log" Apr 24 21:30:26.870011 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:26.869908 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/kube-rbac-proxy/0.log" Apr 24 21:30:27.069381 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:27.069352 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/prom-label-proxy/0.log" Apr 24 21:30:27.269428 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:27.269388 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/kube-rbac-proxy-rules/0.log" Apr 24 21:30:27.468656 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:27.468622 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/kube-rbac-proxy-metrics/0.log" Apr 24 21:30:27.869027 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:27.868997 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log" Apr 24 21:30:28.070810 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:28.070778 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/3.log" Apr 24 21:30:28.270669 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:28.270640 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-547fdbb86f-s6v5z_3aef274d-51a0-4545-8b71-94a8ff952a0d/console/0.log" Apr 24 21:30:28.468673 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:28.468649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5984f5fbdd-tc262_8444c1b6-0644-4e6b-a984-83820a012a2c/console/0.log" Apr 24 21:30:28.672763 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:28.672737 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-dp4l7_ea0e6e62-3775-4664-9ee5-50d8d29af322/download-server/0.log" Apr 24 21:30:34.009476 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:34.009447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:30:34.011666 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:34.011638 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ad3b358-912b-477a-8fc3-6f2910580c33-metrics-certs\") pod \"network-metrics-daemon-nz6dq\" (UID: \"3ad3b358-912b-477a-8fc3-6f2910580c33\") " pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:30:34.224674 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:30:34.224641 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vlmsd\"" Apr 24 21:30:34.229878 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:34.229863 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nz6dq" Apr 24 21:30:34.345969 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:34.345938 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nz6dq"] Apr 24 21:30:34.348777 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:30:34.348743 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ad3b358_912b_477a_8fc3_6f2910580c33.slice/crio-530ef9f57dbf4e551f8defc1b5ca96fea42bc25b93ee8d2e5c81895a51f24f45 WatchSource:0}: Error finding container 530ef9f57dbf4e551f8defc1b5ca96fea42bc25b93ee8d2e5c81895a51f24f45: Status 404 returned error can't find the container with id 530ef9f57dbf4e551f8defc1b5ca96fea42bc25b93ee8d2e5c81895a51f24f45 Apr 24 21:30:35.037934 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:35.037899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nz6dq" event={"ID":"3ad3b358-912b-477a-8fc3-6f2910580c33","Type":"ContainerStarted","Data":"530ef9f57dbf4e551f8defc1b5ca96fea42bc25b93ee8d2e5c81895a51f24f45"} Apr 24 21:30:36.042806 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:36.042771 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nz6dq" event={"ID":"3ad3b358-912b-477a-8fc3-6f2910580c33","Type":"ContainerStarted","Data":"6a0ecdba9bc4c12c677f80200e9bf9e21bbbf165a22f75ea79f2942647be5c23"} Apr 24 21:30:37.048327 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:37.048289 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-nz6dq" event={"ID":"3ad3b358-912b-477a-8fc3-6f2910580c33","Type":"ContainerStarted","Data":"321c63654d1efb76a209e9d758b85ec68cfd6317411b239c2b22dcc506f21978"} Apr 24 21:30:37.066684 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:37.066631 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nz6dq" podStartSLOduration=252.649287427 podStartE2EDuration="4m14.066614821s" podCreationTimestamp="2026-04-24 21:26:23 +0000 UTC" firstStartedPulling="2026-04-24 21:30:34.350653425 +0000 UTC m=+251.753916410" lastFinishedPulling="2026-04-24 21:30:35.767980831 +0000 UTC m=+253.171243804" observedRunningTime="2026-04-24 21:30:37.064324988 +0000 UTC m=+254.467587980" watchObservedRunningTime="2026-04-24 21:30:37.066614821 +0000 UTC m=+254.469877814" Apr 24 21:30:43.094036 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.093974 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5984f5fbdd-tc262" podUID="8444c1b6-0644-4e6b-a984-83820a012a2c" containerName="console" containerID="cri-o://fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f" gracePeriod=15 Apr 24 21:30:43.367331 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.367311 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5984f5fbdd-tc262_8444c1b6-0644-4e6b-a984-83820a012a2c/console/0.log" Apr 24 21:30:43.367444 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.367369 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5984f5fbdd-tc262" Apr 24 21:30:43.498099 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.498065 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlqm2\" (UniqueName: \"kubernetes.io/projected/8444c1b6-0644-4e6b-a984-83820a012a2c-kube-api-access-rlqm2\") pod \"8444c1b6-0644-4e6b-a984-83820a012a2c\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " Apr 24 21:30:43.498099 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.498103 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-serving-cert\") pod \"8444c1b6-0644-4e6b-a984-83820a012a2c\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " Apr 24 21:30:43.498373 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.498124 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-oauth-serving-cert\") pod \"8444c1b6-0644-4e6b-a984-83820a012a2c\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " Apr 24 21:30:43.498373 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.498159 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-console-config\") pod \"8444c1b6-0644-4e6b-a984-83820a012a2c\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " Apr 24 21:30:43.498373 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.498218 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-service-ca\") pod \"8444c1b6-0644-4e6b-a984-83820a012a2c\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " Apr 24 21:30:43.498373 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:30:43.498304 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-oauth-config\") pod \"8444c1b6-0644-4e6b-a984-83820a012a2c\" (UID: \"8444c1b6-0644-4e6b-a984-83820a012a2c\") " Apr 24 21:30:43.498638 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.498607 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-console-config" (OuterVolumeSpecName: "console-config") pod "8444c1b6-0644-4e6b-a984-83820a012a2c" (UID: "8444c1b6-0644-4e6b-a984-83820a012a2c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:43.498717 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.498654 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-service-ca" (OuterVolumeSpecName: "service-ca") pod "8444c1b6-0644-4e6b-a984-83820a012a2c" (UID: "8444c1b6-0644-4e6b-a984-83820a012a2c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:43.498717 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.498652 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8444c1b6-0644-4e6b-a984-83820a012a2c" (UID: "8444c1b6-0644-4e6b-a984-83820a012a2c"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:43.500526 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.500487 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8444c1b6-0644-4e6b-a984-83820a012a2c" (UID: "8444c1b6-0644-4e6b-a984-83820a012a2c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:43.500635 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.500610 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8444c1b6-0644-4e6b-a984-83820a012a2c" (UID: "8444c1b6-0644-4e6b-a984-83820a012a2c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:43.500635 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.500613 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8444c1b6-0644-4e6b-a984-83820a012a2c-kube-api-access-rlqm2" (OuterVolumeSpecName: "kube-api-access-rlqm2") pod "8444c1b6-0644-4e6b-a984-83820a012a2c" (UID: "8444c1b6-0644-4e6b-a984-83820a012a2c"). InnerVolumeSpecName "kube-api-access-rlqm2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:43.599671 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.599645 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-service-ca\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:30:43.599671 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.599665 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-oauth-config\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:30:43.599671 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.599675 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rlqm2\" (UniqueName: \"kubernetes.io/projected/8444c1b6-0644-4e6b-a984-83820a012a2c-kube-api-access-rlqm2\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:30:43.599856 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.599684 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8444c1b6-0644-4e6b-a984-83820a012a2c-console-serving-cert\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:30:43.599856 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.599693 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-oauth-serving-cert\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:30:43.599856 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:43.599702 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8444c1b6-0644-4e6b-a984-83820a012a2c-console-config\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:30:44.071783 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:30:44.071758 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5984f5fbdd-tc262_8444c1b6-0644-4e6b-a984-83820a012a2c/console/0.log" Apr 24 21:30:44.071969 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:44.071799 2575 generic.go:358] "Generic (PLEG): container finished" podID="8444c1b6-0644-4e6b-a984-83820a012a2c" containerID="fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f" exitCode=2 Apr 24 21:30:44.071969 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:44.071870 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5984f5fbdd-tc262" event={"ID":"8444c1b6-0644-4e6b-a984-83820a012a2c","Type":"ContainerDied","Data":"fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f"} Apr 24 21:30:44.071969 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:44.071876 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5984f5fbdd-tc262" Apr 24 21:30:44.071969 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:44.071895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5984f5fbdd-tc262" event={"ID":"8444c1b6-0644-4e6b-a984-83820a012a2c","Type":"ContainerDied","Data":"033ddc0aba92b6b703a326c116a98adf1517431f4d633fd68b9c17aa8afaaeb9"} Apr 24 21:30:44.071969 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:44.071911 2575 scope.go:117] "RemoveContainer" containerID="fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f" Apr 24 21:30:44.080036 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:44.080018 2575 scope.go:117] "RemoveContainer" containerID="fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f" Apr 24 21:30:44.080334 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:30:44.080311 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f\": container with ID starting with fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f not found: ID does not exist" containerID="fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f" Apr 24 21:30:44.080412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:44.080340 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f"} err="failed to get container status \"fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f\": rpc error: code = NotFound desc = could not find container \"fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f\": container with ID starting with fb1b2830c4e79e351fc7901594114450021dd32a7eb8842091d2c71e60ca404f not found: ID does not exist" Apr 24 21:30:44.095819 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:44.095795 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5984f5fbdd-tc262"] Apr 24 21:30:44.099722 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:44.099702 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5984f5fbdd-tc262"] Apr 24 21:30:45.219043 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:45.219008 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8444c1b6-0644-4e6b-a984-83820a012a2c" path="/var/lib/kubelet/pods/8444c1b6-0644-4e6b-a984-83820a012a2c/volumes" Apr 24 21:30:57.830818 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:57.830781 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 21:30:57.831294 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:57.831236 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="alertmanager" 
containerID="cri-o://2008bebba1b96ab6c639f1cc5ea1dabfb5ff694b729a59c2bb17f8fc353be9ed" gracePeriod=120 Apr 24 21:30:57.831380 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:57.831317 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy-metric" containerID="cri-o://860a7f4dd16aa93fbb15a9b6ed33d30c155fe356ce3f5c877a9995c5e88b0ba2" gracePeriod=120 Apr 24 21:30:57.831380 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:57.831367 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="prom-label-proxy" containerID="cri-o://fe8f3f5b38641384e5cae6e36df1534f45b677f8f7e5936490d0102327aa369c" gracePeriod=120 Apr 24 21:30:57.831491 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:57.831318 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy-web" containerID="cri-o://452aac210787d9695b8fbb32a8d3cd3b4bb40faf18be24d4b09d755d8f31f150" gracePeriod=120 Apr 24 21:30:57.831491 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:57.831421 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy" containerID="cri-o://693c95a2dcde14339d079024271ac97ab91001618eb94916aa9bcc31b738fbde" gracePeriod=120 Apr 24 21:30:57.831491 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:57.831350 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="config-reloader" 
containerID="cri-o://b49dee828f914c5c4dd4bd5241c80312f45a4b33b8265b619d155b8e32de150b" gracePeriod=120 Apr 24 21:30:58.114936 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:58.114856 2575 generic.go:358] "Generic (PLEG): container finished" podID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerID="fe8f3f5b38641384e5cae6e36df1534f45b677f8f7e5936490d0102327aa369c" exitCode=0 Apr 24 21:30:58.114936 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:58.114881 2575 generic.go:358] "Generic (PLEG): container finished" podID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerID="693c95a2dcde14339d079024271ac97ab91001618eb94916aa9bcc31b738fbde" exitCode=0 Apr 24 21:30:58.114936 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:58.114888 2575 generic.go:358] "Generic (PLEG): container finished" podID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerID="b49dee828f914c5c4dd4bd5241c80312f45a4b33b8265b619d155b8e32de150b" exitCode=0 Apr 24 21:30:58.114936 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:58.114894 2575 generic.go:358] "Generic (PLEG): container finished" podID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerID="2008bebba1b96ab6c639f1cc5ea1dabfb5ff694b729a59c2bb17f8fc353be9ed" exitCode=0 Apr 24 21:30:58.115161 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:58.114930 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerDied","Data":"fe8f3f5b38641384e5cae6e36df1534f45b677f8f7e5936490d0102327aa369c"} Apr 24 21:30:58.115161 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:58.114970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerDied","Data":"693c95a2dcde14339d079024271ac97ab91001618eb94916aa9bcc31b738fbde"} Apr 24 21:30:58.115161 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:58.114981 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerDied","Data":"b49dee828f914c5c4dd4bd5241c80312f45a4b33b8265b619d155b8e32de150b"} Apr 24 21:30:58.115161 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:58.114990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerDied","Data":"2008bebba1b96ab6c639f1cc5ea1dabfb5ff694b729a59c2bb17f8fc353be9ed"} Apr 24 21:30:59.121427 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.121386 2575 generic.go:358] "Generic (PLEG): container finished" podID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerID="860a7f4dd16aa93fbb15a9b6ed33d30c155fe356ce3f5c877a9995c5e88b0ba2" exitCode=0 Apr 24 21:30:59.121427 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.121416 2575 generic.go:358] "Generic (PLEG): container finished" podID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerID="452aac210787d9695b8fbb32a8d3cd3b4bb40faf18be24d4b09d755d8f31f150" exitCode=0 Apr 24 21:30:59.121815 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.121442 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerDied","Data":"860a7f4dd16aa93fbb15a9b6ed33d30c155fe356ce3f5c877a9995c5e88b0ba2"} Apr 24 21:30:59.121815 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.121474 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerDied","Data":"452aac210787d9695b8fbb32a8d3cd3b4bb40faf18be24d4b09d755d8f31f150"} Apr 24 21:30:59.564415 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.564051 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 21:30:59.564415 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.564400 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7657f89d8-knsrl"] Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.564986 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy-metric" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565009 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy-metric" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565023 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565032 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565047 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="alertmanager" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565056 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="alertmanager" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565067 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy-web" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565094 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy-web" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565110 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="config-reloader" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565118 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="config-reloader" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565128 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="prom-label-proxy" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565136 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="prom-label-proxy" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565152 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="init-config-reloader" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565160 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="init-config-reloader" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565170 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8444c1b6-0644-4e6b-a984-83820a012a2c" containerName="console" Apr 24 21:30:59.565299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565179 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8444c1b6-0644-4e6b-a984-83820a012a2c" containerName="console" Apr 24 21:30:59.566224 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565315 2575 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="alertmanager" Apr 24 21:30:59.566224 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565332 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="config-reloader" Apr 24 21:30:59.566224 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565342 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy-metric" Apr 24 21:30:59.566224 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565351 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="prom-label-proxy" Apr 24 21:30:59.566224 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565360 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8444c1b6-0644-4e6b-a984-83820a012a2c" containerName="console" Apr 24 21:30:59.566224 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565373 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy" Apr 24 21:30:59.566224 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.565382 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" containerName="kube-rbac-proxy-web" Apr 24 21:30:59.568821 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.568781 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:30:59.583454 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.583429 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7657f89d8-knsrl"] Apr 24 21:30:59.638154 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638133 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwzrm\" (UniqueName: \"kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-kube-api-access-wwzrm\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638339 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638166 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638339 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638187 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-main-db\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638339 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638235 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-web-config\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638339 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638275 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-cluster-tls-config\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638339 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638339 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-main-tls\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638619 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638363 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-metrics-client-ca\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638619 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638399 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-web\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638619 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638435 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638619 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638461 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-trusted-ca-bundle\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638619 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638489 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-volume\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638619 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638513 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-out\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.638619 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638541 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:30:59.638619 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638552 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-tls-assets\") pod \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\" (UID: \"73b2681e-5a49-49f4-8f25-35ce1354f6b5\") " Apr 24 21:30:59.639025 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638658 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-service-ca\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:30:59.639025 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmptm\" (UniqueName: \"kubernetes.io/projected/2b052089-9e8d-4e46-8ee9-849d8cc462a2-kube-api-access-bmptm\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:30:59.639025 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-config\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:30:59.639025 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-serving-cert\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:30:59.639025 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-oauth-serving-cert\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:30:59.639025 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.638908 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:59.639025 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.639019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-trusted-ca-bundle\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:30:59.639433 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.639054 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-oauth-config\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:30:59.639433 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.639115 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:30:59.639433 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.639133 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-alertmanager-main-db\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:30:59.639725 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.639675 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:59.642581 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.642550 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:59.642679 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.642579 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:59.642679 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.642649 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:59.642679 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.642664 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:59.642886 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.642785 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-volume" (OuterVolumeSpecName: "config-volume") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:59.643047 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.643028 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-out" (OuterVolumeSpecName: "config-out") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:30:59.643272 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.643232 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-kube-api-access-wwzrm" (OuterVolumeSpecName: "kube-api-access-wwzrm") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "kube-api-access-wwzrm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:59.643515 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.643479 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:59.647608 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.647586 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:59.653459 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.653433 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-web-config" (OuterVolumeSpecName: "web-config") pod "73b2681e-5a49-49f4-8f25-35ce1354f6b5" (UID: "73b2681e-5a49-49f4-8f25-35ce1354f6b5"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:59.740091 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-serving-cert\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:30:59.740091 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740090 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-oauth-serving-cert\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:30:59.740354 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-trusted-ca-bundle\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.740354 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740283 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-oauth-config\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.740354 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-service-ca\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.740354 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmptm\" (UniqueName: \"kubernetes.io/projected/2b052089-9e8d-4e46-8ee9-849d8cc462a2-kube-api-access-bmptm\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.740562 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-config\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.740562 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740478 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-web-config\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:30:59.740562 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740494 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-cluster-tls-config\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:30:59.740562 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740509 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-main-tls\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:30:59.740562 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740523 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73b2681e-5a49-49f4-8f25-35ce1354f6b5-metrics-client-ca\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:30:59.740562 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740538 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:30:59.740562 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740556 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:30:59.740882 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740571 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-volume\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:30:59.740882 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740584 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73b2681e-5a49-49f4-8f25-35ce1354f6b5-config-out\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:30:59.740882 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740596 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-tls-assets\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:30:59.740882 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740611 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wwzrm\" (UniqueName: \"kubernetes.io/projected/73b2681e-5a49-49f4-8f25-35ce1354f6b5-kube-api-access-wwzrm\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:30:59.740882 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740626 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73b2681e-5a49-49f4-8f25-35ce1354f6b5-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:30:59.740882 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.740820 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-oauth-serving-cert\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.741125 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.741012 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-service-ca\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.741219 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.741190 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-config\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.741463 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.741440 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-trusted-ca-bundle\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.742696 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.742671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-serving-cert\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.742770 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.742724 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-oauth-config\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.747837 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.747817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmptm\" (UniqueName: \"kubernetes.io/projected/2b052089-9e8d-4e46-8ee9-849d8cc462a2-kube-api-access-bmptm\") pod \"console-7657f89d8-knsrl\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:30:59.877884 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:30:59.877838 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:31:00.008801 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.008717 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7657f89d8-knsrl"]
Apr 24 21:31:00.011129 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:31:00.011102 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b052089_9e8d_4e46_8ee9_849d8cc462a2.slice/crio-0f0c2f3c7b8bbd41f03417bdef650e1e876958e952cadaa04bef11c3772ca66e WatchSource:0}: Error finding container 0f0c2f3c7b8bbd41f03417bdef650e1e876958e952cadaa04bef11c3772ca66e: Status 404 returned error can't find the container with id 0f0c2f3c7b8bbd41f03417bdef650e1e876958e952cadaa04bef11c3772ca66e
Apr 24 21:31:00.128229 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.128020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"73b2681e-5a49-49f4-8f25-35ce1354f6b5","Type":"ContainerDied","Data":"feecabb3b8449c22cb9222db77466e23a3315940d00f70fd5ee77e2bab8b3c3c"}
Apr 24 21:31:00.128229 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.128061 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.128229 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.128087 2575 scope.go:117] "RemoveContainer" containerID="fe8f3f5b38641384e5cae6e36df1534f45b677f8f7e5936490d0102327aa369c"
Apr 24 21:31:00.133581 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.133550 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7657f89d8-knsrl" event={"ID":"2b052089-9e8d-4e46-8ee9-849d8cc462a2","Type":"ContainerStarted","Data":"20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae"}
Apr 24 21:31:00.133709 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.133589 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7657f89d8-knsrl" event={"ID":"2b052089-9e8d-4e46-8ee9-849d8cc462a2","Type":"ContainerStarted","Data":"0f0c2f3c7b8bbd41f03417bdef650e1e876958e952cadaa04bef11c3772ca66e"}
Apr 24 21:31:00.139278 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.139233 2575 scope.go:117] "RemoveContainer" containerID="860a7f4dd16aa93fbb15a9b6ed33d30c155fe356ce3f5c877a9995c5e88b0ba2"
Apr 24 21:31:00.146973 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.146953 2575 scope.go:117] "RemoveContainer" containerID="693c95a2dcde14339d079024271ac97ab91001618eb94916aa9bcc31b738fbde"
Apr 24 21:31:00.150116 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:31:00.150093 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73b2681e_5a49_49f4_8f25_35ce1354f6b5.slice\": RecentStats: unable to find data in memory cache]"
Apr 24 21:31:00.153936 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.153881 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7657f89d8-knsrl" podStartSLOduration=1.153866597 podStartE2EDuration="1.153866597s" podCreationTimestamp="2026-04-24 21:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:00.152311386 +0000 UTC m=+277.555574379" watchObservedRunningTime="2026-04-24 21:31:00.153866597 +0000 UTC m=+277.557129592"
Apr 24 21:31:00.154022 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.153958 2575 scope.go:117] "RemoveContainer" containerID="452aac210787d9695b8fbb32a8d3cd3b4bb40faf18be24d4b09d755d8f31f150"
Apr 24 21:31:00.162950 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.162917 2575 scope.go:117] "RemoveContainer" containerID="b49dee828f914c5c4dd4bd5241c80312f45a4b33b8265b619d155b8e32de150b"
Apr 24 21:31:00.169379 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.169326 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:31:00.171142 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.171120 2575 scope.go:117] "RemoveContainer" containerID="2008bebba1b96ab6c639f1cc5ea1dabfb5ff694b729a59c2bb17f8fc353be9ed"
Apr 24 21:31:00.173175 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.173157 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:31:00.178782 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.178767 2575 scope.go:117] "RemoveContainer" containerID="022a61cc8f35e22e8ec058280bf4c8c1b1f8bdd42a82c22b3c9822dd6c73adac"
Apr 24 21:31:00.199358 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.199329 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:31:00.204933 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.204910 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.207618 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.207592 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 21:31:00.207732 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.207644 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 21:31:00.207830 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.207743 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-l6lqq\""
Apr 24 21:31:00.207957 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.207854 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 21:31:00.207957 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.207912 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 21:31:00.208167 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.207982 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 21:31:00.208167 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.208010 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 21:31:00.208167 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.208010 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 21:31:00.208573 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.208554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 21:31:00.213839 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.213823 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 21:31:00.219343 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.219324 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:31:00.244873 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.244840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-config-volume\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.244873 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.244871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.245094 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.244892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.245094 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.244925 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/432ff1c4-1164-4013-8d70-0837a758df1e-config-out\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.245094 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.244996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.245094 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.245061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-web-config\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.245248 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.245096 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.245248 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.245123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432ff1c4-1164-4013-8d70-0837a758df1e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.245248 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.245199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/432ff1c4-1164-4013-8d70-0837a758df1e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.245385 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.245244 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn2hn\" (UniqueName: \"kubernetes.io/projected/432ff1c4-1164-4013-8d70-0837a758df1e-kube-api-access-rn2hn\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.245385 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.245334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/432ff1c4-1164-4013-8d70-0837a758df1e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.245385 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.245371 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.245490 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.245389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/432ff1c4-1164-4013-8d70-0837a758df1e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.346582 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.346548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-config-volume\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.346745 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.346586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.346745 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.346618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.346745 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.346655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/432ff1c4-1164-4013-8d70-0837a758df1e-config-out\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.346745 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.346682 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.346950 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.346814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-web-config\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.346950 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.346843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.346950 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.346863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432ff1c4-1164-4013-8d70-0837a758df1e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.346950 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.346908 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/432ff1c4-1164-4013-8d70-0837a758df1e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.347148 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.346963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn2hn\" (UniqueName: \"kubernetes.io/projected/432ff1c4-1164-4013-8d70-0837a758df1e-kube-api-access-rn2hn\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.347148 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.347014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/432ff1c4-1164-4013-8d70-0837a758df1e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.347148 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.347051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.347148 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.347077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/432ff1c4-1164-4013-8d70-0837a758df1e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.348297 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.348273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432ff1c4-1164-4013-8d70-0837a758df1e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.349770 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.349746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-web-config\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.349907 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.349884 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.349972 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.349903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.350024 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.349968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-config-volume\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.350024 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.349968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.350024 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.349985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/432ff1c4-1164-4013-8d70-0837a758df1e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.350179 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.350162 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/432ff1c4-1164-4013-8d70-0837a758df1e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.350464 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.350442 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/432ff1c4-1164-4013-8d70-0837a758df1e-config-out\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.350561 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.350539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/432ff1c4-1164-4013-8d70-0837a758df1e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.351103 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.351082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.351862 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.351842 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/432ff1c4-1164-4013-8d70-0837a758df1e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.358040 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.358016 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn2hn\" (UniqueName: \"kubernetes.io/projected/432ff1c4-1164-4013-8d70-0837a758df1e-kube-api-access-rn2hn\") pod \"alertmanager-main-0\" (UID: \"432ff1c4-1164-4013-8d70-0837a758df1e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.514087 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.514016 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 21:31:00.639803 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:00.639778 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 21:31:00.641467 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:31:00.641440 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432ff1c4_1164_4013_8d70_0837a758df1e.slice/crio-843183e405851355b1dbd17c5d73ae883928c13213f2ff1cc70239947bcf2aed WatchSource:0}: Error finding container 843183e405851355b1dbd17c5d73ae883928c13213f2ff1cc70239947bcf2aed: Status 404 returned error can't find the container with id 843183e405851355b1dbd17c5d73ae883928c13213f2ff1cc70239947bcf2aed
Apr 24 21:31:01.138393 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:01.138355 2575 generic.go:358] "Generic (PLEG): container finished" podID="432ff1c4-1164-4013-8d70-0837a758df1e" containerID="0beadc20957862553508a3ea5c4a272a9a166e5ef4f4d6a28fd9ccac5c72113e" exitCode=0
Apr 24 21:31:01.138848 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:01.138443 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"432ff1c4-1164-4013-8d70-0837a758df1e","Type":"ContainerDied","Data":"0beadc20957862553508a3ea5c4a272a9a166e5ef4f4d6a28fd9ccac5c72113e"}
Apr 24 21:31:01.138848 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:01.138482 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"432ff1c4-1164-4013-8d70-0837a758df1e","Type":"ContainerStarted","Data":"843183e405851355b1dbd17c5d73ae883928c13213f2ff1cc70239947bcf2aed"}
Apr 24 21:31:01.219999 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:01.219891 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73b2681e-5a49-49f4-8f25-35ce1354f6b5" path="/var/lib/kubelet/pods/73b2681e-5a49-49f4-8f25-35ce1354f6b5/volumes"
Apr 24 21:31:01.692802 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:31:01.692711 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6hc5g" podUID="7eb5a6c5-93dc-4c4b-a6ab-b457966b4540"
Apr 24 21:31:02.149234 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:02.149193 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"432ff1c4-1164-4013-8d70-0837a758df1e","Type":"ContainerStarted","Data":"694d1c3d2c0dbe460713664c229b4c0cf729533038ca9410cb73947903be0fcc"}
Apr 24 21:31:02.149234 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:02.149218 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6hc5g"
Apr 24 21:31:02.149234 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:02.149240 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"432ff1c4-1164-4013-8d70-0837a758df1e","Type":"ContainerStarted","Data":"43e5bf922e606d337dbd88f35e4ff9c5313cb54435cae0265add0b8e78caba48"}
Apr 24 21:31:02.149727 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:02.149277 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"432ff1c4-1164-4013-8d70-0837a758df1e","Type":"ContainerStarted","Data":"64e3f3102cc8a6aa13fe8b11dfbd707d6cec8942451115e3b5b56d90385ce6d5"}
Apr 24 21:31:02.149727 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:02.149293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"432ff1c4-1164-4013-8d70-0837a758df1e","Type":"ContainerStarted","Data":"b74f52f82454c36e6a01ca909417eabd6394633c578266976731b1710adb16a0"}
Apr 24 21:31:02.149727 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:02.149305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"432ff1c4-1164-4013-8d70-0837a758df1e","Type":"ContainerStarted","Data":"35f9a791db0476e129cd11963a2a3ff3ebbed0d19d51aa6f095e0ee676dafffb"}
Apr 24 21:31:02.149727 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:02.149317 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"432ff1c4-1164-4013-8d70-0837a758df1e","Type":"ContainerStarted","Data":"7aa90f1e69c899d5e729a7af31557888111e5caab90fa2ab4abe923d86fa47fd"}
Apr 24 21:31:02.184035 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:02.183988 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.183971641 podStartE2EDuration="2.183971641s" podCreationTimestamp="2026-04-24 21:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:02.183342674 +0000 UTC m=+279.586605668" watchObservedRunningTime="2026-04-24 21:31:02.183971641 +0000 UTC m=+279.587234634"
Apr 24 21:31:05.093190 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.093128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g"
Apr 24 21:31:05.093685 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.093243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID:
\"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g" Apr 24 21:31:05.095733 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.095704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c652c05d-6547-4c43-a295-52b3275ef5e0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mdm8g\" (UID: \"c652c05d-6547-4c43-a295-52b3275ef5e0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" Apr 24 21:31:05.095856 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.095704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7eb5a6c5-93dc-4c4b-a6ab-b457966b4540-cert\") pod \"ingress-canary-6hc5g\" (UID: \"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540\") " pod="openshift-ingress-canary/ingress-canary-6hc5g" Apr 24 21:31:05.151884 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.151852 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bv4ts\"" Apr 24 21:31:05.159876 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.159855 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6hc5g" Apr 24 21:31:05.194047 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.194016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:31:05.196577 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.196554 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33-metrics-tls\") pod \"dns-default-lrxx8\" (UID: \"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33\") " pod="openshift-dns/dns-default-lrxx8" Apr 24 21:31:05.218572 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.218369 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6p9hq\"" Apr 24 21:31:05.226665 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.226634 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" Apr 24 21:31:05.284299 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.284191 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6hc5g"] Apr 24 21:31:05.288870 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:31:05.288836 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eb5a6c5_93dc_4c4b_a6ab_b457966b4540.slice/crio-fb633f7f0011bbc1221c49dc828205494db7e181163d68626edb4cde37b3d626 WatchSource:0}: Error finding container fb633f7f0011bbc1221c49dc828205494db7e181163d68626edb4cde37b3d626: Status 404 returned error can't find the container with id fb633f7f0011bbc1221c49dc828205494db7e181163d68626edb4cde37b3d626 Apr 24 21:31:05.349087 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.349059 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g"] Apr 24 21:31:05.351489 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:31:05.351460 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc652c05d_6547_4c43_a295_52b3275ef5e0.slice/crio-3f94f46605b0b9c63e70dbf7351ae12885022523009ee0ca6377f77dde3678d7 WatchSource:0}: Error finding container 3f94f46605b0b9c63e70dbf7351ae12885022523009ee0ca6377f77dde3678d7: Status 404 returned error can't find the container with id 3f94f46605b0b9c63e70dbf7351ae12885022523009ee0ca6377f77dde3678d7 Apr 24 21:31:05.421098 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.421073 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zkwd8\"" Apr 24 21:31:05.429027 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.429007 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lrxx8" Apr 24 21:31:05.547538 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:05.547509 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lrxx8"] Apr 24 21:31:05.550450 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:31:05.550426 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f2b71f9_7480_4ae7_96bd_b2fa0ef3ae33.slice/crio-3a6c5dda9dd22d46c6eb7b6250cf2ca803958050aa581aad969daea29b53c453 WatchSource:0}: Error finding container 3a6c5dda9dd22d46c6eb7b6250cf2ca803958050aa581aad969daea29b53c453: Status 404 returned error can't find the container with id 3a6c5dda9dd22d46c6eb7b6250cf2ca803958050aa581aad969daea29b53c453 Apr 24 21:31:06.165571 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:06.165492 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" event={"ID":"c652c05d-6547-4c43-a295-52b3275ef5e0","Type":"ContainerStarted","Data":"3f94f46605b0b9c63e70dbf7351ae12885022523009ee0ca6377f77dde3678d7"} Apr 24 21:31:06.167351 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:06.167322 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lrxx8" event={"ID":"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33","Type":"ContainerStarted","Data":"3a6c5dda9dd22d46c6eb7b6250cf2ca803958050aa581aad969daea29b53c453"} Apr 24 21:31:06.169179 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:06.169151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6hc5g" event={"ID":"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540","Type":"ContainerStarted","Data":"fb633f7f0011bbc1221c49dc828205494db7e181163d68626edb4cde37b3d626"} Apr 24 21:31:07.174992 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:07.174949 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" event={"ID":"c652c05d-6547-4c43-a295-52b3275ef5e0","Type":"ContainerStarted","Data":"8419e0de78ed0771d447618aca86f788f811911e29fa4bd4eaa2837e946158c0"} Apr 24 21:31:08.179665 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:08.179629 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lrxx8" event={"ID":"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33","Type":"ContainerStarted","Data":"220cedd27d98f1f6076e505a07d2336b7cc5989ba273c7c3e3e6d26ffad55c5a"} Apr 24 21:31:08.179665 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:08.179666 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lrxx8" event={"ID":"5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33","Type":"ContainerStarted","Data":"eafb8c8ab14443ef5625614036db9790557caf2470b0e042464f754f87844dd1"} Apr 24 21:31:08.180150 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:08.179765 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lrxx8" Apr 24 21:31:08.181023 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:08.180999 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6hc5g" event={"ID":"7eb5a6c5-93dc-4c4b-a6ab-b457966b4540","Type":"ContainerStarted","Data":"6a9d14a7432ffc124d4a210dc842c39838813492e8e3b1c56f95cbe598b00e07"} Apr 24 21:31:08.209363 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:08.209314 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mdm8g" podStartSLOduration=275.120891778 podStartE2EDuration="4m36.209300594s" podCreationTimestamp="2026-04-24 21:26:32 +0000 UTC" firstStartedPulling="2026-04-24 21:31:05.355370739 +0000 UTC m=+282.758633713" lastFinishedPulling="2026-04-24 21:31:06.443779551 +0000 UTC m=+283.847042529" observedRunningTime="2026-04-24 21:31:07.201989653 +0000 UTC 
m=+284.605252646" watchObservedRunningTime="2026-04-24 21:31:08.209300594 +0000 UTC m=+285.612563606" Apr 24 21:31:08.210556 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:08.210522 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lrxx8" podStartSLOduration=252.321873628 podStartE2EDuration="4m14.210511869s" podCreationTimestamp="2026-04-24 21:26:54 +0000 UTC" firstStartedPulling="2026-04-24 21:31:05.552315252 +0000 UTC m=+282.955578227" lastFinishedPulling="2026-04-24 21:31:07.440953497 +0000 UTC m=+284.844216468" observedRunningTime="2026-04-24 21:31:08.209179179 +0000 UTC m=+285.612442171" watchObservedRunningTime="2026-04-24 21:31:08.210511869 +0000 UTC m=+285.613774861" Apr 24 21:31:09.878847 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:09.878809 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:31:09.879251 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:09.878880 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:31:09.883497 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:09.883474 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:31:09.905875 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:09.905829 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6hc5g" podStartSLOduration=253.757666366 podStartE2EDuration="4m15.905815738s" podCreationTimestamp="2026-04-24 21:26:54 +0000 UTC" firstStartedPulling="2026-04-24 21:31:05.29086916 +0000 UTC m=+282.694132134" lastFinishedPulling="2026-04-24 21:31:07.439018532 +0000 UTC m=+284.842281506" observedRunningTime="2026-04-24 21:31:08.257449853 +0000 UTC m=+285.660712846" watchObservedRunningTime="2026-04-24 21:31:09.905815738 +0000 UTC 
m=+287.309078792" Apr 24 21:31:10.191914 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:10.191884 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:31:10.259969 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:10.259939 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-547fdbb86f-s6v5z"] Apr 24 21:31:18.186785 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:18.186754 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lrxx8" Apr 24 21:31:23.118095 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:23.118067 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log" Apr 24 21:31:23.128768 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:23.128741 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log" Apr 24 21:31:23.131524 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:23.131503 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log" Apr 24 21:31:23.133069 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:23.133038 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log" Apr 24 21:31:23.138108 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:23.138094 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:31:35.279073 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.279018 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-547fdbb86f-s6v5z" 
podUID="3aef274d-51a0-4545-8b71-94a8ff952a0d" containerName="console" containerID="cri-o://4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372" gracePeriod=15 Apr 24 21:31:35.511891 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.511864 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-547fdbb86f-s6v5z_3aef274d-51a0-4545-8b71-94a8ff952a0d/console/0.log" Apr 24 21:31:35.512007 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.511923 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-547fdbb86f-s6v5z" Apr 24 21:31:35.661880 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.661852 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-service-ca\") pod \"3aef274d-51a0-4545-8b71-94a8ff952a0d\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " Apr 24 21:31:35.662058 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.661888 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-oauth-config\") pod \"3aef274d-51a0-4545-8b71-94a8ff952a0d\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " Apr 24 21:31:35.662058 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.661929 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-trusted-ca-bundle\") pod \"3aef274d-51a0-4545-8b71-94a8ff952a0d\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " Apr 24 21:31:35.662058 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.661959 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv4d8\" (UniqueName: 
\"kubernetes.io/projected/3aef274d-51a0-4545-8b71-94a8ff952a0d-kube-api-access-sv4d8\") pod \"3aef274d-51a0-4545-8b71-94a8ff952a0d\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " Apr 24 21:31:35.662058 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.662009 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-config\") pod \"3aef274d-51a0-4545-8b71-94a8ff952a0d\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " Apr 24 21:31:35.662058 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.662028 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-oauth-serving-cert\") pod \"3aef274d-51a0-4545-8b71-94a8ff952a0d\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " Apr 24 21:31:35.662058 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.662061 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-serving-cert\") pod \"3aef274d-51a0-4545-8b71-94a8ff952a0d\" (UID: \"3aef274d-51a0-4545-8b71-94a8ff952a0d\") " Apr 24 21:31:35.662459 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.662402 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-config" (OuterVolumeSpecName: "console-config") pod "3aef274d-51a0-4545-8b71-94a8ff952a0d" (UID: "3aef274d-51a0-4545-8b71-94a8ff952a0d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:35.662459 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.662416 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3aef274d-51a0-4545-8b71-94a8ff952a0d" (UID: "3aef274d-51a0-4545-8b71-94a8ff952a0d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:35.662693 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.662589 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3aef274d-51a0-4545-8b71-94a8ff952a0d" (UID: "3aef274d-51a0-4545-8b71-94a8ff952a0d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:35.662745 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.662685 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-service-ca" (OuterVolumeSpecName: "service-ca") pod "3aef274d-51a0-4545-8b71-94a8ff952a0d" (UID: "3aef274d-51a0-4545-8b71-94a8ff952a0d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:35.664083 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.664059 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aef274d-51a0-4545-8b71-94a8ff952a0d-kube-api-access-sv4d8" (OuterVolumeSpecName: "kube-api-access-sv4d8") pod "3aef274d-51a0-4545-8b71-94a8ff952a0d" (UID: "3aef274d-51a0-4545-8b71-94a8ff952a0d"). InnerVolumeSpecName "kube-api-access-sv4d8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:35.664083 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.664076 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3aef274d-51a0-4545-8b71-94a8ff952a0d" (UID: "3aef274d-51a0-4545-8b71-94a8ff952a0d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:35.664225 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.664142 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3aef274d-51a0-4545-8b71-94a8ff952a0d" (UID: "3aef274d-51a0-4545-8b71-94a8ff952a0d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:35.763221 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.763187 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-oauth-serving-cert\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:31:35.763221 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.763216 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-serving-cert\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:31:35.763221 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.763226 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-service-ca\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:31:35.763478 ip-10-0-131-58 kubenswrapper[2575]: 
I0424 21:31:35.763235 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-oauth-config\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:31:35.763478 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.763244 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-trusted-ca-bundle\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:31:35.763478 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.763283 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sv4d8\" (UniqueName: \"kubernetes.io/projected/3aef274d-51a0-4545-8b71-94a8ff952a0d-kube-api-access-sv4d8\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:31:35.763478 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:35.763293 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3aef274d-51a0-4545-8b71-94a8ff952a0d-console-config\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:31:36.264991 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:36.264963 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-547fdbb86f-s6v5z_3aef274d-51a0-4545-8b71-94a8ff952a0d/console/0.log" Apr 24 21:31:36.265282 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:36.265002 2575 generic.go:358] "Generic (PLEG): container finished" podID="3aef274d-51a0-4545-8b71-94a8ff952a0d" containerID="4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372" exitCode=2 Apr 24 21:31:36.265282 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:36.265034 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547fdbb86f-s6v5z" 
event={"ID":"3aef274d-51a0-4545-8b71-94a8ff952a0d","Type":"ContainerDied","Data":"4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372"} Apr 24 21:31:36.265282 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:36.265064 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-547fdbb86f-s6v5z" Apr 24 21:31:36.265282 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:36.265077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547fdbb86f-s6v5z" event={"ID":"3aef274d-51a0-4545-8b71-94a8ff952a0d","Type":"ContainerDied","Data":"edc2567746881b2bdc9e671177666db2f35ec141b25585471329fc585d8889ee"} Apr 24 21:31:36.265282 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:36.265093 2575 scope.go:117] "RemoveContainer" containerID="4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372" Apr 24 21:31:36.273529 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:36.273513 2575 scope.go:117] "RemoveContainer" containerID="4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372" Apr 24 21:31:36.273808 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:31:36.273789 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372\": container with ID starting with 4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372 not found: ID does not exist" containerID="4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372" Apr 24 21:31:36.273858 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:36.273816 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372"} err="failed to get container status \"4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372\": rpc error: code = NotFound desc = could not find container 
\"4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372\": container with ID starting with 4a0a71be11401ae3f453d39d344a1549cf24f44b55cdec15427745f05ff57372 not found: ID does not exist" Apr 24 21:31:36.286699 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:36.286677 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-547fdbb86f-s6v5z"] Apr 24 21:31:36.290881 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:36.290857 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-547fdbb86f-s6v5z"] Apr 24 21:31:37.219710 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:37.219676 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aef274d-51a0-4545-8b71-94a8ff952a0d" path="/var/lib/kubelet/pods/3aef274d-51a0-4545-8b71-94a8ff952a0d/volumes" Apr 24 21:31:51.832217 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.832179 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc"] Apr 24 21:31:51.832619 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.832571 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3aef274d-51a0-4545-8b71-94a8ff952a0d" containerName="console" Apr 24 21:31:51.832619 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.832583 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aef274d-51a0-4545-8b71-94a8ff952a0d" containerName="console" Apr 24 21:31:51.832704 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.832635 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3aef274d-51a0-4545-8b71-94a8ff952a0d" containerName="console" Apr 24 21:31:51.838318 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.838298 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:31:51.840922 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.840900 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:31:51.841700 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.841684 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:31:51.841789 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.841721 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-zb5rh\"" Apr 24 21:31:51.845585 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.845559 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc"] Apr 24 21:31:51.883783 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.883747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l4sb\" (UniqueName: \"kubernetes.io/projected/f364a282-154c-48f7-9831-b67a661a8de3-kube-api-access-6l4sb\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:31:51.883943 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.883792 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:31:51.883943 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.883908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:31:51.989729 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.985456 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l4sb\" (UniqueName: \"kubernetes.io/projected/f364a282-154c-48f7-9831-b67a661a8de3-kube-api-access-6l4sb\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:31:51.989729 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.985539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:31:51.989729 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.985593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:31:51.989729 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.986073 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:31:51.989729 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.986768 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:31:51.995601 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:51.995579 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l4sb\" (UniqueName: \"kubernetes.io/projected/f364a282-154c-48f7-9831-b67a661a8de3-kube-api-access-6l4sb\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:31:52.148741 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:52.148706 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:31:52.269486 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:52.269455 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc"] Apr 24 21:31:52.272073 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:31:52.272046 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf364a282_154c_48f7_9831_b67a661a8de3.slice/crio-d96bb42f27ee7b90c332621b64620f2cee8140c18391f841204b86c40a656c72 WatchSource:0}: Error finding container d96bb42f27ee7b90c332621b64620f2cee8140c18391f841204b86c40a656c72: Status 404 returned error can't find the container with id d96bb42f27ee7b90c332621b64620f2cee8140c18391f841204b86c40a656c72 Apr 24 21:31:52.273974 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:52.273959 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:31:52.313947 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:52.313909 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" event={"ID":"f364a282-154c-48f7-9831-b67a661a8de3","Type":"ContainerStarted","Data":"d96bb42f27ee7b90c332621b64620f2cee8140c18391f841204b86c40a656c72"} Apr 24 21:31:58.335549 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:58.335514 2575 generic.go:358] "Generic (PLEG): container finished" podID="f364a282-154c-48f7-9831-b67a661a8de3" containerID="e5e81a56d6d54464d393dfdebafbf7672181dd6254f34ea007b7d309021e6831" exitCode=0 Apr 24 21:31:58.335986 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:31:58.335594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" 
event={"ID":"f364a282-154c-48f7-9831-b67a661a8de3","Type":"ContainerDied","Data":"e5e81a56d6d54464d393dfdebafbf7672181dd6254f34ea007b7d309021e6831"} Apr 24 21:32:00.344712 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:00.344675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" event={"ID":"f364a282-154c-48f7-9831-b67a661a8de3","Type":"ContainerStarted","Data":"feab5b8a8db234cc1ea1abc88241e190949a420223eb77de5136709572d6774f"} Apr 24 21:32:01.349186 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:01.349149 2575 generic.go:358] "Generic (PLEG): container finished" podID="f364a282-154c-48f7-9831-b67a661a8de3" containerID="feab5b8a8db234cc1ea1abc88241e190949a420223eb77de5136709572d6774f" exitCode=0 Apr 24 21:32:01.349608 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:01.349231 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" event={"ID":"f364a282-154c-48f7-9831-b67a661a8de3","Type":"ContainerDied","Data":"feab5b8a8db234cc1ea1abc88241e190949a420223eb77de5136709572d6774f"} Apr 24 21:32:08.374181 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:08.374139 2575 generic.go:358] "Generic (PLEG): container finished" podID="f364a282-154c-48f7-9831-b67a661a8de3" containerID="801f5579982bfd516fa045cedbfc7b510ddf8a8e37a88b16c7872c3c2224ee47" exitCode=0 Apr 24 21:32:08.374596 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:08.374224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" event={"ID":"f364a282-154c-48f7-9831-b67a661a8de3","Type":"ContainerDied","Data":"801f5579982bfd516fa045cedbfc7b510ddf8a8e37a88b16c7872c3c2224ee47"} Apr 24 21:32:09.504954 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:09.504931 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:32:09.661767 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:09.661739 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l4sb\" (UniqueName: \"kubernetes.io/projected/f364a282-154c-48f7-9831-b67a661a8de3-kube-api-access-6l4sb\") pod \"f364a282-154c-48f7-9831-b67a661a8de3\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " Apr 24 21:32:09.661932 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:09.661791 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-util\") pod \"f364a282-154c-48f7-9831-b67a661a8de3\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " Apr 24 21:32:09.661932 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:09.661829 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-bundle\") pod \"f364a282-154c-48f7-9831-b67a661a8de3\" (UID: \"f364a282-154c-48f7-9831-b67a661a8de3\") " Apr 24 21:32:09.662450 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:09.662423 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-bundle" (OuterVolumeSpecName: "bundle") pod "f364a282-154c-48f7-9831-b67a661a8de3" (UID: "f364a282-154c-48f7-9831-b67a661a8de3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:32:09.663965 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:09.663935 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f364a282-154c-48f7-9831-b67a661a8de3-kube-api-access-6l4sb" (OuterVolumeSpecName: "kube-api-access-6l4sb") pod "f364a282-154c-48f7-9831-b67a661a8de3" (UID: "f364a282-154c-48f7-9831-b67a661a8de3"). InnerVolumeSpecName "kube-api-access-6l4sb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:32:09.666855 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:09.666821 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-util" (OuterVolumeSpecName: "util") pod "f364a282-154c-48f7-9831-b67a661a8de3" (UID: "f364a282-154c-48f7-9831-b67a661a8de3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:32:09.762352 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:09.762324 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-util\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:32:09.762352 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:09.762348 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f364a282-154c-48f7-9831-b67a661a8de3-bundle\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:32:09.762352 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:09.762358 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6l4sb\" (UniqueName: \"kubernetes.io/projected/f364a282-154c-48f7-9831-b67a661a8de3-kube-api-access-6l4sb\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:32:10.381954 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:10.381917 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" event={"ID":"f364a282-154c-48f7-9831-b67a661a8de3","Type":"ContainerDied","Data":"d96bb42f27ee7b90c332621b64620f2cee8140c18391f841204b86c40a656c72"} Apr 24 21:32:10.381954 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:10.381952 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d96bb42f27ee7b90c332621b64620f2cee8140c18391f841204b86c40a656c72" Apr 24 21:32:10.382153 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:10.381973 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cf4smc" Apr 24 21:32:13.927962 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.927922 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv"] Apr 24 21:32:13.928456 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.928440 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f364a282-154c-48f7-9831-b67a661a8de3" containerName="extract" Apr 24 21:32:13.928524 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.928460 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f364a282-154c-48f7-9831-b67a661a8de3" containerName="extract" Apr 24 21:32:13.928524 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.928510 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f364a282-154c-48f7-9831-b67a661a8de3" containerName="util" Apr 24 21:32:13.928524 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.928519 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f364a282-154c-48f7-9831-b67a661a8de3" containerName="util" Apr 24 21:32:13.928668 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.928530 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f364a282-154c-48f7-9831-b67a661a8de3" containerName="pull" Apr 24 21:32:13.928668 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.928540 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f364a282-154c-48f7-9831-b67a661a8de3" containerName="pull" Apr 24 21:32:13.928668 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.928621 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f364a282-154c-48f7-9831-b67a661a8de3" containerName="extract" Apr 24 21:32:13.936170 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.936140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" Apr 24 21:32:13.938733 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.938702 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:32:13.939006 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.938982 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:32:13.939711 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.939692 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:32:13.939973 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.939726 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-2v7xm\"" Apr 24 21:32:13.940959 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:13.940938 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv"] Apr 24 21:32:14.101498 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:14.101458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/80c1af42-df31-43c1-9a3b-a6c4b868b55c-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv\" (UID: \"80c1af42-df31-43c1-9a3b-a6c4b868b55c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" Apr 24 21:32:14.101676 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:14.101507 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4w2\" (UniqueName: \"kubernetes.io/projected/80c1af42-df31-43c1-9a3b-a6c4b868b55c-kube-api-access-vd4w2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv\" (UID: \"80c1af42-df31-43c1-9a3b-a6c4b868b55c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" Apr 24 21:32:14.202442 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:14.202365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/80c1af42-df31-43c1-9a3b-a6c4b868b55c-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv\" (UID: \"80c1af42-df31-43c1-9a3b-a6c4b868b55c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" Apr 24 21:32:14.202442 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:14.202408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4w2\" (UniqueName: \"kubernetes.io/projected/80c1af42-df31-43c1-9a3b-a6c4b868b55c-kube-api-access-vd4w2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv\" (UID: \"80c1af42-df31-43c1-9a3b-a6c4b868b55c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" Apr 24 21:32:14.204696 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:14.204676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/80c1af42-df31-43c1-9a3b-a6c4b868b55c-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv\" (UID: 
\"80c1af42-df31-43c1-9a3b-a6c4b868b55c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" Apr 24 21:32:14.211958 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:14.211933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4w2\" (UniqueName: \"kubernetes.io/projected/80c1af42-df31-43c1-9a3b-a6c4b868b55c-kube-api-access-vd4w2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv\" (UID: \"80c1af42-df31-43c1-9a3b-a6c4b868b55c\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" Apr 24 21:32:14.247558 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:14.247531 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" Apr 24 21:32:14.373438 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:14.373384 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv"] Apr 24 21:32:14.375633 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:32:14.375606 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80c1af42_df31_43c1_9a3b_a6c4b868b55c.slice/crio-5637bed75483d80ba64a8e854ab2ace5ace1a397717ae248fbe00eb4fae1d621 WatchSource:0}: Error finding container 5637bed75483d80ba64a8e854ab2ace5ace1a397717ae248fbe00eb4fae1d621: Status 404 returned error can't find the container with id 5637bed75483d80ba64a8e854ab2ace5ace1a397717ae248fbe00eb4fae1d621 Apr 24 21:32:14.394982 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:14.394957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" event={"ID":"80c1af42-df31-43c1-9a3b-a6c4b868b55c","Type":"ContainerStarted","Data":"5637bed75483d80ba64a8e854ab2ace5ace1a397717ae248fbe00eb4fae1d621"} Apr 24 21:32:18.212664 ip-10-0-131-58 kubenswrapper[2575]: I0424 
21:32:18.212626 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-bf458"] Apr 24 21:32:18.216083 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.216065 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:18.222501 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.222422 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 21:32:18.222501 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.222427 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 21:32:18.222899 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.222877 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5gf67\"" Apr 24 21:32:18.253155 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.253127 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-bf458"] Apr 24 21:32:18.338870 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.338838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e957ad3a-5bf6-4878-9f46-e8803381fd23-cabundle0\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:18.339023 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.338880 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 
21:32:18.339023 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.338957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76n2h\" (UniqueName: \"kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-kube-api-access-76n2h\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:18.410347 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.410310 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" event={"ID":"80c1af42-df31-43c1-9a3b-a6c4b868b55c","Type":"ContainerStarted","Data":"4db88f2092fbdd406088997170c5866e4dcd4bb1085dfa432b87cbb0b69e284a"} Apr 24 21:32:18.410524 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.410431 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" Apr 24 21:32:18.439941 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.439913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e957ad3a-5bf6-4878-9f46-e8803381fd23-cabundle0\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:18.440111 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.439965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:18.440111 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.440051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-76n2h\" (UniqueName: \"kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-kube-api-access-76n2h\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:18.440226 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.440120 2575 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:32:18.440226 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.440140 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:32:18.440226 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.440152 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-bf458: references non-existent secret key: ca.crt Apr 24 21:32:18.440431 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.440228 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates podName:e957ad3a-5bf6-4878-9f46-e8803381fd23 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:18.940207505 +0000 UTC m=+356.343470492 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates") pod "keda-operator-ffbb595cb-bf458" (UID: "e957ad3a-5bf6-4878-9f46-e8803381fd23") : references non-existent secret key: ca.crt Apr 24 21:32:18.440592 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.440573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e957ad3a-5bf6-4878-9f46-e8803381fd23-cabundle0\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:18.466422 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.466355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76n2h\" (UniqueName: \"kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-kube-api-access-76n2h\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:18.718499 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.718384 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" podStartSLOduration=2.557096757 podStartE2EDuration="5.718367577s" podCreationTimestamp="2026-04-24 21:32:13 +0000 UTC" firstStartedPulling="2026-04-24 21:32:14.377318825 +0000 UTC m=+351.780581800" lastFinishedPulling="2026-04-24 21:32:17.538589649 +0000 UTC m=+354.941852620" observedRunningTime="2026-04-24 21:32:18.467447857 +0000 UTC m=+355.870710851" watchObservedRunningTime="2026-04-24 21:32:18.718367577 +0000 UTC m=+356.121630569" Apr 24 21:32:18.719449 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.719428 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r"] Apr 24 21:32:18.722905 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:32:18.722890 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:18.727215 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.727194 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 21:32:18.736946 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.736925 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r"] Apr 24 21:32:18.843054 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.843023 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/331d6a8d-f57a-443f-a300-ed1bf07673ca-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:18.843253 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.843061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csx62\" (UniqueName: \"kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-kube-api-access-csx62\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:18.843253 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.843092 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:18.940376 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:32:18.940342 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-k22rj"] Apr 24 21:32:18.943613 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.943597 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-k22rj" Apr 24 21:32:18.943771 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.943750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:18.943873 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.943861 2575 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:32:18.943912 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.943875 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:32:18.943912 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.943875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:18.943912 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.943891 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r: references non-existent secret key: tls.crt Apr 24 21:32:18.944040 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.943929 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates podName:331d6a8d-f57a-443f-a300-ed1bf07673ca nodeName:}" failed. No retries permitted until 2026-04-24 21:32:19.443915674 +0000 UTC m=+356.847178645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates") pod "keda-metrics-apiserver-7c9f485588-6t94r" (UID: "331d6a8d-f57a-443f-a300-ed1bf07673ca") : references non-existent secret key: tls.crt Apr 24 21:32:18.944040 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.943960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/331d6a8d-f57a-443f-a300-ed1bf07673ca-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:18.944040 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.943981 2575 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:32:18.944040 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.943987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csx62\" (UniqueName: \"kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-kube-api-access-csx62\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:18.944040 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.943995 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:32:18.944040 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.944006 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-bf458: references 
non-existent secret key: ca.crt Apr 24 21:32:18.944297 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:18.944048 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates podName:e957ad3a-5bf6-4878-9f46-e8803381fd23 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:19.94403346 +0000 UTC m=+357.347296451 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates") pod "keda-operator-ffbb595cb-bf458" (UID: "e957ad3a-5bf6-4878-9f46-e8803381fd23") : references non-existent secret key: ca.crt Apr 24 21:32:18.944339 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.944306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/331d6a8d-f57a-443f-a300-ed1bf07673ca-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:18.947053 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.947034 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 21:32:18.959677 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.959651 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-k22rj"] Apr 24 21:32:18.967297 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:18.967274 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csx62\" (UniqueName: \"kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-kube-api-access-csx62\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:19.044798 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:32:19.044727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfndn\" (UniqueName: \"kubernetes.io/projected/18348a01-4068-4399-ad3f-1e824ad02fec-kube-api-access-wfndn\") pod \"keda-admission-cf49989db-k22rj\" (UID: \"18348a01-4068-4399-ad3f-1e824ad02fec\") " pod="openshift-keda/keda-admission-cf49989db-k22rj" Apr 24 21:32:19.044798 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:19.044782 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/18348a01-4068-4399-ad3f-1e824ad02fec-certificates\") pod \"keda-admission-cf49989db-k22rj\" (UID: \"18348a01-4068-4399-ad3f-1e824ad02fec\") " pod="openshift-keda/keda-admission-cf49989db-k22rj" Apr 24 21:32:19.146012 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:19.145977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfndn\" (UniqueName: \"kubernetes.io/projected/18348a01-4068-4399-ad3f-1e824ad02fec-kube-api-access-wfndn\") pod \"keda-admission-cf49989db-k22rj\" (UID: \"18348a01-4068-4399-ad3f-1e824ad02fec\") " pod="openshift-keda/keda-admission-cf49989db-k22rj" Apr 24 21:32:19.146181 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:19.146038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/18348a01-4068-4399-ad3f-1e824ad02fec-certificates\") pod \"keda-admission-cf49989db-k22rj\" (UID: \"18348a01-4068-4399-ad3f-1e824ad02fec\") " pod="openshift-keda/keda-admission-cf49989db-k22rj" Apr 24 21:32:19.146181 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:19.146155 2575 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 24 21:32:19.146325 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:19.146184 2575 
projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-k22rj: secret "keda-admission-webhooks-certs" not found Apr 24 21:32:19.146325 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:19.146250 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18348a01-4068-4399-ad3f-1e824ad02fec-certificates podName:18348a01-4068-4399-ad3f-1e824ad02fec nodeName:}" failed. No retries permitted until 2026-04-24 21:32:19.646229693 +0000 UTC m=+357.049492685 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/18348a01-4068-4399-ad3f-1e824ad02fec-certificates") pod "keda-admission-cf49989db-k22rj" (UID: "18348a01-4068-4399-ad3f-1e824ad02fec") : secret "keda-admission-webhooks-certs" not found Apr 24 21:32:19.156414 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:19.156383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfndn\" (UniqueName: \"kubernetes.io/projected/18348a01-4068-4399-ad3f-1e824ad02fec-kube-api-access-wfndn\") pod \"keda-admission-cf49989db-k22rj\" (UID: \"18348a01-4068-4399-ad3f-1e824ad02fec\") " pod="openshift-keda/keda-admission-cf49989db-k22rj" Apr 24 21:32:19.449326 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:19.449283 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:19.449808 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:19.449432 2575 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:32:19.449808 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:19.449456 2575 projected.go:277] Couldn't get secret payload 
openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:32:19.449808 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:19.449483 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r: references non-existent secret key: tls.crt Apr 24 21:32:19.449808 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:19.449550 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates podName:331d6a8d-f57a-443f-a300-ed1bf07673ca nodeName:}" failed. No retries permitted until 2026-04-24 21:32:20.449530395 +0000 UTC m=+357.852793377 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates") pod "keda-metrics-apiserver-7c9f485588-6t94r" (UID: "331d6a8d-f57a-443f-a300-ed1bf07673ca") : references non-existent secret key: tls.crt Apr 24 21:32:19.653452 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:19.653405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/18348a01-4068-4399-ad3f-1e824ad02fec-certificates\") pod \"keda-admission-cf49989db-k22rj\" (UID: \"18348a01-4068-4399-ad3f-1e824ad02fec\") " pod="openshift-keda/keda-admission-cf49989db-k22rj" Apr 24 21:32:19.656093 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:19.656067 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/18348a01-4068-4399-ad3f-1e824ad02fec-certificates\") pod \"keda-admission-cf49989db-k22rj\" (UID: \"18348a01-4068-4399-ad3f-1e824ad02fec\") " pod="openshift-keda/keda-admission-cf49989db-k22rj" Apr 24 21:32:19.854005 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:19.853914 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-k22rj" Apr 24 21:32:19.957174 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:19.956672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:19.957174 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:19.956799 2575 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:32:19.957174 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:19.956811 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:32:19.957174 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:19.956819 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-bf458: references non-existent secret key: ca.crt Apr 24 21:32:19.957174 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:19.956871 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates podName:e957ad3a-5bf6-4878-9f46-e8803381fd23 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:21.956856374 +0000 UTC m=+359.360119346 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates") pod "keda-operator-ffbb595cb-bf458" (UID: "e957ad3a-5bf6-4878-9f46-e8803381fd23") : references non-existent secret key: ca.crt Apr 24 21:32:19.991634 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:19.991575 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-k22rj"] Apr 24 21:32:19.994143 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:32:19.994099 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18348a01_4068_4399_ad3f_1e824ad02fec.slice/crio-9ed684213431229aaa3a35dbae97c16129912582ce03f3ae1a12e2e630db4021 WatchSource:0}: Error finding container 9ed684213431229aaa3a35dbae97c16129912582ce03f3ae1a12e2e630db4021: Status 404 returned error can't find the container with id 9ed684213431229aaa3a35dbae97c16129912582ce03f3ae1a12e2e630db4021 Apr 24 21:32:20.418427 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:20.418394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-k22rj" event={"ID":"18348a01-4068-4399-ad3f-1e824ad02fec","Type":"ContainerStarted","Data":"9ed684213431229aaa3a35dbae97c16129912582ce03f3ae1a12e2e630db4021"} Apr 24 21:32:20.461020 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:20.460987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:20.461491 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:20.461139 2575 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:32:20.461491 ip-10-0-131-58 kubenswrapper[2575]: 
E0424 21:32:20.461164 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:32:20.461491 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:20.461187 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r: references non-existent secret key: tls.crt Apr 24 21:32:20.461491 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:20.461252 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates podName:331d6a8d-f57a-443f-a300-ed1bf07673ca nodeName:}" failed. No retries permitted until 2026-04-24 21:32:22.46123155 +0000 UTC m=+359.864494521 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates") pod "keda-metrics-apiserver-7c9f485588-6t94r" (UID: "331d6a8d-f57a-443f-a300-ed1bf07673ca") : references non-existent secret key: tls.crt Apr 24 21:32:21.975957 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:21.975921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:21.976341 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:21.976047 2575 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:32:21.976341 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:21.976060 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:32:21.976341 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:21.976070 2575 projected.go:194] Error preparing data for projected volume 
certificates for pod openshift-keda/keda-operator-ffbb595cb-bf458: references non-existent secret key: ca.crt Apr 24 21:32:21.976341 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:21.976116 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates podName:e957ad3a-5bf6-4878-9f46-e8803381fd23 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:25.97610318 +0000 UTC m=+363.379366151 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates") pod "keda-operator-ffbb595cb-bf458" (UID: "e957ad3a-5bf6-4878-9f46-e8803381fd23") : references non-existent secret key: ca.crt Apr 24 21:32:22.426221 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:22.426188 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-k22rj" event={"ID":"18348a01-4068-4399-ad3f-1e824ad02fec","Type":"ContainerStarted","Data":"fea97e53eb288b70cafd990f0d6c01ca8a13fadbef57da2acd992ef3449df91c"} Apr 24 21:32:22.426412 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:22.426242 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-k22rj" Apr 24 21:32:22.450143 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:22.450050 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-k22rj" podStartSLOduration=2.950635802 podStartE2EDuration="4.450035138s" podCreationTimestamp="2026-04-24 21:32:18 +0000 UTC" firstStartedPulling="2026-04-24 21:32:19.995626803 +0000 UTC m=+357.398889776" lastFinishedPulling="2026-04-24 21:32:21.495026136 +0000 UTC m=+358.898289112" observedRunningTime="2026-04-24 21:32:22.44939358 +0000 UTC m=+359.852656572" watchObservedRunningTime="2026-04-24 21:32:22.450035138 +0000 UTC m=+359.853298132" Apr 24 21:32:22.480360 
ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:22.480334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:22.480491 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:22.480474 2575 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:32:22.480528 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:22.480494 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:32:22.480528 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:22.480512 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r: references non-existent secret key: tls.crt Apr 24 21:32:22.480596 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:32:22.480559 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates podName:331d6a8d-f57a-443f-a300-ed1bf07673ca nodeName:}" failed. No retries permitted until 2026-04-24 21:32:26.480544706 +0000 UTC m=+363.883807680 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates") pod "keda-metrics-apiserver-7c9f485588-6t94r" (UID: "331d6a8d-f57a-443f-a300-ed1bf07673ca") : references non-existent secret key: tls.crt Apr 24 21:32:26.011118 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:26.011080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:26.013431 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:26.013409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e957ad3a-5bf6-4878-9f46-e8803381fd23-certificates\") pod \"keda-operator-ffbb595cb-bf458\" (UID: \"e957ad3a-5bf6-4878-9f46-e8803381fd23\") " pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:26.026444 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:26.026413 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:26.155722 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:26.155699 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-bf458"] Apr 24 21:32:26.157606 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:32:26.157577 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode957ad3a_5bf6_4878_9f46_e8803381fd23.slice/crio-140edf3a6055b7e87215f856b9902e1caff89fe7a2e21f2229e030f2e666e027 WatchSource:0}: Error finding container 140edf3a6055b7e87215f856b9902e1caff89fe7a2e21f2229e030f2e666e027: Status 404 returned error can't find the container with id 140edf3a6055b7e87215f856b9902e1caff89fe7a2e21f2229e030f2e666e027 Apr 24 21:32:26.440599 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:26.440562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-bf458" event={"ID":"e957ad3a-5bf6-4878-9f46-e8803381fd23","Type":"ContainerStarted","Data":"140edf3a6055b7e87215f856b9902e1caff89fe7a2e21f2229e030f2e666e027"} Apr 24 21:32:26.514874 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:26.514835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:26.517283 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:26.517237 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/331d6a8d-f57a-443f-a300-ed1bf07673ca-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6t94r\" (UID: \"331d6a8d-f57a-443f-a300-ed1bf07673ca\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:26.533174 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:26.533147 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:26.669725 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:26.669690 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r"] Apr 24 21:32:26.674139 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:32:26.674113 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod331d6a8d_f57a_443f_a300_ed1bf07673ca.slice/crio-7147f16322dc72095a82b0d85fd9a328925b023cc4afc408f6c0e268cc65ff1a WatchSource:0}: Error finding container 7147f16322dc72095a82b0d85fd9a328925b023cc4afc408f6c0e268cc65ff1a: Status 404 returned error can't find the container with id 7147f16322dc72095a82b0d85fd9a328925b023cc4afc408f6c0e268cc65ff1a Apr 24 21:32:27.445566 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:27.445525 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" event={"ID":"331d6a8d-f57a-443f-a300-ed1bf07673ca","Type":"ContainerStarted","Data":"7147f16322dc72095a82b0d85fd9a328925b023cc4afc408f6c0e268cc65ff1a"} Apr 24 21:32:30.463133 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:30.463030 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-bf458" event={"ID":"e957ad3a-5bf6-4878-9f46-e8803381fd23","Type":"ContainerStarted","Data":"a055f4c26f76f4ac8d449dc8a0a627d6b83a863dfba38c45aa4efa0fcc3806c4"} Apr 24 21:32:30.463640 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:30.463182 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:32:30.464483 ip-10-0-131-58 kubenswrapper[2575]: I0424 
21:32:30.464460 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" event={"ID":"331d6a8d-f57a-443f-a300-ed1bf07673ca","Type":"ContainerStarted","Data":"8d1fdc06ba50e8c35bc1993f82f6c671ebddd0b7a568d9022974dbbc313125bb"} Apr 24 21:32:30.464642 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:30.464630 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:30.500610 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:30.500565 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-bf458" podStartSLOduration=8.461371825 podStartE2EDuration="12.500549868s" podCreationTimestamp="2026-04-24 21:32:18 +0000 UTC" firstStartedPulling="2026-04-24 21:32:26.158899869 +0000 UTC m=+363.562162843" lastFinishedPulling="2026-04-24 21:32:30.198077915 +0000 UTC m=+367.601340886" observedRunningTime="2026-04-24 21:32:30.498044036 +0000 UTC m=+367.901307029" watchObservedRunningTime="2026-04-24 21:32:30.500549868 +0000 UTC m=+367.903812862" Apr 24 21:32:30.523918 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:30.523875 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" podStartSLOduration=9.001240445 podStartE2EDuration="12.523861951s" podCreationTimestamp="2026-04-24 21:32:18 +0000 UTC" firstStartedPulling="2026-04-24 21:32:26.675555891 +0000 UTC m=+364.078818868" lastFinishedPulling="2026-04-24 21:32:30.198177403 +0000 UTC m=+367.601440374" observedRunningTime="2026-04-24 21:32:30.521528177 +0000 UTC m=+367.924791183" watchObservedRunningTime="2026-04-24 21:32:30.523861951 +0000 UTC m=+367.927124944" Apr 24 21:32:39.416403 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:39.416368 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-m5tqv" Apr 24 21:32:41.472065 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:41.472034 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6t94r" Apr 24 21:32:43.431799 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:43.431769 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-k22rj" Apr 24 21:32:51.470103 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:32:51.470074 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-bf458" Apr 24 21:33:31.670809 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.670773 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-56828"] Apr 24 21:33:31.674287 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.674248 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:33:31.678226 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.677834 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:33:31.678226 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.678071 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-wfmwj\"" Apr 24 21:33:31.678559 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.678541 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:33:31.678830 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.678785 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 21:33:31.679325 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.679306 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g"] Apr 24 21:33:31.683142 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.683124 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" Apr 24 21:33:31.683355 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.683337 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-56828"] Apr 24 21:33:31.686960 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.686945 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-w9rzl\"" Apr 24 21:33:31.687122 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.687104 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 21:33:31.694146 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.694126 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g"] Apr 24 21:33:31.771381 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.771353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58a90a80-d6a0-4e65-a801-9a717bc66500-cert\") pod \"kserve-controller-manager-67f77cd7d7-56828\" (UID: \"58a90a80-d6a0-4e65-a801-9a717bc66500\") " pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:33:31.771546 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.771395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmltc\" (UniqueName: \"kubernetes.io/projected/58a90a80-d6a0-4e65-a801-9a717bc66500-kube-api-access-gmltc\") pod \"kserve-controller-manager-67f77cd7d7-56828\" (UID: \"58a90a80-d6a0-4e65-a801-9a717bc66500\") " pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:33:31.771546 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.771498 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b7418663-d570-4a1d-8538-7f12b6945e7a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-c4r4g\" (UID: \"b7418663-d570-4a1d-8538-7f12b6945e7a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" Apr 24 21:33:31.771546 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.771521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2fxn\" (UniqueName: \"kubernetes.io/projected/b7418663-d570-4a1d-8538-7f12b6945e7a-kube-api-access-r2fxn\") pod \"llmisvc-controller-manager-68cc5db7c4-c4r4g\" (UID: \"b7418663-d570-4a1d-8538-7f12b6945e7a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" Apr 24 21:33:31.872239 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.872201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7418663-d570-4a1d-8538-7f12b6945e7a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-c4r4g\" (UID: \"b7418663-d570-4a1d-8538-7f12b6945e7a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" Apr 24 21:33:31.872239 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.872238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fxn\" (UniqueName: \"kubernetes.io/projected/b7418663-d570-4a1d-8538-7f12b6945e7a-kube-api-access-r2fxn\") pod \"llmisvc-controller-manager-68cc5db7c4-c4r4g\" (UID: \"b7418663-d570-4a1d-8538-7f12b6945e7a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" Apr 24 21:33:31.872466 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.872297 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58a90a80-d6a0-4e65-a801-9a717bc66500-cert\") pod \"kserve-controller-manager-67f77cd7d7-56828\" (UID: \"58a90a80-d6a0-4e65-a801-9a717bc66500\") " pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:33:31.872466 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:33:31.872321 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmltc\" (UniqueName: \"kubernetes.io/projected/58a90a80-d6a0-4e65-a801-9a717bc66500-kube-api-access-gmltc\") pod \"kserve-controller-manager-67f77cd7d7-56828\" (UID: \"58a90a80-d6a0-4e65-a801-9a717bc66500\") " pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:33:31.874822 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.874799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58a90a80-d6a0-4e65-a801-9a717bc66500-cert\") pod \"kserve-controller-manager-67f77cd7d7-56828\" (UID: \"58a90a80-d6a0-4e65-a801-9a717bc66500\") " pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:33:31.874822 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.874816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7418663-d570-4a1d-8538-7f12b6945e7a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-c4r4g\" (UID: \"b7418663-d570-4a1d-8538-7f12b6945e7a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" Apr 24 21:33:31.881399 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.881379 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmltc\" (UniqueName: \"kubernetes.io/projected/58a90a80-d6a0-4e65-a801-9a717bc66500-kube-api-access-gmltc\") pod \"kserve-controller-manager-67f77cd7d7-56828\" (UID: \"58a90a80-d6a0-4e65-a801-9a717bc66500\") " pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:33:31.881491 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.881404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2fxn\" (UniqueName: \"kubernetes.io/projected/b7418663-d570-4a1d-8538-7f12b6945e7a-kube-api-access-r2fxn\") pod \"llmisvc-controller-manager-68cc5db7c4-c4r4g\" (UID: 
\"b7418663-d570-4a1d-8538-7f12b6945e7a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" Apr 24 21:33:31.986404 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.986338 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:33:31.996241 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:31.996212 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" Apr 24 21:33:32.119582 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:32.119552 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-56828"] Apr 24 21:33:32.121546 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:33:32.121519 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58a90a80_d6a0_4e65_a801_9a717bc66500.slice/crio-aff9a2dc7b4866c0699796ffe3d105199465cb41f70b7272932c5382e4f78c20 WatchSource:0}: Error finding container aff9a2dc7b4866c0699796ffe3d105199465cb41f70b7272932c5382e4f78c20: Status 404 returned error can't find the container with id aff9a2dc7b4866c0699796ffe3d105199465cb41f70b7272932c5382e4f78c20 Apr 24 21:33:32.144945 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:32.144922 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g"] Apr 24 21:33:32.147952 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:33:32.147911 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb7418663_d570_4a1d_8538_7f12b6945e7a.slice/crio-297a9c877dd71a3ccebadfcfb09189f98533f74691d4a6c0194fd066b2cf716e WatchSource:0}: Error finding container 297a9c877dd71a3ccebadfcfb09189f98533f74691d4a6c0194fd066b2cf716e: Status 404 returned error can't find the container with id 297a9c877dd71a3ccebadfcfb09189f98533f74691d4a6c0194fd066b2cf716e Apr 
24 21:33:32.657462 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:32.657421 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" event={"ID":"b7418663-d570-4a1d-8538-7f12b6945e7a","Type":"ContainerStarted","Data":"297a9c877dd71a3ccebadfcfb09189f98533f74691d4a6c0194fd066b2cf716e"} Apr 24 21:33:32.658732 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:32.658703 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" event={"ID":"58a90a80-d6a0-4e65-a801-9a717bc66500","Type":"ContainerStarted","Data":"aff9a2dc7b4866c0699796ffe3d105199465cb41f70b7272932c5382e4f78c20"} Apr 24 21:33:35.673102 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:35.673064 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" event={"ID":"b7418663-d570-4a1d-8538-7f12b6945e7a","Type":"ContainerStarted","Data":"7160f162226b4c492238a39bd009711ede9f10e6a0a30b9a428c5250c68fc0c7"} Apr 24 21:33:35.673574 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:35.673208 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" Apr 24 21:33:35.674477 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:35.674458 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" event={"ID":"58a90a80-d6a0-4e65-a801-9a717bc66500","Type":"ContainerStarted","Data":"2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f"} Apr 24 21:33:35.674573 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:35.674562 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:33:35.689536 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:35.689493 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" podStartSLOduration=1.4765918789999999 podStartE2EDuration="4.689482123s" podCreationTimestamp="2026-04-24 21:33:31 +0000 UTC" firstStartedPulling="2026-04-24 21:33:32.149207672 +0000 UTC m=+429.552470646" lastFinishedPulling="2026-04-24 21:33:35.36209792 +0000 UTC m=+432.765360890" observedRunningTime="2026-04-24 21:33:35.688491002 +0000 UTC m=+433.091753996" watchObservedRunningTime="2026-04-24 21:33:35.689482123 +0000 UTC m=+433.092745115" Apr 24 21:33:35.704484 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:33:35.703993 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" podStartSLOduration=1.465883378 podStartE2EDuration="4.70397551s" podCreationTimestamp="2026-04-24 21:33:31 +0000 UTC" firstStartedPulling="2026-04-24 21:33:32.123483418 +0000 UTC m=+429.526746395" lastFinishedPulling="2026-04-24 21:33:35.361575555 +0000 UTC m=+432.764838527" observedRunningTime="2026-04-24 21:33:35.70267572 +0000 UTC m=+433.105938736" watchObservedRunningTime="2026-04-24 21:33:35.70397551 +0000 UTC m=+433.107238504" Apr 24 21:34:06.681027 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:06.680995 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-c4r4g" Apr 24 21:34:06.683997 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:06.683977 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:34:08.473564 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.473526 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-56828"] Apr 24 21:34:08.473947 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.473750 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" 
podUID="58a90a80-d6a0-4e65-a801-9a717bc66500" containerName="manager" containerID="cri-o://2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f" gracePeriod=10 Apr 24 21:34:08.541411 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.541377 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-s8kcz"] Apr 24 21:34:08.566208 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.566182 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-s8kcz"] Apr 24 21:34:08.566367 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.566306 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" Apr 24 21:34:08.682819 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.682792 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a9c569a-092b-42d5-ab79-cc933014c58d-cert\") pod \"kserve-controller-manager-67f77cd7d7-s8kcz\" (UID: \"8a9c569a-092b-42d5-ab79-cc933014c58d\") " pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" Apr 24 21:34:08.682945 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.682840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwfzb\" (UniqueName: \"kubernetes.io/projected/8a9c569a-092b-42d5-ab79-cc933014c58d-kube-api-access-kwfzb\") pod \"kserve-controller-manager-67f77cd7d7-s8kcz\" (UID: \"8a9c569a-092b-42d5-ab79-cc933014c58d\") " pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" Apr 24 21:34:08.739559 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.739538 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:34:08.783537 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.783505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwfzb\" (UniqueName: \"kubernetes.io/projected/8a9c569a-092b-42d5-ab79-cc933014c58d-kube-api-access-kwfzb\") pod \"kserve-controller-manager-67f77cd7d7-s8kcz\" (UID: \"8a9c569a-092b-42d5-ab79-cc933014c58d\") " pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" Apr 24 21:34:08.783754 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.783651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a9c569a-092b-42d5-ab79-cc933014c58d-cert\") pod \"kserve-controller-manager-67f77cd7d7-s8kcz\" (UID: \"8a9c569a-092b-42d5-ab79-cc933014c58d\") " pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" Apr 24 21:34:08.784385 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.784356 2575 generic.go:358] "Generic (PLEG): container finished" podID="58a90a80-d6a0-4e65-a801-9a717bc66500" containerID="2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f" exitCode=0 Apr 24 21:34:08.784495 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.784430 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" Apr 24 21:34:08.784495 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.784440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" event={"ID":"58a90a80-d6a0-4e65-a801-9a717bc66500","Type":"ContainerDied","Data":"2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f"} Apr 24 21:34:08.784495 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.784486 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-56828" event={"ID":"58a90a80-d6a0-4e65-a801-9a717bc66500","Type":"ContainerDied","Data":"aff9a2dc7b4866c0699796ffe3d105199465cb41f70b7272932c5382e4f78c20"} Apr 24 21:34:08.784664 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.784508 2575 scope.go:117] "RemoveContainer" containerID="2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f" Apr 24 21:34:08.786352 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.786329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a9c569a-092b-42d5-ab79-cc933014c58d-cert\") pod \"kserve-controller-manager-67f77cd7d7-s8kcz\" (UID: \"8a9c569a-092b-42d5-ab79-cc933014c58d\") " pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" Apr 24 21:34:08.799645 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.799622 2575 scope.go:117] "RemoveContainer" containerID="2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f" Apr 24 21:34:08.799960 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:34:08.799936 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f\": container with ID starting with 2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f not found: ID does not exist" 
containerID="2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f" Apr 24 21:34:08.800033 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.799973 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f"} err="failed to get container status \"2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f\": rpc error: code = NotFound desc = could not find container \"2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f\": container with ID starting with 2cf28f9e796c72eb06c842614d0b73224f2e19fd0b08181679907969d3e9b01f not found: ID does not exist" Apr 24 21:34:08.801027 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.801003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwfzb\" (UniqueName: \"kubernetes.io/projected/8a9c569a-092b-42d5-ab79-cc933014c58d-kube-api-access-kwfzb\") pod \"kserve-controller-manager-67f77cd7d7-s8kcz\" (UID: \"8a9c569a-092b-42d5-ab79-cc933014c58d\") " pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" Apr 24 21:34:08.875879 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.875842 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" Apr 24 21:34:08.884953 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.884777 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmltc\" (UniqueName: \"kubernetes.io/projected/58a90a80-d6a0-4e65-a801-9a717bc66500-kube-api-access-gmltc\") pod \"58a90a80-d6a0-4e65-a801-9a717bc66500\" (UID: \"58a90a80-d6a0-4e65-a801-9a717bc66500\") " Apr 24 21:34:08.884953 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.884807 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58a90a80-d6a0-4e65-a801-9a717bc66500-cert\") pod \"58a90a80-d6a0-4e65-a801-9a717bc66500\" (UID: \"58a90a80-d6a0-4e65-a801-9a717bc66500\") " Apr 24 21:34:08.886820 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.886791 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a90a80-d6a0-4e65-a801-9a717bc66500-cert" (OuterVolumeSpecName: "cert") pod "58a90a80-d6a0-4e65-a801-9a717bc66500" (UID: "58a90a80-d6a0-4e65-a801-9a717bc66500"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:34:08.886904 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.886853 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a90a80-d6a0-4e65-a801-9a717bc66500-kube-api-access-gmltc" (OuterVolumeSpecName: "kube-api-access-gmltc") pod "58a90a80-d6a0-4e65-a801-9a717bc66500" (UID: "58a90a80-d6a0-4e65-a801-9a717bc66500"). InnerVolumeSpecName "kube-api-access-gmltc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:34:08.985729 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.985662 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmltc\" (UniqueName: \"kubernetes.io/projected/58a90a80-d6a0-4e65-a801-9a717bc66500-kube-api-access-gmltc\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:34:08.985729 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:08.985689 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58a90a80-d6a0-4e65-a801-9a717bc66500-cert\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\"" Apr 24 21:34:09.036547 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:09.036490 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-s8kcz"] Apr 24 21:34:09.039234 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:34:09.039210 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a9c569a_092b_42d5_ab79_cc933014c58d.slice/crio-3bf983b843b11bc2eb43db1a8e23bbafb71720712e6b9800cca95db683be17e1 WatchSource:0}: Error finding container 3bf983b843b11bc2eb43db1a8e23bbafb71720712e6b9800cca95db683be17e1: Status 404 returned error can't find the container with id 3bf983b843b11bc2eb43db1a8e23bbafb71720712e6b9800cca95db683be17e1 Apr 24 21:34:09.112666 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:09.112636 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-56828"] Apr 24 21:34:09.120627 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:09.120596 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-56828"] Apr 24 21:34:09.219766 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:09.219727 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a90a80-d6a0-4e65-a801-9a717bc66500" 
path="/var/lib/kubelet/pods/58a90a80-d6a0-4e65-a801-9a717bc66500/volumes" Apr 24 21:34:09.788736 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:09.788708 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" event={"ID":"8a9c569a-092b-42d5-ab79-cc933014c58d","Type":"ContainerStarted","Data":"3bf983b843b11bc2eb43db1a8e23bbafb71720712e6b9800cca95db683be17e1"} Apr 24 21:34:10.794114 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:10.794078 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" event={"ID":"8a9c569a-092b-42d5-ab79-cc933014c58d","Type":"ContainerStarted","Data":"4134e66e13709c938d374c9d8aeec342b67b9411a4a1c748eedd42a251e6d25b"} Apr 24 21:34:10.794511 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:10.794136 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" Apr 24 21:34:10.848172 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:10.848118 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" podStartSLOduration=2.16407724 podStartE2EDuration="2.848103703s" podCreationTimestamp="2026-04-24 21:34:08 +0000 UTC" firstStartedPulling="2026-04-24 21:34:09.040543546 +0000 UTC m=+466.443806523" lastFinishedPulling="2026-04-24 21:34:09.724570013 +0000 UTC m=+467.127832986" observedRunningTime="2026-04-24 21:34:10.832174251 +0000 UTC m=+468.235437245" watchObservedRunningTime="2026-04-24 21:34:10.848103703 +0000 UTC m=+468.251366737" Apr 24 21:34:41.802178 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:34:41.802146 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-67f77cd7d7-s8kcz" Apr 24 21:35:11.500723 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.500654 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-6bb5dfc886-665j6"] Apr 24 21:35:11.501107 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.501065 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58a90a80-d6a0-4e65-a801-9a717bc66500" containerName="manager" Apr 24 21:35:11.501107 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.501080 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a90a80-d6a0-4e65-a801-9a717bc66500" containerName="manager" Apr 24 21:35:11.501182 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.501143 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="58a90a80-d6a0-4e65-a801-9a717bc66500" containerName="manager" Apr 24 21:35:11.504175 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.504157 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.529016 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.528989 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bb5dfc886-665j6"] Apr 24 21:35:11.613614 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.613582 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-oauth-serving-cert\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.613767 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.613638 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-console-serving-cert\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.613767 ip-10-0-131-58 
kubenswrapper[2575]: I0424 21:35:11.613664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-trusted-ca-bundle\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.613767 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.613719 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-console-config\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.613902 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.613778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7sh8\" (UniqueName: \"kubernetes.io/projected/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-kube-api-access-w7sh8\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.613902 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.613845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-service-ca\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.613988 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.613908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-console-oauth-config\") pod 
\"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.714339 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.714303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7sh8\" (UniqueName: \"kubernetes.io/projected/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-kube-api-access-w7sh8\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.714503 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.714351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-service-ca\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.714557 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.714532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-console-oauth-config\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.714615 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.714600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-oauth-serving-cert\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.714655 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.714646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-console-serving-cert\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.714697 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.714663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-trusted-ca-bundle\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.714749 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.714726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-console-config\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.715096 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.715068 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-service-ca\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.715397 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.715377 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-console-config\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.715506 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.715415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-oauth-serving-cert\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.715566 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.715545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-trusted-ca-bundle\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.716994 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.716972 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-console-serving-cert\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.717090 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.717035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-console-oauth-config\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.724245 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.724228 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7sh8\" (UniqueName: \"kubernetes.io/projected/bfa3e3d6-f531-4c9f-a7ec-38513b33fe70-kube-api-access-w7sh8\") pod \"console-6bb5dfc886-665j6\" (UID: \"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70\") " pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:11.813196 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:11.813120 2575 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:12.155102 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:12.155075 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bb5dfc886-665j6"] Apr 24 21:35:12.157049 ip-10-0-131-58 kubenswrapper[2575]: W0424 21:35:12.157025 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa3e3d6_f531_4c9f_a7ec_38513b33fe70.slice/crio-9967153f1383de72305b34a54978c425d19dd0c3eb151d4471c5b1cde83a39b2 WatchSource:0}: Error finding container 9967153f1383de72305b34a54978c425d19dd0c3eb151d4471c5b1cde83a39b2: Status 404 returned error can't find the container with id 9967153f1383de72305b34a54978c425d19dd0c3eb151d4471c5b1cde83a39b2 Apr 24 21:35:13.004318 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:13.004285 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb5dfc886-665j6" event={"ID":"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70","Type":"ContainerStarted","Data":"20e20385997c352deef5faa76c7645fb208291f842cae5d9166bc349c6fc4b67"} Apr 24 21:35:13.004318 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:13.004324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bb5dfc886-665j6" event={"ID":"bfa3e3d6-f531-4c9f-a7ec-38513b33fe70","Type":"ContainerStarted","Data":"9967153f1383de72305b34a54978c425d19dd0c3eb151d4471c5b1cde83a39b2"} Apr 24 21:35:13.025306 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:13.025239 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bb5dfc886-665j6" podStartSLOduration=2.025226599 podStartE2EDuration="2.025226599s" podCreationTimestamp="2026-04-24 21:35:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:35:13.023144887 +0000 
UTC m=+530.426407880" watchObservedRunningTime="2026-04-24 21:35:13.025226599 +0000 UTC m=+530.428489591" Apr 24 21:35:21.813820 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:21.813785 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:21.813820 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:21.813818 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:21.818443 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:21.818423 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:22.037948 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:22.037921 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bb5dfc886-665j6" Apr 24 21:35:22.083957 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:22.083859 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7657f89d8-knsrl"] Apr 24 21:35:47.104049 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.103987 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7657f89d8-knsrl" podUID="2b052089-9e8d-4e46-8ee9-849d8cc462a2" containerName="console" containerID="cri-o://20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae" gracePeriod=15 Apr 24 21:35:47.343609 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.343585 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7657f89d8-knsrl_2b052089-9e8d-4e46-8ee9-849d8cc462a2/console/0.log" Apr 24 21:35:47.343746 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.343666 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7657f89d8-knsrl" Apr 24 21:35:47.433565 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.433523 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmptm\" (UniqueName: \"kubernetes.io/projected/2b052089-9e8d-4e46-8ee9-849d8cc462a2-kube-api-access-bmptm\") pod \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " Apr 24 21:35:47.433565 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.433573 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-config\") pod \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " Apr 24 21:35:47.433785 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.433596 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-trusted-ca-bundle\") pod \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " Apr 24 21:35:47.433785 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.433645 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-serving-cert\") pod \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " Apr 24 21:35:47.433785 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.433660 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-oauth-serving-cert\") pod \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") " Apr 24 21:35:47.433785 
ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.433681 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-service-ca\") pod \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") "
Apr 24 21:35:47.433785 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.433719 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-oauth-config\") pod \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\" (UID: \"2b052089-9e8d-4e46-8ee9-849d8cc462a2\") "
Apr 24 21:35:47.434070 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.434029 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-config" (OuterVolumeSpecName: "console-config") pod "2b052089-9e8d-4e46-8ee9-849d8cc462a2" (UID: "2b052089-9e8d-4e46-8ee9-849d8cc462a2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:35:47.434138 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.434098 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-service-ca" (OuterVolumeSpecName: "service-ca") pod "2b052089-9e8d-4e46-8ee9-849d8cc462a2" (UID: "2b052089-9e8d-4e46-8ee9-849d8cc462a2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:35:47.434138 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.434107 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2b052089-9e8d-4e46-8ee9-849d8cc462a2" (UID: "2b052089-9e8d-4e46-8ee9-849d8cc462a2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:35:47.434138 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.434114 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2b052089-9e8d-4e46-8ee9-849d8cc462a2" (UID: "2b052089-9e8d-4e46-8ee9-849d8cc462a2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:35:47.435766 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.435737 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2b052089-9e8d-4e46-8ee9-849d8cc462a2" (UID: "2b052089-9e8d-4e46-8ee9-849d8cc462a2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:35:47.435899 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.435791 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b052089-9e8d-4e46-8ee9-849d8cc462a2-kube-api-access-bmptm" (OuterVolumeSpecName: "kube-api-access-bmptm") pod "2b052089-9e8d-4e46-8ee9-849d8cc462a2" (UID: "2b052089-9e8d-4e46-8ee9-849d8cc462a2"). InnerVolumeSpecName "kube-api-access-bmptm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:35:47.435899 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.435790 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2b052089-9e8d-4e46-8ee9-849d8cc462a2" (UID: "2b052089-9e8d-4e46-8ee9-849d8cc462a2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:35:47.535044 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.535001 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bmptm\" (UniqueName: \"kubernetes.io/projected/2b052089-9e8d-4e46-8ee9-849d8cc462a2-kube-api-access-bmptm\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:35:47.535044 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.535038 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-config\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:35:47.535044 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.535052 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-trusted-ca-bundle\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:35:47.535044 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.535061 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-serving-cert\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:35:47.535318 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.535071 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-oauth-serving-cert\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:35:47.535318 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.535080 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b052089-9e8d-4e46-8ee9-849d8cc462a2-service-ca\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:35:47.535318 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:47.535089 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b052089-9e8d-4e46-8ee9-849d8cc462a2-console-oauth-config\") on node \"ip-10-0-131-58.ec2.internal\" DevicePath \"\""
Apr 24 21:35:48.127590 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:48.127562 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7657f89d8-knsrl_2b052089-9e8d-4e46-8ee9-849d8cc462a2/console/0.log"
Apr 24 21:35:48.128066 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:48.127600 2575 generic.go:358] "Generic (PLEG): container finished" podID="2b052089-9e8d-4e46-8ee9-849d8cc462a2" containerID="20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae" exitCode=2
Apr 24 21:35:48.128066 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:48.127636 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7657f89d8-knsrl" event={"ID":"2b052089-9e8d-4e46-8ee9-849d8cc462a2","Type":"ContainerDied","Data":"20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae"}
Apr 24 21:35:48.128066 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:48.127675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7657f89d8-knsrl" event={"ID":"2b052089-9e8d-4e46-8ee9-849d8cc462a2","Type":"ContainerDied","Data":"0f0c2f3c7b8bbd41f03417bdef650e1e876958e952cadaa04bef11c3772ca66e"}
Apr 24 21:35:48.128066 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:48.127693 2575 scope.go:117] "RemoveContainer" containerID="20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae"
Apr 24 21:35:48.128066 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:48.127698 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7657f89d8-knsrl"
Apr 24 21:35:48.135911 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:48.135891 2575 scope.go:117] "RemoveContainer" containerID="20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae"
Apr 24 21:35:48.136161 ip-10-0-131-58 kubenswrapper[2575]: E0424 21:35:48.136143 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae\": container with ID starting with 20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae not found: ID does not exist" containerID="20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae"
Apr 24 21:35:48.136227 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:48.136175 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae"} err="failed to get container status \"20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae\": rpc error: code = NotFound desc = could not find container \"20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae\": container with ID starting with 20ded4ed88325f4c4d79738a3f29411f80364cf7d8cde2866077169be6f0b2ae not found: ID does not exist"
Apr 24 21:35:48.151106 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:48.151081 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7657f89d8-knsrl"]
Apr 24 21:35:48.160120 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:48.160098 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7657f89d8-knsrl"]
Apr 24 21:35:49.219192 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:35:49.219155 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b052089-9e8d-4e46-8ee9-849d8cc462a2" path="/var/lib/kubelet/pods/2b052089-9e8d-4e46-8ee9-849d8cc462a2/volumes"
Apr 24 21:36:23.154925 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:36:23.154894 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:36:23.156011 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:36:23.155992 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:36:23.158328 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:36:23.158302 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 21:36:23.159272 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:36:23.159245 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 21:41:23.180474 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:41:23.180391 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:41:23.183570 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:41:23.183524 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:41:23.184552 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:41:23.184522 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 21:41:23.187195 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:41:23.187178 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 21:46:23.207792 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:46:23.207762 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:46:23.211000 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:46:23.210978 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 21:46:23.211736 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:46:23.211719 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:46:23.215731 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:46:23.215707 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 21:51:23.233816 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:51:23.233786 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:51:23.236822 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:51:23.236802 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 21:51:23.240818 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:51:23.240799 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:51:23.243926 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:51:23.243906 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 21:56:23.258765 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:56:23.258736 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:56:23.261803 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:56:23.261777 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 21:56:23.267116 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:56:23.267094 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 21:56:23.270354 ip-10-0-131-58 kubenswrapper[2575]: I0424 21:56:23.270335 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:01:23.284011 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:01:23.283983 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:01:23.287393 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:01:23.287368 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:01:23.293984 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:01:23.293966 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:01:23.297099 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:01:23.297084 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:06:23.309852 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:06:23.309822 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:06:23.312958 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:06:23.312938 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:06:23.320404 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:06:23.320376 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:06:23.323342 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:06:23.323324 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:11:23.335134 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:11:23.335056 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:11:23.338139 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:11:23.338119 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:11:23.346583 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:11:23.346565 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:11:23.349642 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:11:23.349612 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:16:23.359616 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:16:23.359588 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:16:23.362940 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:16:23.362919 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:16:23.373007 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:16:23.372987 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:16:23.375882 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:16:23.375866 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:21:23.384128 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:21:23.384102 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:21:23.387649 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:21:23.387627 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:21:23.398182 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:21:23.398163 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:21:23.401140 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:21:23.401125 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:26:23.409062 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:26:23.409034 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:26:23.412384 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:26:23.412366 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:26:23.424112 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:26:23.424091 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:26:23.427056 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:26:23.427040 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:31:23.435645 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:23.435616 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:31:23.439214 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:23.439185 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:31:23.449997 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:23.449976 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:31:23.452766 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:23.452745 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log"
Apr 24 22:31:38.341812 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:38.341783 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dlzdh_d9049978-94b6-422d-bab2-7c826163ffc7/global-pull-secret-syncer/0.log"
Apr 24 22:31:38.447911 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:38.447872 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-58m8w_5259dcbe-abac-4ee3-bd35-66dab5614ebd/konnectivity-agent/0.log"
Apr 24 22:31:38.574214 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:38.574180 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-58.ec2.internal_9ec5360dd3ff0d724086a47bd35d554c/haproxy/0.log"
Apr 24 22:31:41.892522 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:41.892491 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_432ff1c4-1164-4013-8d70-0837a758df1e/alertmanager/0.log"
Apr 24 22:31:41.916725 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:41.916697 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_432ff1c4-1164-4013-8d70-0837a758df1e/config-reloader/0.log"
Apr 24 22:31:41.942821 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:41.942793 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_432ff1c4-1164-4013-8d70-0837a758df1e/kube-rbac-proxy-web/0.log"
Apr 24 22:31:41.964187 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:41.964161 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_432ff1c4-1164-4013-8d70-0837a758df1e/kube-rbac-proxy/0.log"
Apr 24 22:31:41.984636 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:41.984612 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_432ff1c4-1164-4013-8d70-0837a758df1e/kube-rbac-proxy-metric/0.log"
Apr 24 22:31:42.013944 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.013915 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_432ff1c4-1164-4013-8d70-0837a758df1e/prom-label-proxy/0.log"
Apr 24 22:31:42.041421 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.041396 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_432ff1c4-1164-4013-8d70-0837a758df1e/init-config-reloader/0.log"
Apr 24 22:31:42.106428 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.106401 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jrjgl_6e5f2196-7905-4092-b23c-1f63b39dc528/kube-state-metrics/0.log"
Apr 24 22:31:42.126219 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.126194 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jrjgl_6e5f2196-7905-4092-b23c-1f63b39dc528/kube-rbac-proxy-main/0.log"
Apr 24 22:31:42.148733 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.148676 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jrjgl_6e5f2196-7905-4092-b23c-1f63b39dc528/kube-rbac-proxy-self/0.log"
Apr 24 22:31:42.177704 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.177681 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-54b7659cfc-zjvjv_80c96c6d-7367-4d83-8102-18bfbb2ad8c8/metrics-server/0.log"
Apr 24 22:31:42.311599 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.311570 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kkghd_14046a77-27b2-4686-90d9-2b6f59d97707/node-exporter/0.log"
Apr 24 22:31:42.331002 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.330977 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kkghd_14046a77-27b2-4686-90d9-2b6f59d97707/kube-rbac-proxy/0.log"
Apr 24 22:31:42.355359 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.355335 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-kkghd_14046a77-27b2-4686-90d9-2b6f59d97707/init-textfile/0.log"
Apr 24 22:31:42.463330 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.463226 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t4gjz_cf2ad258-2bb0-493a-98f7-41c11632fd79/kube-rbac-proxy-main/0.log"
Apr 24 22:31:42.484898 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.484860 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t4gjz_cf2ad258-2bb0-493a-98f7-41c11632fd79/kube-rbac-proxy-self/0.log"
Apr 24 22:31:42.507590 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.507564 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t4gjz_cf2ad258-2bb0-493a-98f7-41c11632fd79/openshift-state-metrics/0.log"
Apr 24 22:31:42.762835 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.762764 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-xghjh_8edc3df6-4a9e-45bf-bfea-3bbff392cd1d/prometheus-operator-admission-webhook/0.log"
Apr 24 22:31:42.868403 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.868375 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/thanos-query/0.log"
Apr 24 22:31:42.889795 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.889771 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/kube-rbac-proxy-web/0.log"
Apr 24 22:31:42.910352 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.910327 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/kube-rbac-proxy/0.log"
Apr 24 22:31:42.932995 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.932972 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/prom-label-proxy/0.log"
Apr 24 22:31:42.954497 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.954444 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/kube-rbac-proxy-rules/0.log"
Apr 24 22:31:42.978817 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:42.978797 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79bfcc9858-2dmtl_cf5c2fee-0804-4913-a2a0-ee03634c1f56/kube-rbac-proxy-metrics/0.log"
Apr 24 22:31:44.111271 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:44.111230 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-mdm8g_c652c05d-6547-4c43-a295-52b3275ef5e0/networking-console-plugin/0.log"
Apr 24 22:31:44.524245 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:44.524216 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/2.log"
Apr 24 22:31:44.528686 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:44.528665 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mksv7_057fa386-6c87-478a-91d9-c2293ba0617c/console-operator/3.log"
Apr 24 22:31:44.906286 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:44.906241 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bb5dfc886-665j6_bfa3e3d6-f531-4c9f-a7ec-38513b33fe70/console/0.log"
Apr 24 22:31:44.941413 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:44.941389 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-dp4l7_ea0e6e62-3775-4664-9ee5-50d8d29af322/download-server/0.log"
Apr 24 22:31:45.424910 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.424880 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"]
Apr 24 22:31:45.425303 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.425286 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b052089-9e8d-4e46-8ee9-849d8cc462a2" containerName="console"
Apr 24 22:31:45.425382 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.425305 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b052089-9e8d-4e46-8ee9-849d8cc462a2" containerName="console"
Apr 24 22:31:45.425434 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.425404 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b052089-9e8d-4e46-8ee9-849d8cc462a2" containerName="console"
Apr 24 22:31:45.428554 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.428534 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.430626 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.430600 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-j4vvx\"/\"openshift-service-ca.crt\""
Apr 24 22:31:45.430725 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.430600 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-j4vvx\"/\"default-dockercfg-c2k9q\""
Apr 24 22:31:45.431413 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.431391 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-j4vvx\"/\"kube-root-ca.crt\""
Apr 24 22:31:45.436773 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.436749 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"]
Apr 24 22:31:45.616222 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.616188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-lib-modules\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.616222 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.616222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-sys\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.616451 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.616275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbd4k\" (UniqueName: \"kubernetes.io/projected/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-kube-api-access-zbd4k\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.616451 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.616340 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-podres\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.616451 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.616383 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-proc\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.717671 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.717589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-podres\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.717671 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.717629 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-proc\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.717885 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.717700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-lib-modules\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.717885 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.717722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-sys\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.717885 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.717768 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-podres\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.717885 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.717781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-proc\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.717885 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.717770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbd4k\" (UniqueName: \"kubernetes.io/projected/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-kube-api-access-zbd4k\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.717885 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.717829 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-sys\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.717885 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.717854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-lib-modules\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.725765 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.725735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbd4k\" (UniqueName: \"kubernetes.io/projected/cbe384e9-ecb0-4b13-8979-6b4bba37bfee-kube-api-access-zbd4k\") pod \"perf-node-gather-daemonset-nnf6w\" (UID: \"cbe384e9-ecb0-4b13-8979-6b4bba37bfee\") " pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.739669 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.739649 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
Apr 24 22:31:45.860202 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.860175 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"]
Apr 24 22:31:45.862396 ip-10-0-131-58 kubenswrapper[2575]: W0424 22:31:45.862367 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcbe384e9_ecb0_4b13_8979_6b4bba37bfee.slice/crio-cc543637dd6a82bd4b3ee3770de1fad34489bd57ccff808f15609fdd9376e7bd WatchSource:0}: Error finding container cc543637dd6a82bd4b3ee3770de1fad34489bd57ccff808f15609fdd9376e7bd: Status 404 returned error can't find the container with id cc543637dd6a82bd4b3ee3770de1fad34489bd57ccff808f15609fdd9376e7bd
Apr 24 22:31:45.864131 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:45.864110 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:31:46.018470 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:46.018411 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lrxx8_5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33/dns/0.log"
Apr 24 22:31:46.037992 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:46.037973 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lrxx8_5f2b71f9-7480-4ae7-96bd-b2fa0ef3ae33/kube-rbac-proxy/0.log"
Apr 24 22:31:46.151789 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:46.151756 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r2v97_92515838-c368-40ef-9e3e-40f753dd0308/dns-node-resolver/0.log"
Apr 24 22:31:46.567400 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:46.567361 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w"
event={"ID":"cbe384e9-ecb0-4b13-8979-6b4bba37bfee","Type":"ContainerStarted","Data":"82b825cdff834d68717f8fad9ecda4e5e198546ba01d38eadd4e5556dc86e17e"} Apr 24 22:31:46.567400 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:46.567402 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w" event={"ID":"cbe384e9-ecb0-4b13-8979-6b4bba37bfee","Type":"ContainerStarted","Data":"cc543637dd6a82bd4b3ee3770de1fad34489bd57ccff808f15609fdd9376e7bd"} Apr 24 22:31:46.567823 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:46.567454 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w" Apr 24 22:31:46.585357 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:46.585115 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w" podStartSLOduration=1.585097357 podStartE2EDuration="1.585097357s" podCreationTimestamp="2026-04-24 22:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:31:46.584949459 +0000 UTC m=+3923.988212458" watchObservedRunningTime="2026-04-24 22:31:46.585097357 +0000 UTC m=+3923.988360352" Apr 24 22:31:46.619614 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:46.619544 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j8p2r_9062ddad-735b-4bad-80c5-03e7de6d3add/node-ca/0.log" Apr 24 22:31:47.660905 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:47.660868 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6hc5g_7eb5a6c5-93dc-4c4b-a6ab-b457966b4540/serve-healthcheck-canary/0.log" Apr 24 22:31:48.192913 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:48.192887 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-znrss_f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5/kube-rbac-proxy/0.log" Apr 24 22:31:48.214939 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:48.214908 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-znrss_f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5/exporter/0.log" Apr 24 22:31:48.234097 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:48.234072 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-znrss_f4902d02-2d55-4fcb-b3bc-5f3dd8f847a5/extractor/0.log" Apr 24 22:31:50.160649 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:50.160612 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-67f77cd7d7-s8kcz_8a9c569a-092b-42d5-ab79-cc933014c58d/manager/0.log" Apr 24 22:31:50.180570 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:50.180538 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-c4r4g_b7418663-d570-4a1d-8538-7f12b6945e7a/manager/0.log" Apr 24 22:31:52.581094 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:52.581065 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-j4vvx/perf-node-gather-daemonset-nnf6w" Apr 24 22:31:54.629171 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:54.629138 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-8t597_4a70399d-9352-4aae-bc56-8a355a0872ff/migrator/0.log" Apr 24 22:31:54.651883 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:54.651853 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-8t597_4a70399d-9352-4aae-bc56-8a355a0872ff/graceful-termination/0.log" Apr 24 22:31:55.986677 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:55.986648 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhmlv_8de213de-0f31-4d5f-9d53-9ba716ac7760/kube-multus-additional-cni-plugins/0.log" Apr 24 22:31:56.013106 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:56.013081 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhmlv_8de213de-0f31-4d5f-9d53-9ba716ac7760/egress-router-binary-copy/0.log" Apr 24 22:31:56.040968 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:56.040947 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhmlv_8de213de-0f31-4d5f-9d53-9ba716ac7760/cni-plugins/0.log" Apr 24 22:31:56.063212 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:56.063192 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhmlv_8de213de-0f31-4d5f-9d53-9ba716ac7760/bond-cni-plugin/0.log" Apr 24 22:31:56.089010 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:56.088988 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhmlv_8de213de-0f31-4d5f-9d53-9ba716ac7760/routeoverride-cni/0.log" Apr 24 22:31:56.110347 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:56.110321 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhmlv_8de213de-0f31-4d5f-9d53-9ba716ac7760/whereabouts-cni-bincopy/0.log" Apr 24 22:31:56.131782 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:56.131759 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fhmlv_8de213de-0f31-4d5f-9d53-9ba716ac7760/whereabouts-cni/0.log" Apr 24 22:31:56.527957 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:56.527930 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g7w5g_d0286bfd-3ba9-4b8c-839e-f415766385d0/kube-multus/0.log" Apr 24 22:31:56.699072 
ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:56.699041 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nz6dq_3ad3b358-912b-477a-8fc3-6f2910580c33/network-metrics-daemon/0.log" Apr 24 22:31:56.720576 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:56.720540 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nz6dq_3ad3b358-912b-477a-8fc3-6f2910580c33/kube-rbac-proxy/0.log" Apr 24 22:31:57.474091 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:57.474061 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-controller/0.log" Apr 24 22:31:57.503354 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:57.503329 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/0.log" Apr 24 22:31:57.520815 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:57.520799 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovn-acl-logging/1.log" Apr 24 22:31:57.544318 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:57.544297 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/kube-rbac-proxy-node/0.log" Apr 24 22:31:57.577110 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:57.577088 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 22:31:57.600911 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:57.600887 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/northd/0.log" Apr 24 
22:31:57.631620 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:57.631594 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/nbdb/0.log" Apr 24 22:31:57.669402 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:57.669370 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/sbdb/0.log" Apr 24 22:31:57.776546 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:57.776472 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-78cw4_8d10bb71-bb18-4fc8-8721-420e294ce6ab/ovnkube-controller/0.log" Apr 24 22:31:59.524515 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:59.524484 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-4nlnm_c877e9e0-0af6-4ded-a68a-810fa0ab4f8e/check-endpoints/0.log" Apr 24 22:31:59.573388 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:31:59.573361 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-8cswd_42ee90c2-09f5-4464-a75c-62352a375c5a/network-check-target-container/0.log" Apr 24 22:32:00.494446 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:32:00.494411 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-zs4vw_9cd5c45c-4069-4996-a64a-d57b60694538/iptables-alerter/0.log" Apr 24 22:32:01.106024 ip-10-0-131-58 kubenswrapper[2575]: I0424 22:32:01.105987 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-54pzv_e9ea7b9c-f49c-4584-aabf-ed26a2c488b9/tuned/0.log"