Apr 22 18:20:36.163493 ip-10-0-140-74 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:20:36.649461 ip-10-0-140-74 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:36.649461 ip-10-0-140-74 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:20:36.649461 ip-10-0-140-74 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:36.649461 ip-10-0-140-74 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:20:36.649888 ip-10-0-140-74 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:36.650603 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.650544 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:20:36.653629 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653615 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:36.653629 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653629 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653633 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653637 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653640 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653643 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653646 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653649 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653653 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653656 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653659 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653662 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653665 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653668 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653672 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653676 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653679 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653682 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653685 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653688 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:36.653689 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653697 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653701 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653704 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653707 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653709 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653712 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653715 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653718 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653721 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653725 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653728 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653731 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653733 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653736 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653739 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653741 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653744 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653746 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653749 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:36.654140 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653751 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653754 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653756 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653759 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653761 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653764 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653766 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653769 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653771 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653774 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653776 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653779 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653781 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653784 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653786 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653790 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653793 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653796 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653799 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653801 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:36.654612 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653804 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653807 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653810 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653812 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653815 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653817 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653820 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653823 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653826 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653829 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653832 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653834 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653837 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653840 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653842 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653845 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653848 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653850 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653853 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:36.655099 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653855 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653858 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653861 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653864 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653866 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653869 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653871 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.653874 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654223 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654229 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654232 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654235 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654238 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654240 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654256 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654258 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654261 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654264 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654266 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654269 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654272 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:36.655604 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654274 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654277 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654280 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654282 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654285 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654287 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654290 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654293 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654296 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654299 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654301 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654304 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654306 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654310 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654312 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654315 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654318 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654320 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654323 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654326 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:36.656077 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654329 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654331 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654334 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654338 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654343 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654345 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654348 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654350 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654353 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654355 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654358 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654361 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654363 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654365 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654368 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654370 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654373 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654375 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654378 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654380 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:36.656591 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654383 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654386 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654388 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654391 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654393 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654396 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654399 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654402 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654405 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654407 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654410 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654413 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654416 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654418 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654421 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654423 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654426 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654428 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654433 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654437 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:36.657070 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654439 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654442 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654445 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654448 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654451 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654453 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654456 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654458 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654461 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654463 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654465 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654468 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.654471 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655939 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655949 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655955 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655960 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655967 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655970 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655974 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655978 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:20:36.657555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655982 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655985 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655989 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655992 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655995 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.655998 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656001 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656004 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656007 2575 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656010 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656013 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656019 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656022 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656025 2575 flags.go:64] FLAG: --config-dir=""
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656028 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656031 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656035 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656038 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656043 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656046 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656049 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656052 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656055 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656058 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656061 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:20:36.658077 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656066 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656070 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656074 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656076 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656080 2575 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656083 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656087 2575 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656090 2575 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656094 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656097 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656100 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656104 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656106 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656110 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656113 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656116 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656119 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656122 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656125 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656128 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656131 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656134 2575 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656138 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]:
I0422 18:20:36.656141 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656144 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:20:36.658690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656149 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656152 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656155 2575 flags.go:64] FLAG: --help="false" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656158 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-140-74.ec2.internal" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656161 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656164 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656167 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656170 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656174 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656178 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656181 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656184 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 
18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656187 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656190 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656194 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656197 2575 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656200 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656202 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656205 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656208 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656211 2575 flags.go:64] FLAG: --lock-file="" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656214 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656217 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656220 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:20:36.659341 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656225 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656228 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656231 2575 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656234 2575 flags.go:64] FLAG: --logging-format="text" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656237 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656251 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656257 2575 flags.go:64] FLAG: --manifest-url="" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656260 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656266 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656269 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656273 2575 flags.go:64] FLAG: --max-pods="110" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656276 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656279 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656282 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656285 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656288 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656291 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:20:36.659892 ip-10-0-140-74 
kubenswrapper[2575]: I0422 18:20:36.656294 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656302 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656305 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656308 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656311 2575 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656314 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:20:36.659892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656328 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656332 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656336 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656339 2575 flags.go:64] FLAG: --port="10250" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656342 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656344 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03cc6b392ce21203f" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656348 2575 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656351 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:20:36.660516 
ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656354 2575 flags.go:64] FLAG: --register-node="true" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656357 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656359 2575 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656363 2575 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656366 2575 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656369 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656372 2575 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656376 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656380 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656384 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656387 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656390 2575 flags.go:64] FLAG: --runonce="false" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656393 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656396 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656399 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:20:36.660516 
ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656402 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656405 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656408 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:20:36.660516 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656411 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656414 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656417 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656420 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656423 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656426 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656429 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656432 2575 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656435 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656440 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656443 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656446 2575 
flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656450 2575 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656453 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656455 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656458 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656461 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656464 2575 flags.go:64] FLAG: --v="2" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656468 2575 flags.go:64] FLAG: --version="false" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656472 2575 flags.go:64] FLAG: --vmodule="" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656476 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.656480 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656573 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656578 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:20:36.661177 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656581 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656584 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages 
Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656587 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656590 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656593 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656596 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656599 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656602 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656605 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656607 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656610 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656614 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656616 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656619 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656622 2575 feature_gate.go:328] unrecognized 
feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656624 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656627 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656629 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656632 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656635 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:20:36.661756 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656638 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656640 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656643 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656645 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656648 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656650 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656653 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:20:36.662288 ip-10-0-140-74 
kubenswrapper[2575]: W0422 18:20:36.656655 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656658 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656661 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656664 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656668 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656671 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656673 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656676 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656679 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656681 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656684 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656687 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656689 2575 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 22 18:20:36.662288 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656692 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656694 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656697 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656700 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656703 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656706 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656709 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656711 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656714 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656717 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656720 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656724 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656727 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656729 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656732 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656735 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656737 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656740 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656742 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656745 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:20:36.662777 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656748 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656752 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656756 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656760 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656763 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656766 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656769 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656772 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656774 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656777 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656779 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656782 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656784 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656787 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656789 2575 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656792 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656794 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656798 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656800 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:20:36.663325 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656803 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:20:36.663785 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656805 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:20:36.663785 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656808 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:20:36.663785 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656810 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:20:36.663785 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.656813 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:20:36.663785 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.657540 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:20:36.664062 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.664045 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:20:36.664092 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.664062 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:20:36.664121 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664107 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:20:36.664121 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664112 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:20:36.664121 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664115 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:20:36.664121 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664118 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:20:36.664121 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664121 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664124 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664127 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664130 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664133 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664136 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:20:36.664257 ip-10-0-140-74 
kubenswrapper[2575]: W0422 18:20:36.664138 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664141 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664145 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664149 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664152 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664155 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664159 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664161 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664164 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664167 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664170 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664172 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664176 2575 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:20:36.664257 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664178 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664181 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664184 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664186 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664189 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664191 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664194 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664196 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664199 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664201 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664204 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664206 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 
18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664208 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664211 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664214 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664218 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664222 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664226 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664228 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664231 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:20:36.664758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664234 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664236 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664239 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664256 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664260 2575 feature_gate.go:328] unrecognized 
feature gate: RouteAdvertisements Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664263 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664265 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664268 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664271 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664274 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664276 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664279 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664282 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664285 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664287 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664290 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664293 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664295 2575 feature_gate.go:328] 
unrecognized feature gate: UpgradeStatus Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664298 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664300 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:20:36.665260 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664303 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664306 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664308 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664310 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664313 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664316 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664324 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664330 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664333 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664336 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:20:36.665745 ip-10-0-140-74 
kubenswrapper[2575]: W0422 18:20:36.664339 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664342 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664344 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664347 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664349 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664352 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664355 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664357 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664360 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:20:36.665745 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664362 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664365 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664367 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664370 2575 feature_gate.go:328] 
unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.664375 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664471 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664476 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664479 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664481 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664484 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664487 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664490 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664493 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:20:36.666204 ip-10-0-140-74 
kubenswrapper[2575]: W0422 18:20:36.664495 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664498 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:20:36.666204 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664500 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664503 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664506 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664508 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664511 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664515 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664519 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664522 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664524 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664527 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664529 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664533 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664537 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664539 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664543 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664546 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664548 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664551 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664554 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:20:36.666573 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664556 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664559 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664561 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664564 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664566 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664570 2575 
feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664572 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664575 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664577 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664580 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664582 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664584 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664587 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664590 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664592 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664595 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664597 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664600 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:20:36.667038 ip-10-0-140-74 
kubenswrapper[2575]: W0422 18:20:36.664603 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664606 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:20:36.667038 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664609 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664611 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664613 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664616 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664619 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664621 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664623 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664626 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664629 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664631 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664634 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:20:36.667531 
ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664636 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664638 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664658 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664662 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664665 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664668 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664670 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664674 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664676 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:20:36.667531 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664679 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664682 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664684 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664686 2575 feature_gate.go:328] unrecognized feature gate: 
MixedCPUsAllocation Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664689 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664692 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664694 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664697 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664699 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664701 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664704 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664707 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664709 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664712 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664714 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664716 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:20:36.668000 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:36.664719 2575 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImages Apr 22 18:20:36.668397 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.664724 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:20:36.668397 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.664811 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:20:36.668943 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.668929 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:20:36.670265 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.670255 2575 server.go:1019] "Starting client certificate rotation" Apr 22 18:20:36.670363 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.670347 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:20:36.670989 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.670979 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:20:36.699658 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.699642 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:20:36.704584 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.704558 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:20:36.721917 ip-10-0-140-74 kubenswrapper[2575]: I0422 
18:20:36.721898 2575 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:20:36.727830 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.727813 2575 log.go:25] "Validated CRI v1 image API" Apr 22 18:20:36.728557 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.728541 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:20:36.729483 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.729462 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:20:36.733625 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.733607 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7b1317e6-36b2-4dc8-9f0c-5d1cb7725a3d:/dev/nvme0n1p4 f8c2c2a4-3bc3-4796-bffd-49557b94f1de:/dev/nvme0n1p3] Apr 22 18:20:36.733686 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.733624 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:20:36.739127 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.739007 2575 manager.go:217] Machine: {Timestamp:2026-04-22 18:20:36.737180973 +0000 UTC m=+0.446526618 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097082 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec256a9c9fa1d6686db12160a2296fd7 
SystemUUID:ec256a9c-9fa1-d668-6db1-2160a2296fd7 BootID:224433a0-c5a6-4a62-b9e1-8ce033e65615 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:04:21:0d:65:1f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:04:21:0d:65:1f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:76:0b:a6:08:e0:ad Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] 
Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:20:36.739593 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.739583 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 22 18:20:36.739684 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.739673 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:20:36.740774 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.740749 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:20:36.740915 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.740775 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-74.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","P
ercentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:20:36.740957 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.740921 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:20:36.740957 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.740929 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:20:36.740957 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.740941 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:20:36.742000 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.741990 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:20:36.743399 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.743389 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:20:36.743583 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.743574 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:20:36.746210 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.746196 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:20:36.746254 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.746215 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:20:36.746254 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.746226 2575 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 22 18:20:36.746254 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.746235 2575 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:20:36.746374 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.746256 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:20:36.747396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.747384 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:20:36.747444 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.747402 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:20:36.752270 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.752236 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:20:36.756367 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.756345 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:20:36.757850 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757835 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:20:36.757933 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757855 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:20:36.757933 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757866 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:20:36.757933 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757891 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:20:36.757933 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757915 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:20:36.757933 
ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757923 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:20:36.757933 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757929 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:20:36.757933 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757935 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:20:36.758118 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757948 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:20:36.758118 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757955 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:20:36.758118 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757969 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:20:36.758118 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.757984 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:20:36.759725 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.759711 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:20:36.759763 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.759742 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:20:36.762183 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.762164 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:20:36.762410 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.762387 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-74.ec2.internal\" is forbidden: 
User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:20:36.763379 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.763365 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:20:36.763423 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.763399 2575 server.go:1295] "Started kubelet" Apr 22 18:20:36.763507 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.763484 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:20:36.763555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.763482 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:20:36.763555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.763531 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:20:36.764082 ip-10-0-140-74 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:20:36.765416 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.765323 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:20:36.766069 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.766055 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:20:36.770369 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.770351 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:20:36.770971 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.770956 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:20:36.771156 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.771131 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-74.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:20:36.771751 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.771737 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:20:36.771751 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.771753 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:20:36.771834 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.771817 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:20:36.771872 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.771863 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:20:36.771872 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.771872 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:20:36.771945 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.771923 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:36.772036 
ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.772026 2575 factory.go:55] Registering systemd factory Apr 22 18:20:36.772093 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.772044 2575 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:20:36.772143 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.771040 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-74.ec2.internal.18a8c0cc510dc2a4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-74.ec2.internal,UID:ip-10-0-140-74.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-74.ec2.internal,},FirstTimestamp:2026-04-22 18:20:36.76337834 +0000 UTC m=+0.472723984,LastTimestamp:2026-04-22 18:20:36.76337834 +0000 UTC m=+0.472723984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-74.ec2.internal,}" Apr 22 18:20:36.772220 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.772210 2575 factory.go:153] Registering CRI-O factory Apr 22 18:20:36.772220 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.772221 2575 factory.go:223] Registration of the crio container factory successfully Apr 22 18:20:36.772521 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.772476 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:20:36.772521 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.772505 2575 factory.go:103] Registering Raw factory Apr 22 
18:20:36.772648 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.772541 2575 manager.go:1196] Started watching for new ooms in manager Apr 22 18:20:36.772648 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.772602 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:20:36.773091 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.772980 2575 manager.go:319] Starting recovery of all containers Apr 22 18:20:36.778514 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.778493 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:20:36.778901 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.778875 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-74.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:20:36.784395 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.784365 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-79dd2" Apr 22 18:20:36.784891 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.784870 2575 manager.go:324] Recovery completed Apr 22 18:20:36.789394 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.789381 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:36.791576 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.791561 2575 kubelet_node_status.go:736] "Recording event message 
for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:36.791637 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.791586 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:36.791637 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.791595 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:36.792072 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.792058 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:20:36.792072 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.792070 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:20:36.792156 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.792086 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:20:36.793305 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.793288 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-79dd2" Apr 22 18:20:36.794817 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.794762 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-74.ec2.internal.18a8c0cc52bc00d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-74.ec2.internal,UID:ip-10-0-140-74.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-74.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-74.ec2.internal,},FirstTimestamp:2026-04-22 18:20:36.791574738 +0000 UTC m=+0.500920382,LastTimestamp:2026-04-22 
18:20:36.791574738 +0000 UTC m=+0.500920382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-74.ec2.internal,}" Apr 22 18:20:36.795128 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.795113 2575 policy_none.go:49] "None policy: Start" Apr 22 18:20:36.795182 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.795130 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:20:36.795182 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.795140 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:20:36.831779 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.831764 2575 manager.go:341] "Starting Device Plugin manager" Apr 22 18:20:36.858891 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.831827 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:20:36.858891 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.831840 2575 server.go:85] "Starting device plugin registration server" Apr 22 18:20:36.858891 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.832035 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:20:36.858891 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.832046 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:20:36.858891 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.832145 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:20:36.858891 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.832223 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:20:36.858891 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.832233 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:20:36.858891 ip-10-0-140-74 
kubenswrapper[2575]: E0422 18:20:36.832624 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:20:36.858891 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.832653 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:36.904096 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.904073 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:20:36.905207 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.905187 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:20:36.905292 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.905212 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:20:36.905292 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.905227 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:20:36.905292 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.905234 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:20:36.905398 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.905277 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:20:36.907310 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.907293 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:36.932852 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.932834 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:36.933512 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.933496 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:36.933587 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.933525 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:36.933587 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.933537 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:36.933587 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.933556 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-74.ec2.internal" Apr 22 18:20:36.940348 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:36.940335 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-74.ec2.internal" Apr 22 18:20:36.940393 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.940353 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-74.ec2.internal\": node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:36.960478 
ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:36.960454 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:37.006407 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.006378 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-74.ec2.internal"] Apr 22 18:20:37.006460 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.006447 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:37.007173 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.007158 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:37.007233 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.007186 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:37.007233 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.007199 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:37.009313 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.009300 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:37.009447 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.009432 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.009506 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.009472 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:37.009926 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.009898 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:37.009926 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.009926 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:37.009926 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.009899 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:37.010084 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.009936 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:37.010084 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.009965 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:37.010084 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.009978 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:37.011931 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.011914 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.012005 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.011940 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:37.012718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.012702 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:37.012805 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.012729 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:37.012805 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.012743 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:37.032788 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.032770 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-74.ec2.internal\" not found" node="ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.037239 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.037225 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-74.ec2.internal\" not found" node="ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.060814 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.060796 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:37.072694 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.072675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f6884c0e78773f3a19b1cc8342e7247-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal\" (UID: \"9f6884c0e78773f3a19b1cc8342e7247\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.072770 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.072698 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f6884c0e78773f3a19b1cc8342e7247-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal\" (UID: \"9f6884c0e78773f3a19b1cc8342e7247\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.072770 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.072715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/075634087ab8a39514ea4cf278614518-config\") pod \"kube-apiserver-proxy-ip-10-0-140-74.ec2.internal\" (UID: \"075634087ab8a39514ea4cf278614518\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.161496 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.161470 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:37.173858 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.173839 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f6884c0e78773f3a19b1cc8342e7247-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal\" (UID: \"9f6884c0e78773f3a19b1cc8342e7247\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.173916 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.173864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9f6884c0e78773f3a19b1cc8342e7247-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal\" (UID: \"9f6884c0e78773f3a19b1cc8342e7247\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.173916 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.173885 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/075634087ab8a39514ea4cf278614518-config\") pod \"kube-apiserver-proxy-ip-10-0-140-74.ec2.internal\" (UID: \"075634087ab8a39514ea4cf278614518\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.173916 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.173910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/075634087ab8a39514ea4cf278614518-config\") pod \"kube-apiserver-proxy-ip-10-0-140-74.ec2.internal\" (UID: \"075634087ab8a39514ea4cf278614518\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.174070 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.173947 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f6884c0e78773f3a19b1cc8342e7247-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal\" (UID: \"9f6884c0e78773f3a19b1cc8342e7247\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.174070 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.173947 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f6884c0e78773f3a19b1cc8342e7247-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal\" (UID: \"9f6884c0e78773f3a19b1cc8342e7247\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.262238 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.262183 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:37.334753 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.334725 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.339582 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.339563 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-74.ec2.internal" Apr 22 18:20:37.362934 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.362904 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:37.463438 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.463405 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:37.564015 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.563961 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:37.664466 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.664443 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:37.669555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.669537 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:20:37.669681 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.669656 2575 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:20:37.765424 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.765408 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:37.770557 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.770538 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:20:37.790527 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.790502 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:20:37.794784 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.794758 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:15:36 +0000 UTC" deadline="2028-01-12 13:21:11.05201945 +0000 UTC" Apr 22 18:20:37.794784 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.794782 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15115h0m33.257240001s" Apr 22 18:20:37.810780 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.810762 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-p944k" Apr 22 18:20:37.816782 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.816743 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-p944k" Apr 22 18:20:37.866221 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.866203 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:37.887684 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:37.887641 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod075634087ab8a39514ea4cf278614518.slice/crio-73e472cc01166d326745b961dd5d224f404316a1830e8390befe770270597bf3 WatchSource:0}: Error finding container 73e472cc01166d326745b961dd5d224f404316a1830e8390befe770270597bf3: Status 404 returned error can't find the container with id 73e472cc01166d326745b961dd5d224f404316a1830e8390befe770270597bf3 Apr 22 18:20:37.888226 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:37.888203 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f6884c0e78773f3a19b1cc8342e7247.slice/crio-26dc5b93813d88156ba24e014d1db8de714a50d362542fe59f52cf1e673bbfbb WatchSource:0}: Error finding container 26dc5b93813d88156ba24e014d1db8de714a50d362542fe59f52cf1e673bbfbb: Status 404 returned error can't find the container with id 26dc5b93813d88156ba24e014d1db8de714a50d362542fe59f52cf1e673bbfbb Apr 22 18:20:37.892720 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.892704 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:20:37.907687 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.907642 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-74.ec2.internal" event={"ID":"075634087ab8a39514ea4cf278614518","Type":"ContainerStarted","Data":"73e472cc01166d326745b961dd5d224f404316a1830e8390befe770270597bf3"} Apr 22 18:20:37.909307 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.909288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" 
event={"ID":"9f6884c0e78773f3a19b1cc8342e7247","Type":"ContainerStarted","Data":"26dc5b93813d88156ba24e014d1db8de714a50d362542fe59f52cf1e673bbfbb"} Apr 22 18:20:37.966472 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:37.966449 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:37.982230 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:37.982209 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:38.067385 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:38.067330 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:38.168055 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:38.168035 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:38.168450 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.168434 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:38.269080 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:38.269058 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-74.ec2.internal\" not found" Apr 22 18:20:38.323605 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.323421 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:38.371744 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.371570 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" Apr 22 18:20:38.388082 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.387955 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in 
surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:20:38.389122 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.388966 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-74.ec2.internal" Apr 22 18:20:38.397226 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.397208 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:20:38.748342 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.748312 2575 apiserver.go:52] "Watching apiserver" Apr 22 18:20:38.754063 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.754039 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:20:38.754481 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.754452 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb","openshift-dns/node-resolver-xxh8b","openshift-image-registry/node-ca-4nkxw","openshift-multus/multus-slx9s","openshift-multus/network-metrics-daemon-dhwbm","openshift-network-diagnostics/network-check-target-tx2k7","kube-system/konnectivity-agent-fklgp","kube-system/kube-apiserver-proxy-ip-10-0-140-74.ec2.internal","openshift-cluster-node-tuning-operator/tuned-5r97b","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal","openshift-multus/multus-additional-cni-plugins-s9sjx","openshift-network-operator/iptables-alerter-tjcwf","openshift-ovn-kubernetes/ovnkube-node-s8sh5"] Apr 22 18:20:38.757953 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.757931 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.760484 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.760149 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fklgp" Apr 22 18:20:38.760484 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.760428 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:20:38.760643 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.760495 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:20:38.760643 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.760511 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:20:38.760643 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.760587 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h9dx4\"" Apr 22 18:20:38.762400 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.762380 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d6wz6\"" Apr 22 18:20:38.762824 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.762705 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:20:38.762824 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.762731 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:20:38.764636 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.764597 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.766727 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.766711 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4nkxw" Apr 22 18:20:38.766807 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.766711 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:38.766846 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.766828 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:20:38.766954 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:38.766822 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1" Apr 22 18:20:38.767823 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.767802 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:20:38.767889 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.767842 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-297ng\"" Apr 22 18:20:38.768064 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.768049 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:20:38.768102 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.768084 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:20:38.769009 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.768975 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:20:38.769126 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.769024 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:20:38.769222 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.769134 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:38.769222 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:38.769189 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:20:38.769369 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.769227 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-r6fxt\"" Apr 22 18:20:38.769369 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.769335 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:20:38.771338 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.771320 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xxh8b" Apr 22 18:20:38.774693 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.773888 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:20:38.774693 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.773964 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.774693 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.774205 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cd65k\"" Apr 22 18:20:38.775228 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.775202 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:20:38.776398 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.776375 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:20:38.777101 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.776844 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qk7fh\"" Apr 22 18:20:38.777101 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.776863 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:20:38.777331 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.777293 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s9sjx" Apr 22 18:20:38.779446 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.779424 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:20:38.779563 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.779500 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:20:38.779728 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.779708 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-48pbs\"" Apr 22 18:20:38.779974 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.779948 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tjcwf" Apr 22 18:20:38.782109 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-modprobe-d\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.782203 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782127 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-systemd\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.782203 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782160 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-device-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.782203 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-cnibin\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.782385 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782201 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:20:38.782385 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-conf-dir\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.782385 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrbwg\" (UniqueName: \"kubernetes.io/projected/1f085cfa-07bb-457b-85ce-79f190f3ecb1-kube-api-access-jrbwg\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:38.782385 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782344 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-sysconfig\") pod 
\"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.782385 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782368 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-sys\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.782629 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-socket-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.782629 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-os-release\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.782629 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782476 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:20:38.782629 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782530 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:20:38.782629 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782477 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-run-netns\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.782629 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-var-lib-kubelet\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.782629 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-etc-kubernetes\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782671 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl6fx\" (UniqueName: \"kubernetes.io/projected/fb638d9e-ea2e-4a2e-979e-308022903fd1-kube-api-access-fl6fx\") pod \"node-resolver-xxh8b\" (UID: \"fb638d9e-ea2e-4a2e-979e-308022903fd1\") " pod="openshift-dns/node-resolver-xxh8b" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782696 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-system-cni-dir\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782713 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vmtz5\"" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782724 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-run-k8s-cni-cncf-io\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-var-lib-cni-bin\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-daemon-config\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-run-multus-certs\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782872 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/073609fd-8186-41f7-860d-4fd136656e3f-host\") pod \"node-ca-4nkxw\" (UID: \"073609fd-8186-41f7-860d-4fd136656e3f\") " pod="openshift-image-registry/node-ca-4nkxw" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tndgf\" (UniqueName: \"kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf\") pod \"network-check-target-tx2k7\" (UID: \"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd\") " pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782950 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-kubernetes\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-sys-fs\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.782998 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7zcv\" (UniqueName: \"kubernetes.io/projected/8d08824f-8af6-46ef-b54e-409f85817ae0-kube-api-access-q7zcv\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-socket-dir-parent\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.783059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783062 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lsrn\" (UniqueName: \"kubernetes.io/projected/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-kube-api-access-8lsrn\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783099 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-sysctl-d\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783121 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-tuned\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a492888-e464-4422-ac7a-f260e0cd42aa-tmp\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783165 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-cni-binary-copy\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-hostroot\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/073609fd-8186-41f7-860d-4fd136656e3f-serviceca\") pod \"node-ca-4nkxw\" (UID: \"073609fd-8186-41f7-860d-4fd136656e3f\") " pod="openshift-image-registry/node-ca-4nkxw" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783474 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d36f1e9-9321-4fbf-935b-023f00adbb68-konnectivity-ca\") pod \"konnectivity-agent-fklgp\" (UID: \"7d36f1e9-9321-4fbf-935b-023f00adbb68\") " pod="kube-system/konnectivity-agent-fklgp" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783516 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-run\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-var-lib-kubelet\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-host\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783587 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6xvr\" (UniqueName: \"kubernetes.io/projected/1a492888-e464-4422-ac7a-f260e0cd42aa-kube-api-access-v6xvr\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.783718 
ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-registration-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783636 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-cni-dir\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783670 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzcqb\" (UniqueName: \"kubernetes.io/projected/073609fd-8186-41f7-860d-4fd136656e3f-kube-api-access-rzcqb\") pod \"node-ca-4nkxw\" (UID: \"073609fd-8186-41f7-860d-4fd136656e3f\") " pod="openshift-image-registry/node-ca-4nkxw" Apr 22 18:20:38.783718 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783694 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-sysctl-conf\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.784366 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783736 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-lib-modules\") pod \"tuned-5r97b\" (UID: 
\"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.784366 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-etc-selinux\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.784366 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-var-lib-cni-multus\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.784366 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d36f1e9-9321-4fbf-935b-023f00adbb68-agent-certs\") pod \"konnectivity-agent-fklgp\" (UID: \"7d36f1e9-9321-4fbf-935b-023f00adbb68\") " pod="kube-system/konnectivity-agent-fklgp" Apr 22 18:20:38.784366 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.784366 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fb638d9e-ea2e-4a2e-979e-308022903fd1-hosts-file\") pod \"node-resolver-xxh8b\" (UID: \"fb638d9e-ea2e-4a2e-979e-308022903fd1\") " pod="openshift-dns/node-resolver-xxh8b" Apr 22 18:20:38.784366 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.783909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb638d9e-ea2e-4a2e-979e-308022903fd1-tmp-dir\") pod \"node-resolver-xxh8b\" (UID: \"fb638d9e-ea2e-4a2e-979e-308022903fd1\") " pod="openshift-dns/node-resolver-xxh8b" Apr 22 18:20:38.785556 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.785363 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wktdg\"" Apr 22 18:20:38.785556 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.785372 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:20:38.785556 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.785376 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:20:38.785556 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.785446 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:20:38.785556 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.785519 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:20:38.785556 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.785534 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:20:38.785996 ip-10-0-140-74 kubenswrapper[2575]: I0422 
18:20:38.785679 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:20:38.818024 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.817997 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:15:37 +0000 UTC" deadline="2027-10-26 07:33:06.730253478 +0000 UTC" Apr 22 18:20:38.818024 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.818023 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13237h12m27.912233368s" Apr 22 18:20:38.874365 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.874342 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:20:38.884062 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884042 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-slash\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.884168 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884074 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-run-netns\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.884168 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884098 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-var-lib-openvswitch\") pod 
\"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.884168 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-log-socket\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.884168 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-cni-bin\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.884396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-sysctl-conf\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.884396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-run-ovn-kubernetes\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.884396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/874d96b9-6b68-4589-95c9-72b4b398e980-ovnkube-config\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.884396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-etc-selinux\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.884396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d36f1e9-9321-4fbf-935b-023f00adbb68-agent-certs\") pod \"konnectivity-agent-fklgp\" (UID: \"7d36f1e9-9321-4fbf-935b-023f00adbb68\") " pod="kube-system/konnectivity-agent-fklgp" Apr 22 18:20:38.884396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f219549d-a221-4934-9036-877b25fa0d00-host-slash\") pod \"iptables-alerter-tjcwf\" (UID: \"f219549d-a221-4934-9036-877b25fa0d00\") " pod="openshift-network-operator/iptables-alerter-tjcwf" Apr 22 18:20:38.884396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-etc-selinux\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.884396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884382 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-run-openvswitch\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.884396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-sysctl-conf\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fb638d9e-ea2e-4a2e-979e-308022903fd1-hosts-file\") pod \"node-resolver-xxh8b\" (UID: \"fb638d9e-ea2e-4a2e-979e-308022903fd1\") " pod="openshift-dns/node-resolver-xxh8b" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb638d9e-ea2e-4a2e-979e-308022903fd1-tmp-dir\") pod \"node-resolver-xxh8b\" (UID: \"fb638d9e-ea2e-4a2e-979e-308022903fd1\") " pod="openshift-dns/node-resolver-xxh8b" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884481 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884489 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-systemd\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884528 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9f82cba-e68b-4160-a7cd-232b0875487d-cni-binary-copy\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-systemd\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fb638d9e-ea2e-4a2e-979e-308022903fd1-hosts-file\") pod \"node-resolver-xxh8b\" (UID: \"fb638d9e-ea2e-4a2e-979e-308022903fd1\") " pod="openshift-dns/node-resolver-xxh8b" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: 
I0422 18:20:38.884555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/874d96b9-6b68-4589-95c9-72b4b398e980-env-overrides\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884624 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/874d96b9-6b68-4589-95c9-72b4b398e980-ovnkube-script-lib\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-cnibin\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-conf-dir\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 
18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884665 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-sys\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-conf-dir\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884737 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-cnibin\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.884804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb638d9e-ea2e-4a2e-979e-308022903fd1-tmp-dir\") pod \"node-resolver-xxh8b\" (UID: \"fb638d9e-ea2e-4a2e-979e-308022903fd1\") " pod="openshift-dns/node-resolver-xxh8b" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-sys\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a492888-e464-4422-ac7a-f260e0cd42aa-tmp\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/874d96b9-6b68-4589-95c9-72b4b398e980-ovn-node-metrics-cert\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-etc-kubernetes\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884902 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-run\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-var-lib-kubelet\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-etc-kubernetes\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a9f82cba-e68b-4160-a7cd-232b0875487d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884973 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8g85\" (UniqueName: \"kubernetes.io/projected/874d96b9-6b68-4589-95c9-72b4b398e980-kube-api-access-d8g85\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.885638 ip-10-0-140-74 
kubenswrapper[2575]: I0422 18:20:38.884986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-run\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.884998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-system-cni-dir\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885006 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-var-lib-kubelet\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-var-lib-cni-bin\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-system-cni-dir\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885056 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-daemon-config\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.885638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-run-multus-certs\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885057 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-var-lib-cni-bin\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885138 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj5cv\" (UniqueName: \"kubernetes.io/projected/f219549d-a221-4934-9036-877b25fa0d00-kube-api-access-mj5cv\") pod \"iptables-alerter-tjcwf\" (UID: \"f219549d-a221-4934-9036-877b25fa0d00\") " pod="openshift-network-operator/iptables-alerter-tjcwf" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-run-ovn\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885214 
2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-run-multus-certs\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-cni-netd\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-sys-fs\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885319 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7zcv\" (UniqueName: \"kubernetes.io/projected/8d08824f-8af6-46ef-b54e-409f85817ae0-kube-api-access-q7zcv\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-cni-binary-copy\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.886441 
ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885366 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-sys-fs\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-hostroot\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/073609fd-8186-41f7-860d-4fd136656e3f-serviceca\") pod \"node-ca-4nkxw\" (UID: \"073609fd-8186-41f7-860d-4fd136656e3f\") " pod="openshift-image-registry/node-ca-4nkxw" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f219549d-a221-4934-9036-877b25fa0d00-iptables-alerter-script\") pod \"iptables-alerter-tjcwf\" (UID: \"f219549d-a221-4934-9036-877b25fa0d00\") " pod="openshift-network-operator/iptables-alerter-tjcwf" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-system-cni-dir\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " 
pod="openshift-multus/multus-additional-cni-plugins-s9sjx" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-os-release\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-registration-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-cni-dir\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.886441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-daemon-config\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzcqb\" (UniqueName: \"kubernetes.io/projected/073609fd-8186-41f7-860d-4fd136656e3f-kube-api-access-rzcqb\") pod 
\"node-ca-4nkxw\" (UID: \"073609fd-8186-41f7-860d-4fd136656e3f\") " pod="openshift-image-registry/node-ca-4nkxw" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-lib-modules\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-etc-openvswitch\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885756 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-var-lib-cni-multus\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-cni-dir\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885802 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-node-log\") pod 
\"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-hostroot\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-modprobe-d\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-device-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrbwg\" (UniqueName: \"kubernetes.io/projected/1f085cfa-07bb-457b-85ce-79f190f3ecb1-kube-api-access-jrbwg\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-sysconfig\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/073609fd-8186-41f7-860d-4fd136656e3f-serviceca\") pod \"node-ca-4nkxw\" (UID: \"073609fd-8186-41f7-860d-4fd136656e3f\") " pod="openshift-image-registry/node-ca-4nkxw" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885983 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-run-systemd\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.885970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-modprobe-d\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886017 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-device-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886019 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-cni-binary-copy\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886061 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-socket-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-registration-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-sysconfig\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-os-release\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-var-lib-cni-multus\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886206 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-run-netns\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886239 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-var-lib-kubelet\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886268 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-os-release\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886285 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8d08824f-8af6-46ef-b54e-409f85817ae0-socket-dir\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886301 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-var-lib-kubelet\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-run-netns\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886337 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl6fx\" (UniqueName: \"kubernetes.io/projected/fb638d9e-ea2e-4a2e-979e-308022903fd1-kube-api-access-fl6fx\") pod \"node-resolver-xxh8b\" (UID: \"fb638d9e-ea2e-4a2e-979e-308022903fd1\") " pod="openshift-dns/node-resolver-xxh8b" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-lib-modules\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-host\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:38.886403 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-run-k8s-cni-cncf-io\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-host\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.887904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886432 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/073609fd-8186-41f7-860d-4fd136656e3f-host\") pod \"node-ca-4nkxw\" (UID: \"073609fd-8186-41f7-860d-4fd136656e3f\") " pod="openshift-image-registry/node-ca-4nkxw" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:38.886466 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs podName:1f085cfa-07bb-457b-85ce-79f190f3ecb1 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:39.386438738 +0000 UTC m=+3.095784370 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs") pod "network-metrics-daemon-dhwbm" (UID: "1f085cfa-07bb-457b-85ce-79f190f3ecb1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886464 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-host-run-k8s-cni-cncf-io\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/073609fd-8186-41f7-860d-4fd136656e3f-host\") pod \"node-ca-4nkxw\" (UID: \"073609fd-8186-41f7-860d-4fd136656e3f\") " pod="openshift-image-registry/node-ca-4nkxw" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tndgf\" (UniqueName: \"kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf\") pod \"network-check-target-tx2k7\" (UID: \"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd\") " pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-kubernetes\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886552 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-sysctl-d\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6xvr\" (UniqueName: \"kubernetes.io/projected/1a492888-e464-4422-ac7a-f260e0cd42aa-kube-api-access-v6xvr\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-kubernetes\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a9f82cba-e68b-4160-a7cd-232b0875487d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886643 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-kubelet\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" 
Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-socket-dir-parent\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886684 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-sysctl-d\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lsrn\" (UniqueName: \"kubernetes.io/projected/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-kube-api-access-8lsrn\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-tuned\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.888533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886727 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-multus-socket-dir-parent\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.888533 
ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886742 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-cnibin\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx" Apr 22 18:20:38.889080 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-systemd-units\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.889080 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d36f1e9-9321-4fbf-935b-023f00adbb68-konnectivity-ca\") pod \"konnectivity-agent-fklgp\" (UID: \"7d36f1e9-9321-4fbf-935b-023f00adbb68\") " pod="kube-system/konnectivity-agent-fklgp" Apr 22 18:20:38.889080 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.886840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgx6w\" (UniqueName: \"kubernetes.io/projected/a9f82cba-e68b-4160-a7cd-232b0875487d-kube-api-access-kgx6w\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx" Apr 22 18:20:38.889080 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.887287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d36f1e9-9321-4fbf-935b-023f00adbb68-konnectivity-ca\") pod \"konnectivity-agent-fklgp\" (UID: 
\"7d36f1e9-9321-4fbf-935b-023f00adbb68\") " pod="kube-system/konnectivity-agent-fklgp" Apr 22 18:20:38.889080 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.887500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a492888-e464-4422-ac7a-f260e0cd42aa-tmp\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.889080 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.887583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d36f1e9-9321-4fbf-935b-023f00adbb68-agent-certs\") pod \"konnectivity-agent-fklgp\" (UID: \"7d36f1e9-9321-4fbf-935b-023f00adbb68\") " pod="kube-system/konnectivity-agent-fklgp" Apr 22 18:20:38.889310 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.889114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1a492888-e464-4422-ac7a-f260e0cd42aa-etc-tuned\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.893638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.893616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7zcv\" (UniqueName: \"kubernetes.io/projected/8d08824f-8af6-46ef-b54e-409f85817ae0-kube-api-access-q7zcv\") pod \"aws-ebs-csi-driver-node-bzgjb\" (UID: \"8d08824f-8af6-46ef-b54e-409f85817ae0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" Apr 22 18:20:38.898606 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:38.898405 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:38.898606 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:38.898456 2575 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:38.898606 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:38.898468 2575 projected.go:194] Error preparing data for projected volume kube-api-access-tndgf for pod openshift-network-diagnostics/network-check-target-tx2k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:38.898606 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:38.898566 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf podName:01e67bdc-8f43-4b2e-8cef-8d84eb59aabd nodeName:}" failed. No retries permitted until 2026-04-22 18:20:39.39855042 +0000 UTC m=+3.107896063 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tndgf" (UniqueName: "kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf") pod "network-check-target-tx2k7" (UID: "01e67bdc-8f43-4b2e-8cef-8d84eb59aabd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:38.900459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.900434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl6fx\" (UniqueName: \"kubernetes.io/projected/fb638d9e-ea2e-4a2e-979e-308022903fd1-kube-api-access-fl6fx\") pod \"node-resolver-xxh8b\" (UID: \"fb638d9e-ea2e-4a2e-979e-308022903fd1\") " pod="openshift-dns/node-resolver-xxh8b" Apr 22 18:20:38.900742 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.900718 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzcqb\" (UniqueName: 
\"kubernetes.io/projected/073609fd-8186-41f7-860d-4fd136656e3f-kube-api-access-rzcqb\") pod \"node-ca-4nkxw\" (UID: \"073609fd-8186-41f7-860d-4fd136656e3f\") " pod="openshift-image-registry/node-ca-4nkxw" Apr 22 18:20:38.900952 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.900927 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lsrn\" (UniqueName: \"kubernetes.io/projected/2a55ce37-0ff0-47e0-92a1-9f75d384f77e-kube-api-access-8lsrn\") pod \"multus-slx9s\" (UID: \"2a55ce37-0ff0-47e0-92a1-9f75d384f77e\") " pod="openshift-multus/multus-slx9s" Apr 22 18:20:38.901334 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.901309 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6xvr\" (UniqueName: \"kubernetes.io/projected/1a492888-e464-4422-ac7a-f260e0cd42aa-kube-api-access-v6xvr\") pod \"tuned-5r97b\" (UID: \"1a492888-e464-4422-ac7a-f260e0cd42aa\") " pod="openshift-cluster-node-tuning-operator/tuned-5r97b" Apr 22 18:20:38.901887 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.901868 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrbwg\" (UniqueName: \"kubernetes.io/projected/1f085cfa-07bb-457b-85ce-79f190f3ecb1-kube-api-access-jrbwg\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:38.987362 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgx6w\" (UniqueName: \"kubernetes.io/projected/a9f82cba-e68b-4160-a7cd-232b0875487d-kube-api-access-kgx6w\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx" Apr 22 18:20:38.987362 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987377 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-slash\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.987548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-run-netns\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.987548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987420 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-var-lib-openvswitch\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.987548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-log-socket\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.987548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987470 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-cni-bin\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:20:38.987548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987475 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-slash\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.987548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987496 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-run-ovn-kubernetes\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.987548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-run-netns\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.987548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-var-lib-openvswitch\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.987548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987521 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-cni-bin\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.987548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/874d96b9-6b68-4589-95c9-72b4b398e980-ovnkube-config\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.987548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987548 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-run-ovn-kubernetes\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987516 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-log-socket\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f219549d-a221-4934-9036-877b25fa0d00-host-slash\") pod \"iptables-alerter-tjcwf\" (UID: \"f219549d-a221-4934-9036-877b25fa0d00\") " pod="openshift-network-operator/iptables-alerter-tjcwf"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-run-openvswitch\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f219549d-a221-4934-9036-877b25fa0d00-host-slash\") pod \"iptables-alerter-tjcwf\" (UID: \"f219549d-a221-4934-9036-877b25fa0d00\") " pod="openshift-network-operator/iptables-alerter-tjcwf"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9f82cba-e68b-4160-a7cd-232b0875487d-cni-binary-copy\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-run-openvswitch\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/874d96b9-6b68-4589-95c9-72b4b398e980-env-overrides\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/874d96b9-6b68-4589-95c9-72b4b398e980-ovnkube-script-lib\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/874d96b9-6b68-4589-95c9-72b4b398e980-ovn-node-metrics-cert\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a9f82cba-e68b-4160-a7cd-232b0875487d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987803 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8g85\" (UniqueName: \"kubernetes.io/projected/874d96b9-6b68-4589-95c9-72b4b398e980-kube-api-access-d8g85\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj5cv\" (UniqueName: \"kubernetes.io/projected/f219549d-a221-4934-9036-877b25fa0d00-kube-api-access-mj5cv\") pod \"iptables-alerter-tjcwf\" (UID: \"f219549d-a221-4934-9036-877b25fa0d00\") " pod="openshift-network-operator/iptables-alerter-tjcwf"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-run-ovn\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-cni-netd\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f219549d-a221-4934-9036-877b25fa0d00-iptables-alerter-script\") pod \"iptables-alerter-tjcwf\" (UID: \"f219549d-a221-4934-9036-877b25fa0d00\") " pod="openshift-network-operator/iptables-alerter-tjcwf"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-system-cni-dir\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.987984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-os-release\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988011 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-etc-openvswitch\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-node-log\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988088 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a9f82cba-e68b-4160-a7cd-232b0875487d-cni-binary-copy\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988100 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/874d96b9-6b68-4589-95c9-72b4b398e980-ovnkube-config\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/874d96b9-6b68-4589-95c9-72b4b398e980-env-overrides\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-run-systemd\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-run-systemd\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-run-ovn\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-system-cni-dir\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-etc-openvswitch\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-node-log\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a9f82cba-e68b-4160-a7cd-232b0875487d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.988799 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988292 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-kubelet\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.989459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/874d96b9-6b68-4589-95c9-72b4b398e980-ovnkube-script-lib\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.989459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-os-release\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.989459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988335 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-cnibin\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.989459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-systemd-units\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.989459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-kubelet\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.989459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-host-cni-netd\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.989459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/874d96b9-6b68-4589-95c9-72b4b398e980-systemd-units\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.989459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988422 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a9f82cba-e68b-4160-a7cd-232b0875487d-cnibin\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.989459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a9f82cba-e68b-4160-a7cd-232b0875487d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.989459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.988837 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f219549d-a221-4934-9036-877b25fa0d00-iptables-alerter-script\") pod \"iptables-alerter-tjcwf\" (UID: \"f219549d-a221-4934-9036-877b25fa0d00\") " pod="openshift-network-operator/iptables-alerter-tjcwf"
Apr 22 18:20:38.989459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.989444 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a9f82cba-e68b-4160-a7cd-232b0875487d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.990691 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.990666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/874d96b9-6b68-4589-95c9-72b4b398e980-ovn-node-metrics-cert\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:38.996735 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.996711 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgx6w\" (UniqueName: \"kubernetes.io/projected/a9f82cba-e68b-4160-a7cd-232b0875487d-kube-api-access-kgx6w\") pod \"multus-additional-cni-plugins-s9sjx\" (UID: \"a9f82cba-e68b-4160-a7cd-232b0875487d\") " pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:38.997492 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.997468 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj5cv\" (UniqueName: \"kubernetes.io/projected/f219549d-a221-4934-9036-877b25fa0d00-kube-api-access-mj5cv\") pod \"iptables-alerter-tjcwf\" (UID: \"f219549d-a221-4934-9036-877b25fa0d00\") " pod="openshift-network-operator/iptables-alerter-tjcwf"
Apr 22 18:20:38.998426 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:38.998380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8g85\" (UniqueName: \"kubernetes.io/projected/874d96b9-6b68-4589-95c9-72b4b398e980-kube-api-access-d8g85\") pod \"ovnkube-node-s8sh5\" (UID: \"874d96b9-6b68-4589-95c9-72b4b398e980\") " pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:39.071351 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.071325 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb"
Apr 22 18:20:39.079180 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.079159 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fklgp"
Apr 22 18:20:39.088591 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.088575 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-slx9s"
Apr 22 18:20:39.093056 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.093039 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4nkxw"
Apr 22 18:20:39.099688 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.099672 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xxh8b"
Apr 22 18:20:39.101404 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.101391 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:20:39.105829 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.105813 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5r97b"
Apr 22 18:20:39.112332 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.112310 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s9sjx"
Apr 22 18:20:39.119804 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.119781 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tjcwf"
Apr 22 18:20:39.125372 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.125354 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:20:39.391672 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.391311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm"
Apr 22 18:20:39.391672 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:39.391459 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:39.391672 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:39.391541 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs podName:1f085cfa-07bb-457b-85ce-79f190f3ecb1 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:40.391520178 +0000 UTC m=+4.100865808 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs") pod "network-metrics-daemon-dhwbm" (UID: "1f085cfa-07bb-457b-85ce-79f190f3ecb1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:39.492291 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.492256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tndgf\" (UniqueName: \"kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf\") pod \"network-check-target-tx2k7\" (UID: \"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd\") " pod="openshift-network-diagnostics/network-check-target-tx2k7"
Apr 22 18:20:39.492456 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:39.492379 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:20:39.492456 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:39.492399 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:20:39.492456 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:39.492412 2575 projected.go:194] Error preparing data for projected volume kube-api-access-tndgf for pod openshift-network-diagnostics/network-check-target-tx2k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:39.492624 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:39.492468 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf podName:01e67bdc-8f43-4b2e-8cef-8d84eb59aabd nodeName:}" failed. No retries permitted until 2026-04-22 18:20:40.492448935 +0000 UTC m=+4.201794571 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tndgf" (UniqueName: "kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf") pod "network-check-target-tx2k7" (UID: "01e67bdc-8f43-4b2e-8cef-8d84eb59aabd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:39.601533 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:39.601374 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb638d9e_ea2e_4a2e_979e_308022903fd1.slice/crio-cacac949085b6c2ad48b36e8c1501c23e8abe8eb37b61686455ed4cb1584619f WatchSource:0}: Error finding container cacac949085b6c2ad48b36e8c1501c23e8abe8eb37b61686455ed4cb1584619f: Status 404 returned error can't find the container with id cacac949085b6c2ad48b36e8c1501c23e8abe8eb37b61686455ed4cb1584619f
Apr 22 18:20:39.603385 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:39.603358 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a492888_e464_4422_ac7a_f260e0cd42aa.slice/crio-8589f65e535031b5f625a782703736cb12af24807020779f8318b610411e6277 WatchSource:0}: Error finding container 8589f65e535031b5f625a782703736cb12af24807020779f8318b610411e6277: Status 404 returned error can't find the container with id 8589f65e535031b5f625a782703736cb12af24807020779f8318b610411e6277
Apr 22 18:20:39.606515 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:39.606495 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9f82cba_e68b_4160_a7cd_232b0875487d.slice/crio-f1ca6e8df5a966e0fd140347092d2bd3e1f7a56768164b80acf2a101c242d518 WatchSource:0}: Error finding container f1ca6e8df5a966e0fd140347092d2bd3e1f7a56768164b80acf2a101c242d518: Status 404 returned error can't find the container with id f1ca6e8df5a966e0fd140347092d2bd3e1f7a56768164b80acf2a101c242d518
Apr 22 18:20:39.607481 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:39.607461 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a55ce37_0ff0_47e0_92a1_9f75d384f77e.slice/crio-fee4bc92bdc1c3b95dec8c42c30917d9df1e81c509d9d2079d0cff0bd13b5165 WatchSource:0}: Error finding container fee4bc92bdc1c3b95dec8c42c30917d9df1e81c509d9d2079d0cff0bd13b5165: Status 404 returned error can't find the container with id fee4bc92bdc1c3b95dec8c42c30917d9df1e81c509d9d2079d0cff0bd13b5165
Apr 22 18:20:39.608169 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:39.608061 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf219549d_a221_4934_9036_877b25fa0d00.slice/crio-5393b2490813af58c8060abdf8ac323b9726d7b74dc56d5a878d661880a811d4 WatchSource:0}: Error finding container 5393b2490813af58c8060abdf8ac323b9726d7b74dc56d5a878d661880a811d4: Status 404 returned error can't find the container with id 5393b2490813af58c8060abdf8ac323b9726d7b74dc56d5a878d661880a811d4
Apr 22 18:20:39.609043 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:39.609007 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d36f1e9_9321_4fbf_935b_023f00adbb68.slice/crio-6646fafa7ac744b07a0d8af69c6be4935bdf4645d2fa3bf05baa75ff71f4db17 WatchSource:0}: Error finding container 6646fafa7ac744b07a0d8af69c6be4935bdf4645d2fa3bf05baa75ff71f4db17: Status 404 returned error can't find the container with id 6646fafa7ac744b07a0d8af69c6be4935bdf4645d2fa3bf05baa75ff71f4db17
Apr 22 18:20:39.609758 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:20:39.609715 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d08824f_8af6_46ef_b54e_409f85817ae0.slice/crio-8fffa354db04fdf4d4bed53a2d5d76119a9ecd8221cc5aa611eb8cb90b8bd701 WatchSource:0}: Error finding container 8fffa354db04fdf4d4bed53a2d5d76119a9ecd8221cc5aa611eb8cb90b8bd701: Status 404 returned error can't find the container with id 8fffa354db04fdf4d4bed53a2d5d76119a9ecd8221cc5aa611eb8cb90b8bd701
Apr 22 18:20:39.818429 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.818387 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:15:37 +0000 UTC" deadline="2027-11-30 01:42:54.503075148 +0000 UTC"
Apr 22 18:20:39.818429 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.818418 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14071h22m14.684660819s"
Apr 22 18:20:39.905997 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.905968 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm"
Apr 22 18:20:39.906125 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:39.906097 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1"
Apr 22 18:20:39.913049 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.912975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xxh8b" event={"ID":"fb638d9e-ea2e-4a2e-979e-308022903fd1","Type":"ContainerStarted","Data":"cacac949085b6c2ad48b36e8c1501c23e8abe8eb37b61686455ed4cb1584619f"}
Apr 22 18:20:39.914088 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.914064 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" event={"ID":"874d96b9-6b68-4589-95c9-72b4b398e980","Type":"ContainerStarted","Data":"958b05e06f76b8724b22d3ca270b84603e27ce9329e9371cc47494141d1d7b27"}
Apr 22 18:20:39.915015 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.914990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4nkxw" event={"ID":"073609fd-8186-41f7-860d-4fd136656e3f","Type":"ContainerStarted","Data":"2c7ebbbacf3af76f62f6cb9c9e0605d0c2a4f8288a96c1575c12935d0329be13"}
Apr 22 18:20:39.915911 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.915884 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" event={"ID":"8d08824f-8af6-46ef-b54e-409f85817ae0","Type":"ContainerStarted","Data":"8fffa354db04fdf4d4bed53a2d5d76119a9ecd8221cc5aa611eb8cb90b8bd701"}
Apr 22 18:20:39.916854 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.916828 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tjcwf" event={"ID":"f219549d-a221-4934-9036-877b25fa0d00","Type":"ContainerStarted","Data":"5393b2490813af58c8060abdf8ac323b9726d7b74dc56d5a878d661880a811d4"}
Apr 22 18:20:39.918215 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.918187 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-74.ec2.internal" event={"ID":"075634087ab8a39514ea4cf278614518","Type":"ContainerStarted","Data":"ef3d79a6f5eff67370dc51dba7659cecd55a4c79e64d8ea2c9a9a4cfdf14f880"}
Apr 22 18:20:39.919055 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.919037 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fklgp" event={"ID":"7d36f1e9-9321-4fbf-935b-023f00adbb68","Type":"ContainerStarted","Data":"6646fafa7ac744b07a0d8af69c6be4935bdf4645d2fa3bf05baa75ff71f4db17"}
Apr 22 18:20:39.920778 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.920759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-slx9s" event={"ID":"2a55ce37-0ff0-47e0-92a1-9f75d384f77e","Type":"ContainerStarted","Data":"fee4bc92bdc1c3b95dec8c42c30917d9df1e81c509d9d2079d0cff0bd13b5165"}
Apr 22 18:20:39.921959 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.921932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9sjx" event={"ID":"a9f82cba-e68b-4160-a7cd-232b0875487d","Type":"ContainerStarted","Data":"f1ca6e8df5a966e0fd140347092d2bd3e1f7a56768164b80acf2a101c242d518"}
Apr 22 18:20:39.923380 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:39.923358 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5r97b" event={"ID":"1a492888-e464-4422-ac7a-f260e0cd42aa","Type":"ContainerStarted","Data":"8589f65e535031b5f625a782703736cb12af24807020779f8318b610411e6277"}
Apr 22 18:20:40.399539 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:40.399460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm"
Apr 22 18:20:40.399647 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:40.399629 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:40.399702 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:40.399696 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs podName:1f085cfa-07bb-457b-85ce-79f190f3ecb1 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:42.399678225 +0000 UTC m=+6.109023868 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs") pod "network-metrics-daemon-dhwbm" (UID: "1f085cfa-07bb-457b-85ce-79f190f3ecb1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:40.500837 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:40.500565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tndgf\" (UniqueName: \"kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf\") pod \"network-check-target-tx2k7\" (UID: \"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd\") " pod="openshift-network-diagnostics/network-check-target-tx2k7"
Apr 22 18:20:40.500837 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:40.500714 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:20:40.500837 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:40.500731 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:20:40.500837 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:40.500745 2575 projected.go:194] Error preparing data for projected volume kube-api-access-tndgf for pod openshift-network-diagnostics/network-check-target-tx2k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:40.500837 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:40.500802 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf podName:01e67bdc-8f43-4b2e-8cef-8d84eb59aabd nodeName:}" failed. No retries permitted until 2026-04-22 18:20:42.500783806 +0000 UTC m=+6.210129441 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tndgf" (UniqueName: "kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf") pod "network-check-target-tx2k7" (UID: "01e67bdc-8f43-4b2e-8cef-8d84eb59aabd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:40.905977 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:40.905477 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7"
Apr 22 18:20:40.905977 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:40.905610 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:20:40.964088 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:40.964038 2575 generic.go:358] "Generic (PLEG): container finished" podID="9f6884c0e78773f3a19b1cc8342e7247" containerID="8ae138033400b3cafcb318e3cf3e42828fe3ea6a18b9633323927bea5b09e65d" exitCode=0 Apr 22 18:20:40.964800 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:40.964572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" event={"ID":"9f6884c0e78773f3a19b1cc8342e7247","Type":"ContainerDied","Data":"8ae138033400b3cafcb318e3cf3e42828fe3ea6a18b9633323927bea5b09e65d"} Apr 22 18:20:40.986104 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:40.984983 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-74.ec2.internal" podStartSLOduration=2.984964389 podStartE2EDuration="2.984964389s" podCreationTimestamp="2026-04-22 18:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:20:39.940306561 +0000 UTC m=+3.649652196" watchObservedRunningTime="2026-04-22 18:20:40.984964389 +0000 UTC m=+4.694310044" Apr 22 18:20:41.906069 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:41.906038 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:41.906581 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:41.906168 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1" Apr 22 18:20:41.975270 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:41.974995 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" event={"ID":"9f6884c0e78773f3a19b1cc8342e7247","Type":"ContainerStarted","Data":"9bf96c4055cdf34d2bba6a3aaed1b2c59cdda4de8aa260d37f1b6a79014bd852"} Apr 22 18:20:42.417811 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:42.417075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:42.417811 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:42.417320 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:42.417811 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:42.417380 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs podName:1f085cfa-07bb-457b-85ce-79f190f3ecb1 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:46.417360549 +0000 UTC m=+10.126706186 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs") pod "network-metrics-daemon-dhwbm" (UID: "1f085cfa-07bb-457b-85ce-79f190f3ecb1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:42.518255 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:42.518203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tndgf\" (UniqueName: \"kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf\") pod \"network-check-target-tx2k7\" (UID: \"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd\") " pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:42.518519 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:42.518384 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:42.518519 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:42.518408 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:42.518519 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:42.518420 2575 projected.go:194] Error preparing data for projected volume kube-api-access-tndgf for pod openshift-network-diagnostics/network-check-target-tx2k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:42.518519 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:42.518488 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf podName:01e67bdc-8f43-4b2e-8cef-8d84eb59aabd nodeName:}" failed. 
No retries permitted until 2026-04-22 18:20:46.518457813 +0000 UTC m=+10.227803448 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-tndgf" (UniqueName: "kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf") pod "network-check-target-tx2k7" (UID: "01e67bdc-8f43-4b2e-8cef-8d84eb59aabd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:42.801391 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:42.801334 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-74.ec2.internal" podStartSLOduration=4.801313518 podStartE2EDuration="4.801313518s" podCreationTimestamp="2026-04-22 18:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:20:41.990182864 +0000 UTC m=+5.699528517" watchObservedRunningTime="2026-04-22 18:20:42.801313518 +0000 UTC m=+6.510659172" Apr 22 18:20:42.801809 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:42.801788 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xzh9f"] Apr 22 18:20:42.806541 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:42.806516 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:42.806643 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:42.806596 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c" Apr 22 18:20:42.907379 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:42.907339 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:42.907817 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:42.907474 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:20:42.921575 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:42.921540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/39554817-37b5-4aee-afc9-ec4c204d3d1c-kubelet-config\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:42.921725 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:42.921624 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/39554817-37b5-4aee-afc9-ec4c204d3d1c-dbus\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:42.921725 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:42.921664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret\") pod \"global-pull-secret-syncer-xzh9f\" 
(UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:43.023176 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:43.022425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:43.023176 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:43.022484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/39554817-37b5-4aee-afc9-ec4c204d3d1c-kubelet-config\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:43.023176 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:43.022549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/39554817-37b5-4aee-afc9-ec4c204d3d1c-dbus\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:43.023176 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:43.022725 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/39554817-37b5-4aee-afc9-ec4c204d3d1c-dbus\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:43.023176 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:43.022828 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:43.023176 ip-10-0-140-74 kubenswrapper[2575]: E0422 
18:20:43.022882 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret podName:39554817-37b5-4aee-afc9-ec4c204d3d1c nodeName:}" failed. No retries permitted until 2026-04-22 18:20:43.522864477 +0000 UTC m=+7.232210111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret") pod "global-pull-secret-syncer-xzh9f" (UID: "39554817-37b5-4aee-afc9-ec4c204d3d1c") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:43.023176 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:43.023126 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/39554817-37b5-4aee-afc9-ec4c204d3d1c-kubelet-config\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:43.525648 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:43.525610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:43.525840 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:43.525762 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:43.525840 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:43.525823 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret podName:39554817-37b5-4aee-afc9-ec4c204d3d1c nodeName:}" failed. 
No retries permitted until 2026-04-22 18:20:44.525805315 +0000 UTC m=+8.235150947 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret") pod "global-pull-secret-syncer-xzh9f" (UID: "39554817-37b5-4aee-afc9-ec4c204d3d1c") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:43.906538 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:43.906028 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:43.906538 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:43.906183 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1" Apr 22 18:20:44.534113 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:44.534074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:44.534574 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:44.534275 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:44.534574 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:44.534363 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret podName:39554817-37b5-4aee-afc9-ec4c204d3d1c nodeName:}" failed. No retries permitted until 2026-04-22 18:20:46.534343689 +0000 UTC m=+10.243689333 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret") pod "global-pull-secret-syncer-xzh9f" (UID: "39554817-37b5-4aee-afc9-ec4c204d3d1c") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:44.906411 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:44.906329 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:44.906553 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:44.906468 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c" Apr 22 18:20:44.906553 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:44.906500 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:44.906645 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:44.906623 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:20:45.906396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:45.906358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:45.906811 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:45.906507 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1" Apr 22 18:20:46.451654 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:46.451618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:46.451818 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:46.451796 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:46.452104 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:46.451871 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs podName:1f085cfa-07bb-457b-85ce-79f190f3ecb1 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:54.451850488 +0000 UTC m=+18.161196126 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs") pod "network-metrics-daemon-dhwbm" (UID: "1f085cfa-07bb-457b-85ce-79f190f3ecb1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:46.552316 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:46.552259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tndgf\" (UniqueName: \"kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf\") pod \"network-check-target-tx2k7\" (UID: \"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd\") " pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:46.552316 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:46.552339 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:46.552549 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:46.552483 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:46.552549 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:46.552542 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret podName:39554817-37b5-4aee-afc9-ec4c204d3d1c nodeName:}" failed. No retries permitted until 2026-04-22 18:20:50.552524762 +0000 UTC m=+14.261870394 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret") pod "global-pull-secret-syncer-xzh9f" (UID: "39554817-37b5-4aee-afc9-ec4c204d3d1c") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:46.553065 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:46.552952 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:46.553065 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:46.552973 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:46.553065 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:46.552986 2575 projected.go:194] Error preparing data for projected volume kube-api-access-tndgf for pod openshift-network-diagnostics/network-check-target-tx2k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:46.553065 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:46.553036 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf podName:01e67bdc-8f43-4b2e-8cef-8d84eb59aabd nodeName:}" failed. No retries permitted until 2026-04-22 18:20:54.553020499 +0000 UTC m=+18.262366133 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tndgf" (UniqueName: "kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf") pod "network-check-target-tx2k7" (UID: "01e67bdc-8f43-4b2e-8cef-8d84eb59aabd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:46.906415 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:46.906381 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:46.906415 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:46.906382 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:46.907936 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:46.907465 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c" Apr 22 18:20:46.907936 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:46.907880 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:20:47.906226 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:47.905727 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:47.906226 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:47.905873 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1" Apr 22 18:20:48.906420 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:48.906380 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:48.906805 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:48.906380 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:48.906805 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:48.906526 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:20:48.906805 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:48.906554 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c" Apr 22 18:20:49.906098 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:49.906067 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:49.906291 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:49.906196 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1" Apr 22 18:20:50.582320 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:50.582277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:50.582726 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:50.582443 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:50.582726 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:50.582517 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret podName:39554817-37b5-4aee-afc9-ec4c204d3d1c nodeName:}" failed. No retries permitted until 2026-04-22 18:20:58.582495542 +0000 UTC m=+22.291841190 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret") pod "global-pull-secret-syncer-xzh9f" (UID: "39554817-37b5-4aee-afc9-ec4c204d3d1c") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:50.905494 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:50.905420 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:50.905494 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:50.905450 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:50.905698 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:50.905537 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:20:50.905752 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:50.905692 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c" Apr 22 18:20:51.905904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:51.905829 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:51.906344 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:51.905934 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1" Apr 22 18:20:52.906102 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:52.906046 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:52.906551 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:52.906170 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c" Apr 22 18:20:52.906551 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:52.906266 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:52.906551 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:52.906371 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:20:53.905892 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:53.905842 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:53.906096 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:53.905996 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1" Apr 22 18:20:54.512045 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:54.511999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:54.512522 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:54.512180 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:54.512522 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:54.512286 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs podName:1f085cfa-07bb-457b-85ce-79f190f3ecb1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:10.512263119 +0000 UTC m=+34.221608757 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs") pod "network-metrics-daemon-dhwbm" (UID: "1f085cfa-07bb-457b-85ce-79f190f3ecb1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:54.612900 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:54.612854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tndgf\" (UniqueName: \"kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf\") pod \"network-check-target-tx2k7\" (UID: \"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd\") " pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:54.613075 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:54.613051 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:54.613116 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:54.613082 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:54.613116 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:54.613097 2575 projected.go:194] Error preparing data for projected volume kube-api-access-tndgf for pod openshift-network-diagnostics/network-check-target-tx2k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:54.613201 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:54.613167 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf podName:01e67bdc-8f43-4b2e-8cef-8d84eb59aabd nodeName:}" failed. 
No retries permitted until 2026-04-22 18:21:10.613145739 +0000 UTC m=+34.322491385 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-tndgf" (UniqueName: "kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf") pod "network-check-target-tx2k7" (UID: "01e67bdc-8f43-4b2e-8cef-8d84eb59aabd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:54.905875 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:54.905772 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:54.906037 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:54.905772 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:54.906037 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:54.905918 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c" Apr 22 18:20:54.906037 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:54.905995 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:20:55.905665 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:55.905627 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:55.906156 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:55.905739 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1" Apr 22 18:20:56.907067 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:56.906803 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:56.907816 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:56.906863 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:56.907816 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:56.907157 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:20:56.907816 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:56.907228 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c" Apr 22 18:20:57.002847 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.002822 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/ovn-acl-logging/0.log" Apr 22 18:20:57.003176 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.003155 2575 generic.go:358] "Generic (PLEG): container finished" podID="874d96b9-6b68-4589-95c9-72b4b398e980" containerID="a55bfff79f59c6693db7389e1a9a7bf284ae7f9f307433248321c5e0046aabbf" exitCode=1 Apr 22 18:20:57.003260 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.003225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" event={"ID":"874d96b9-6b68-4589-95c9-72b4b398e980","Type":"ContainerDied","Data":"a55bfff79f59c6693db7389e1a9a7bf284ae7f9f307433248321c5e0046aabbf"} Apr 22 18:20:57.003322 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.003280 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" event={"ID":"874d96b9-6b68-4589-95c9-72b4b398e980","Type":"ContainerStarted","Data":"8369d33cc015fc0b1e5eed16c10c54d997e04613e3792c8773e148eaf5de48c0"} Apr 22 18:20:57.004631 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.004601 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4nkxw" 
event={"ID":"073609fd-8186-41f7-860d-4fd136656e3f","Type":"ContainerStarted","Data":"c4abfe7698be445ea056f4e5669e6a8fddc4fb2bbfb24c54817ce4ae51f9c9e2"} Apr 22 18:20:57.005878 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.005858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" event={"ID":"8d08824f-8af6-46ef-b54e-409f85817ae0","Type":"ContainerStarted","Data":"199269add7a71102668168dfa166095078b2be013eda67b679775f5ba6dd2f67"} Apr 22 18:20:57.007033 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.007015 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fklgp" event={"ID":"7d36f1e9-9321-4fbf-935b-023f00adbb68","Type":"ContainerStarted","Data":"326d1e000ab550dbcd64c6a90e6938a5cb1e8e360154a9f7c0a5fc578122bd46"} Apr 22 18:20:57.008404 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.008371 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-slx9s" event={"ID":"2a55ce37-0ff0-47e0-92a1-9f75d384f77e","Type":"ContainerStarted","Data":"aef6982544ba20c764797dc8be70fda7bec8a6d633b2d2425ace8ad84ad5247b"} Apr 22 18:20:57.009564 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.009543 2575 generic.go:358] "Generic (PLEG): container finished" podID="a9f82cba-e68b-4160-a7cd-232b0875487d" containerID="645b8c6bc5f547402c15b7238422f26e26029028419a9187b65550b57e55d6d7" exitCode=0 Apr 22 18:20:57.009644 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.009603 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9sjx" event={"ID":"a9f82cba-e68b-4160-a7cd-232b0875487d","Type":"ContainerDied","Data":"645b8c6bc5f547402c15b7238422f26e26029028419a9187b65550b57e55d6d7"} Apr 22 18:20:57.011090 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.011060 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5r97b" 
event={"ID":"1a492888-e464-4422-ac7a-f260e0cd42aa","Type":"ContainerStarted","Data":"e57e9b1927483dade12cb9e468becb5204420d4b7320422e9b0fce15a24581c8"} Apr 22 18:20:57.012562 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.012537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xxh8b" event={"ID":"fb638d9e-ea2e-4a2e-979e-308022903fd1","Type":"ContainerStarted","Data":"ec5ace827263fca9e203c13707bc80a288b974d5d6bce7c026bd1f0fbc5c8f4c"} Apr 22 18:20:57.023153 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.023106 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4nkxw" podStartSLOduration=3.302251124 podStartE2EDuration="20.023094509s" podCreationTimestamp="2026-04-22 18:20:37 +0000 UTC" firstStartedPulling="2026-04-22 18:20:39.61345706 +0000 UTC m=+3.322802692" lastFinishedPulling="2026-04-22 18:20:56.334300447 +0000 UTC m=+20.043646077" observedRunningTime="2026-04-22 18:20:57.023003244 +0000 UTC m=+20.732348904" watchObservedRunningTime="2026-04-22 18:20:57.023094509 +0000 UTC m=+20.732440161" Apr 22 18:20:57.090439 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.090387 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fklgp" podStartSLOduration=4.362155833 podStartE2EDuration="21.090370656s" podCreationTimestamp="2026-04-22 18:20:36 +0000 UTC" firstStartedPulling="2026-04-22 18:20:39.610956507 +0000 UTC m=+3.320302154" lastFinishedPulling="2026-04-22 18:20:56.339171332 +0000 UTC m=+20.048516977" observedRunningTime="2026-04-22 18:20:57.071751305 +0000 UTC m=+20.781096957" watchObservedRunningTime="2026-04-22 18:20:57.090370656 +0000 UTC m=+20.799716311" Apr 22 18:20:57.090861 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.090835 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xxh8b" podStartSLOduration=3.359154271 
podStartE2EDuration="20.09082852s" podCreationTimestamp="2026-04-22 18:20:37 +0000 UTC" firstStartedPulling="2026-04-22 18:20:39.602796835 +0000 UTC m=+3.312142466" lastFinishedPulling="2026-04-22 18:20:56.33447107 +0000 UTC m=+20.043816715" observedRunningTime="2026-04-22 18:20:57.090596338 +0000 UTC m=+20.799942004" watchObservedRunningTime="2026-04-22 18:20:57.09082852 +0000 UTC m=+20.800174172" Apr 22 18:20:57.135641 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.135603 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5r97b" podStartSLOduration=3.406785083 podStartE2EDuration="20.135592053s" podCreationTimestamp="2026-04-22 18:20:37 +0000 UTC" firstStartedPulling="2026-04-22 18:20:39.605490686 +0000 UTC m=+3.314836333" lastFinishedPulling="2026-04-22 18:20:56.334297658 +0000 UTC m=+20.043643303" observedRunningTime="2026-04-22 18:20:57.108584 +0000 UTC m=+20.817929650" watchObservedRunningTime="2026-04-22 18:20:57.135592053 +0000 UTC m=+20.844937705" Apr 22 18:20:57.771783 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.771745 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fklgp" Apr 22 18:20:57.772478 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.772449 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fklgp" Apr 22 18:20:57.786199 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.786144 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-slx9s" podStartSLOduration=3.847593052 podStartE2EDuration="20.786127822s" podCreationTimestamp="2026-04-22 18:20:37 +0000 UTC" firstStartedPulling="2026-04-22 18:20:39.609532333 +0000 UTC m=+3.318877968" lastFinishedPulling="2026-04-22 18:20:56.54806709 +0000 UTC m=+20.257412738" observedRunningTime="2026-04-22 18:20:57.135475794 +0000 UTC m=+20.844821438" 
watchObservedRunningTime="2026-04-22 18:20:57.786127822 +0000 UTC m=+21.495473475" Apr 22 18:20:57.905552 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:57.905519 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:57.905723 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:57.905654 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1" Apr 22 18:20:58.018414 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.018108 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/ovn-acl-logging/0.log" Apr 22 18:20:58.019087 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.019055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" event={"ID":"874d96b9-6b68-4589-95c9-72b4b398e980","Type":"ContainerStarted","Data":"f01b1c028cddb3f6add5aad16056ff2fb59b6deca846b598ce85f56ab74a18af"} Apr 22 18:20:58.019179 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.019101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" event={"ID":"874d96b9-6b68-4589-95c9-72b4b398e980","Type":"ContainerStarted","Data":"2c2daacc0574da37de230ce99de2b96be6995af6dbeb96ae81de13716fc778c3"} Apr 22 18:20:58.019179 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.019117 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" 
event={"ID":"874d96b9-6b68-4589-95c9-72b4b398e980","Type":"ContainerStarted","Data":"e181712898281afec100e9ec264ba68b7e165212db4e7e3677a78b572a59324c"} Apr 22 18:20:58.019179 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.019130 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" event={"ID":"874d96b9-6b68-4589-95c9-72b4b398e980","Type":"ContainerStarted","Data":"3cec38c5864135be63d0203ce077dd7b6b7e63e9db703a32be2fcdbbe0eea755"} Apr 22 18:20:58.021769 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.021737 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tjcwf" event={"ID":"f219549d-a221-4934-9036-877b25fa0d00","Type":"ContainerStarted","Data":"e6a45f49be4e7721109ecdf5fbf18b31a9026b7cb2a9dfeac63e8b85f46c54bc"} Apr 22 18:20:58.022633 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.022440 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fklgp" Apr 22 18:20:58.022935 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.022915 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fklgp" Apr 22 18:20:58.034820 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.034784 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tjcwf" podStartSLOduration=8.66277156 podStartE2EDuration="21.034771459s" podCreationTimestamp="2026-04-22 18:20:37 +0000 UTC" firstStartedPulling="2026-04-22 18:20:39.610807314 +0000 UTC m=+3.320152960" lastFinishedPulling="2026-04-22 18:20:51.982807219 +0000 UTC m=+15.692152859" observedRunningTime="2026-04-22 18:20:58.034372604 +0000 UTC m=+21.743718259" watchObservedRunningTime="2026-04-22 18:20:58.034771459 +0000 UTC m=+21.744117111" Apr 22 18:20:58.124544 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.124504 2575 plugin_watcher.go:194] "Adding 
socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:20:58.648313 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.648273 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:58.648618 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:58.648447 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:58.648618 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:58.648529 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret podName:39554817-37b5-4aee-afc9-ec4c204d3d1c nodeName:}" failed. No retries permitted until 2026-04-22 18:21:14.648509688 +0000 UTC m=+38.357855335 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret") pod "global-pull-secret-syncer-xzh9f" (UID: "39554817-37b5-4aee-afc9-ec4c204d3d1c") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:58.847114 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.846908 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:20:58.124528045Z","UUID":"916b048e-2203-4069-83ec-992814a54034","Handler":null,"Name":"","Endpoint":""} Apr 22 18:20:58.848726 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.848699 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:20:58.848726 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.848732 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:20:58.909504 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.909418 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:20:58.909504 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:58.909418 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:20:58.909702 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:58.909522 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:20:58.909702 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:58.909575 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c" Apr 22 18:20:59.025413 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:59.025378 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" event={"ID":"8d08824f-8af6-46ef-b54e-409f85817ae0","Type":"ContainerStarted","Data":"26b0db4f5273e22c0c2abffbb65bf6e607d99743fec7a6a6059d3dc2e47339f0"} Apr 22 18:20:59.905785 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:20:59.905748 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:20:59.905957 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:20:59.905870 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1" Apr 22 18:21:00.030822 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:00.030790 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/ovn-acl-logging/0.log" Apr 22 18:21:00.031477 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:00.031169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" event={"ID":"874d96b9-6b68-4589-95c9-72b4b398e980","Type":"ContainerStarted","Data":"c34db206cacc4f0745f247a001affad976442a081ff40f044524ab888690cba8"} Apr 22 18:21:00.033687 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:00.033644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" event={"ID":"8d08824f-8af6-46ef-b54e-409f85817ae0","Type":"ContainerStarted","Data":"a925348ad1d20bff1f04cd2989cdb49a499c0ce8cdb96962ce42ffee029f2d2b"} Apr 22 18:21:00.077987 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:00.077926 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bzgjb" podStartSLOduration=4.328080477 podStartE2EDuration="24.077910332s" podCreationTimestamp="2026-04-22 18:20:36 +0000 UTC" firstStartedPulling="2026-04-22 18:20:39.612262358 +0000 UTC m=+3.321607993" lastFinishedPulling="2026-04-22 18:20:59.362092195 +0000 UTC m=+23.071437848" observedRunningTime="2026-04-22 18:21:00.077574731 +0000 UTC m=+23.786920383" watchObservedRunningTime="2026-04-22 18:21:00.077910332 +0000 UTC m=+23.787255984" Apr 22 18:21:00.905986 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:00.905949 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:21:00.906164 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:00.905964 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:21:00.906164 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:00.906076 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c" Apr 22 18:21:00.906164 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:00.906130 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd" Apr 22 18:21:01.906222 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:01.905941 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:21:01.906838 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:01.906240 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1"
Apr 22 18:21:02.039430 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.039401 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/ovn-acl-logging/0.log"
Apr 22 18:21:02.039736 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.039713 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" event={"ID":"874d96b9-6b68-4589-95c9-72b4b398e980","Type":"ContainerStarted","Data":"968b06d9043d7120f03548fe34e84e433732e29677a8a208955222f9f8b458e1"}
Apr 22 18:21:02.040022 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.039990 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:21:02.040198 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.040172 2575 scope.go:117] "RemoveContainer" containerID="a55bfff79f59c6693db7389e1a9a7bf284ae7f9f307433248321c5e0046aabbf"
Apr 22 18:21:02.043509 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.040947 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:21:02.043509 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.041096 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:21:02.045074 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.045048 2575 generic.go:358] "Generic (PLEG): container finished" podID="a9f82cba-e68b-4160-a7cd-232b0875487d" containerID="4ea63ee9c4b3a0d1fe31f0c56a671badb8cf7bdd4aae85d2a2c2b82a331ca358" exitCode=0
Apr 22 18:21:02.045160 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.045084 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9sjx" event={"ID":"a9f82cba-e68b-4160-a7cd-232b0875487d","Type":"ContainerDied","Data":"4ea63ee9c4b3a0d1fe31f0c56a671badb8cf7bdd4aae85d2a2c2b82a331ca358"}
Apr 22 18:21:02.058419 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.058397 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:21:02.058911 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.058893 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5"
Apr 22 18:21:02.906055 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.906022 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7"
Apr 22 18:21:02.906207 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:02.906037 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f"
Apr 22 18:21:02.906207 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:02.906127 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd"
Apr 22 18:21:02.906582 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:02.906261 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c"
Apr 22 18:21:03.051888 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:03.051859 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/ovn-acl-logging/0.log"
Apr 22 18:21:03.052555 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:03.052346 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" event={"ID":"874d96b9-6b68-4589-95c9-72b4b398e980","Type":"ContainerStarted","Data":"685dd622ad5f12f286648ada5d929fcaa3fb2e4b1cab7b7c10997f19e2a3a0ee"}
Apr 22 18:21:03.055007 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:03.054663 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9sjx" event={"ID":"a9f82cba-e68b-4160-a7cd-232b0875487d","Type":"ContainerStarted","Data":"5774065aa15b59b0b381b5efb338d98fba89ed6700b88afd24ae9a1aa0f9173c"}
Apr 22 18:21:03.089403 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:03.089338 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" podStartSLOduration=9.020165205 podStartE2EDuration="26.089319055s" podCreationTimestamp="2026-04-22 18:20:37 +0000 UTC" firstStartedPulling="2026-04-22 18:20:39.614838251 +0000 UTC m=+3.324183882" lastFinishedPulling="2026-04-22 18:20:56.683992102 +0000 UTC m=+20.393337732" observedRunningTime="2026-04-22 18:21:03.086853039 +0000 UTC m=+26.796198691" watchObservedRunningTime="2026-04-22 18:21:03.089319055 +0000 UTC m=+26.798664710"
Apr 22 18:21:03.516624 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:03.516591 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xzh9f"]
Apr 22 18:21:03.516778 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:03.516723 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f"
Apr 22 18:21:03.516870 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:03.516850 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c"
Apr 22 18:21:03.520773 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:03.520745 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dhwbm"]
Apr 22 18:21:03.520937 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:03.520868 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm"
Apr 22 18:21:03.520997 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:03.520948 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1"
Apr 22 18:21:03.524192 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:03.524165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tx2k7"]
Apr 22 18:21:03.524336 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:03.524297 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7"
Apr 22 18:21:03.524409 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:03.524390 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd"
Apr 22 18:21:04.058679 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:04.058647 2575 generic.go:358] "Generic (PLEG): container finished" podID="a9f82cba-e68b-4160-a7cd-232b0875487d" containerID="5774065aa15b59b0b381b5efb338d98fba89ed6700b88afd24ae9a1aa0f9173c" exitCode=0
Apr 22 18:21:04.059096 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:04.058732 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9sjx" event={"ID":"a9f82cba-e68b-4160-a7cd-232b0875487d","Type":"ContainerDied","Data":"5774065aa15b59b0b381b5efb338d98fba89ed6700b88afd24ae9a1aa0f9173c"}
Apr 22 18:21:04.906604 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:04.906395 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7"
Apr 22 18:21:04.906701 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:04.906397 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f"
Apr 22 18:21:04.906701 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:04.906644 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd"
Apr 22 18:21:04.906787 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:04.906712 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c"
Apr 22 18:21:04.906787 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:04.906395 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm"
Apr 22 18:21:04.906787 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:04.906780 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1"
Apr 22 18:21:05.063084 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:05.063051 2575 generic.go:358] "Generic (PLEG): container finished" podID="a9f82cba-e68b-4160-a7cd-232b0875487d" containerID="4e0e3439a6ebb7619c6ca5649aceb2d29da98f6c0bdbb3d60af3fa6c5191a1b9" exitCode=0
Apr 22 18:21:05.063465 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:05.063127 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9sjx" event={"ID":"a9f82cba-e68b-4160-a7cd-232b0875487d","Type":"ContainerDied","Data":"4e0e3439a6ebb7619c6ca5649aceb2d29da98f6c0bdbb3d60af3fa6c5191a1b9"}
Apr 22 18:21:06.906949 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:06.906913 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7"
Apr 22 18:21:06.907697 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:06.907002 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm"
Apr 22 18:21:06.907697 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:06.907035 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f"
Apr 22 18:21:06.907697 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:06.907069 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd"
Apr 22 18:21:06.907697 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:06.907107 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c"
Apr 22 18:21:06.907697 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:06.907200 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1"
Apr 22 18:21:08.905586 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:08.905549 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f"
Apr 22 18:21:08.906136 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:08.905549 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7"
Apr 22 18:21:08.906136 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:08.905682 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xzh9f" podUID="39554817-37b5-4aee-afc9-ec4c204d3d1c"
Apr 22 18:21:08.906136 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:08.905749 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tx2k7" podUID="01e67bdc-8f43-4b2e-8cef-8d84eb59aabd"
Apr 22 18:21:08.906136 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:08.905549 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm"
Apr 22 18:21:08.906136 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:08.905852 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1"
Apr 22 18:21:09.568867 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.568838 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-74.ec2.internal" event="NodeReady"
Apr 22 18:21:09.569095 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.568990 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 18:21:09.608457 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.608375 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-646b958546-7n687"]
Apr 22 18:21:09.633598 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.633555 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-646b958546-7n687"]
Apr 22 18:21:09.633775 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.633744 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.636566 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.636539 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 18:21:09.636728 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.636705 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r4mpc\""
Apr 22 18:21:09.637088 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.636892 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 18:21:09.637237 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.637219 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 18:21:09.643723 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.643702 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6wjzz"]
Apr 22 18:21:09.651311 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.651287 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 18:21:09.660778 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.660750 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4kzvf"]
Apr 22 18:21:09.660956 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.660939 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.663669 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.663647 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 18:21:09.663837 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.663818 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9d4f9\""
Apr 22 18:21:09.663922 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.663905 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 18:21:09.688551 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.688516 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4kzvf"]
Apr 22 18:21:09.688551 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.688550 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6wjzz"]
Apr 22 18:21:09.688792 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.688637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4kzvf"
Apr 22 18:21:09.691469 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.691443 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 18:21:09.691469 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.691449 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pkrg\""
Apr 22 18:21:09.691647 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.691504 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 18:21:09.691647 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.691449 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 18:21:09.725850 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.725812 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1add8823-8c00-46ae-a8af-828b95cc217f-ca-trust-extracted\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.725850 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.725855 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.726106 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.725915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/047091e6-d56c-4d8b-8391-f6285a93c154-tmp-dir\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.726106 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.725947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.726106 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.725975 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-trusted-ca\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.726106 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.726003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-registry-certificates\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.726106 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.726027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-bound-sa-token\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.726106 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.726050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/047091e6-d56c-4d8b-8391-f6285a93c154-config-volume\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.726402 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.726115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-installation-pull-secrets\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.726402 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.726226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-image-registry-private-configuration\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.726402 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.726270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpg9l\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-kube-api-access-rpg9l\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.726402 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.726296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc6ww\" (UniqueName: \"kubernetes.io/projected/047091e6-d56c-4d8b-8391-f6285a93c154-kube-api-access-wc6ww\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.827425 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-image-registry-private-configuration\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.827425 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpg9l\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-kube-api-access-rpg9l\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.827425 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc6ww\" (UniqueName: \"kubernetes.io/projected/047091e6-d56c-4d8b-8391-f6285a93c154-kube-api-access-wc6ww\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.827724 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1add8823-8c00-46ae-a8af-828b95cc217f-ca-trust-extracted\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.827724 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.827828 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827746 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/047091e6-d56c-4d8b-8391-f6285a93c154-tmp-dir\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.827828 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.827932 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-trusted-ca\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.827932 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzh86\" (UniqueName: \"kubernetes.io/projected/fa8d3113-51fb-4375-ab5a-40c379dabdaa-kube-api-access-zzh86\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf"
Apr 22 18:21:09.827932 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-registry-certificates\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.828071 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-bound-sa-token\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.828071 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/047091e6-d56c-4d8b-8391-f6285a93c154-config-volume\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.828071 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.827991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-installation-pull-secrets\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.828071 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.828019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf"
Apr 22 18:21:09.828270 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.828179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1add8823-8c00-46ae-a8af-828b95cc217f-ca-trust-extracted\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.828336 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:09.828305 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:21:09.828393 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:09.828368 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls podName:047091e6-d56c-4d8b-8391-f6285a93c154 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:10.328349116 +0000 UTC m=+34.037694762 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls") pod "dns-default-6wjzz" (UID: "047091e6-d56c-4d8b-8391-f6285a93c154") : secret "dns-default-metrics-tls" not found
Apr 22 18:21:09.828455 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.828401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/047091e6-d56c-4d8b-8391-f6285a93c154-tmp-dir\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.828687 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:09.828664 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:21:09.828687 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:09.828685 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-646b958546-7n687: secret "image-registry-tls" not found
Apr 22 18:21:09.828851 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:09.828728 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls podName:1add8823-8c00-46ae-a8af-828b95cc217f nodeName:}" failed. No retries permitted until 2026-04-22 18:21:10.328715477 +0000 UTC m=+34.038061124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls") pod "image-registry-646b958546-7n687" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f") : secret "image-registry-tls" not found
Apr 22 18:21:09.829165 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.829115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/047091e6-d56c-4d8b-8391-f6285a93c154-config-volume\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.829430 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.829408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-trusted-ca\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.832474 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.832442 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-image-registry-private-configuration\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.832474 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.832455 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-installation-pull-secrets\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.836759 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.836708 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc6ww\" (UniqueName: \"kubernetes.io/projected/047091e6-d56c-4d8b-8391-f6285a93c154-kube-api-access-wc6ww\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:21:09.837132 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.837104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-registry-certificates\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.837436 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.837383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpg9l\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-kube-api-access-rpg9l\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.837964 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.837923 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-bound-sa-token\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687"
Apr 22 18:21:09.929215 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.929175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf"
Apr 22 18:21:09.929910 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:09.929352 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:21:09.929910 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:09.929416 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert podName:fa8d3113-51fb-4375-ab5a-40c379dabdaa nodeName:}" failed. No retries permitted until 2026-04-22 18:21:10.429397755 +0000 UTC m=+34.138743410 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert") pod "ingress-canary-4kzvf" (UID: "fa8d3113-51fb-4375-ab5a-40c379dabdaa") : secret "canary-serving-cert" not found
Apr 22 18:21:09.929910 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.929356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzh86\" (UniqueName: \"kubernetes.io/projected/fa8d3113-51fb-4375-ab5a-40c379dabdaa-kube-api-access-zzh86\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf"
Apr 22 18:21:09.941535 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:09.941504 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzh86\" (UniqueName: \"kubernetes.io/projected/fa8d3113-51fb-4375-ab5a-40c379dabdaa-kube-api-access-zzh86\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf"
Apr 22 18:21:10.333113 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.332850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod
\"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:21:10.333387 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.333024 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:21:10.333387 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.333230 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-646b958546-7n687: secret "image-registry-tls" not found Apr 22 18:21:10.333387 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.333308 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:10.333387 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.333313 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls podName:1add8823-8c00-46ae-a8af-828b95cc217f nodeName:}" failed. No retries permitted until 2026-04-22 18:21:11.333293573 +0000 UTC m=+35.042639220 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls") pod "image-registry-646b958546-7n687" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f") : secret "image-registry-tls" not found Apr 22 18:21:10.333387 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.333205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz" Apr 22 18:21:10.333387 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.333367 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls podName:047091e6-d56c-4d8b-8391-f6285a93c154 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:11.333352271 +0000 UTC m=+35.042697902 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls") pod "dns-default-6wjzz" (UID: "047091e6-d56c-4d8b-8391-f6285a93c154") : secret "dns-default-metrics-tls" not found Apr 22 18:21:10.434167 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.434130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf" Apr 22 18:21:10.434430 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.434342 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:10.434430 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.434422 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert podName:fa8d3113-51fb-4375-ab5a-40c379dabdaa nodeName:}" failed. No retries permitted until 2026-04-22 18:21:11.434400601 +0000 UTC m=+35.143746234 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert") pod "ingress-canary-4kzvf" (UID: "fa8d3113-51fb-4375-ab5a-40c379dabdaa") : secret "canary-serving-cert" not found Apr 22 18:21:10.535137 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.535098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:21:10.535328 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.535274 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:21:10.535384 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.535337 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs podName:1f085cfa-07bb-457b-85ce-79f190f3ecb1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:42.535322917 +0000 UTC m=+66.244668550 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs") pod "network-metrics-daemon-dhwbm" (UID: "1f085cfa-07bb-457b-85ce-79f190f3ecb1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:21:10.636371 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.636268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tndgf\" (UniqueName: \"kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf\") pod \"network-check-target-tx2k7\" (UID: \"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd\") " pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:21:10.636573 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.636442 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:21:10.636573 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.636458 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:21:10.636573 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.636468 2575 projected.go:194] Error preparing data for projected volume kube-api-access-tndgf for pod openshift-network-diagnostics/network-check-target-tx2k7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:21:10.636573 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:10.636531 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf podName:01e67bdc-8f43-4b2e-8cef-8d84eb59aabd nodeName:}" failed. 
No retries permitted until 2026-04-22 18:21:42.636514819 +0000 UTC m=+66.345860465 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-tndgf" (UniqueName: "kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf") pod "network-check-target-tx2k7" (UID: "01e67bdc-8f43-4b2e-8cef-8d84eb59aabd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:21:10.906296 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.906165 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:21:10.906296 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.906165 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:21:10.906609 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.906165 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:21:10.909224 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.909194 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fll8t\"" Apr 22 18:21:10.909374 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.909281 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:21:10.910210 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.910192 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:21:10.910321 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.910265 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:21:10.910321 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.910293 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:21:10.910434 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:10.910354 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cm497\"" Apr 22 18:21:11.342090 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:11.342055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz" Apr 22 18:21:11.342471 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:11.342117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:21:11.342471 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:11.342229 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:21:11.342471 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:11.342265 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-646b958546-7n687: secret "image-registry-tls" not found Apr 22 18:21:11.342471 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:11.342299 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:11.342471 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:11.342320 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls podName:1add8823-8c00-46ae-a8af-828b95cc217f nodeName:}" failed. No retries permitted until 2026-04-22 18:21:13.342304624 +0000 UTC m=+37.051650255 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls") pod "image-registry-646b958546-7n687" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f") : secret "image-registry-tls" not found Apr 22 18:21:11.342471 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:11.342346 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls podName:047091e6-d56c-4d8b-8391-f6285a93c154 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:13.342332398 +0000 UTC m=+37.051678034 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls") pod "dns-default-6wjzz" (UID: "047091e6-d56c-4d8b-8391-f6285a93c154") : secret "dns-default-metrics-tls" not found Apr 22 18:21:11.443032 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:11.442993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf" Apr 22 18:21:11.443190 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:11.443137 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:11.443234 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:11.443201 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert podName:fa8d3113-51fb-4375-ab5a-40c379dabdaa nodeName:}" failed. No retries permitted until 2026-04-22 18:21:13.4431863 +0000 UTC m=+37.152531934 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert") pod "ingress-canary-4kzvf" (UID: "fa8d3113-51fb-4375-ab5a-40c379dabdaa") : secret "canary-serving-cert" not found Apr 22 18:21:12.079325 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:12.079292 2575 generic.go:358] "Generic (PLEG): container finished" podID="a9f82cba-e68b-4160-a7cd-232b0875487d" containerID="cc0e560f49067525cdd30673d475d2a260237708ba1a02577e16491119cbab0b" exitCode=0 Apr 22 18:21:12.079498 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:12.079348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9sjx" event={"ID":"a9f82cba-e68b-4160-a7cd-232b0875487d","Type":"ContainerDied","Data":"cc0e560f49067525cdd30673d475d2a260237708ba1a02577e16491119cbab0b"} Apr 22 18:21:13.085780 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:13.085739 2575 generic.go:358] "Generic (PLEG): container finished" podID="a9f82cba-e68b-4160-a7cd-232b0875487d" containerID="2b2a9d0308f8f5648bcdfec31bbb715f77e6c459410d1d5bcde62bdbb7d57842" exitCode=0 Apr 22 18:21:13.086152 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:13.085813 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9sjx" event={"ID":"a9f82cba-e68b-4160-a7cd-232b0875487d","Type":"ContainerDied","Data":"2b2a9d0308f8f5648bcdfec31bbb715f77e6c459410d1d5bcde62bdbb7d57842"} Apr 22 18:21:13.360864 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:13.360781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz" Apr 22 18:21:13.360864 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:13.360832 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:21:13.361049 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:13.360919 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:13.361049 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:13.360933 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:21:13.361049 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:13.360944 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-646b958546-7n687: secret "image-registry-tls" not found Apr 22 18:21:13.361049 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:13.360981 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls podName:047091e6-d56c-4d8b-8391-f6285a93c154 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:17.360964707 +0000 UTC m=+41.070310338 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls") pod "dns-default-6wjzz" (UID: "047091e6-d56c-4d8b-8391-f6285a93c154") : secret "dns-default-metrics-tls" not found Apr 22 18:21:13.361049 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:13.360996 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls podName:1add8823-8c00-46ae-a8af-828b95cc217f nodeName:}" failed. 
No retries permitted until 2026-04-22 18:21:17.360988649 +0000 UTC m=+41.070334279 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls") pod "image-registry-646b958546-7n687" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f") : secret "image-registry-tls" not found Apr 22 18:21:13.462031 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:13.461991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf" Apr 22 18:21:13.462167 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:13.462133 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:13.462204 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:13.462198 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert podName:fa8d3113-51fb-4375-ab5a-40c379dabdaa nodeName:}" failed. No retries permitted until 2026-04-22 18:21:17.462180809 +0000 UTC m=+41.171526439 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert") pod "ingress-canary-4kzvf" (UID: "fa8d3113-51fb-4375-ab5a-40c379dabdaa") : secret "canary-serving-cert" not found Apr 22 18:21:14.090582 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:14.090543 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s9sjx" event={"ID":"a9f82cba-e68b-4160-a7cd-232b0875487d","Type":"ContainerStarted","Data":"572fb175f5f54e52b1f74a969ecee180935bdb392c134268c40d619913233e29"} Apr 22 18:21:14.125917 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:14.125859 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-s9sjx" podStartSLOduration=5.619562791 podStartE2EDuration="37.125842206s" podCreationTimestamp="2026-04-22 18:20:37 +0000 UTC" firstStartedPulling="2026-04-22 18:20:39.608442573 +0000 UTC m=+3.317788215" lastFinishedPulling="2026-04-22 18:21:11.114722 +0000 UTC m=+34.824067630" observedRunningTime="2026-04-22 18:21:14.125444559 +0000 UTC m=+37.834790212" watchObservedRunningTime="2026-04-22 18:21:14.125842206 +0000 UTC m=+37.835187861" Apr 22 18:21:14.670809 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:14.670772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret\") pod \"global-pull-secret-syncer-xzh9f\" (UID: \"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:21:14.674696 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:14.674672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39554817-37b5-4aee-afc9-ec4c204d3d1c-original-pull-secret\") pod \"global-pull-secret-syncer-xzh9f\" (UID: 
\"39554817-37b5-4aee-afc9-ec4c204d3d1c\") " pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:21:14.827020 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:14.826981 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xzh9f" Apr 22 18:21:14.988198 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:14.988163 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xzh9f"] Apr 22 18:21:14.991386 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:21:14.991361 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39554817_37b5_4aee_afc9_ec4c204d3d1c.slice/crio-e65fc9acb0226932899172c0732fc2e4427200a780470a8bf440196e02b7dd82 WatchSource:0}: Error finding container e65fc9acb0226932899172c0732fc2e4427200a780470a8bf440196e02b7dd82: Status 404 returned error can't find the container with id e65fc9acb0226932899172c0732fc2e4427200a780470a8bf440196e02b7dd82 Apr 22 18:21:15.093859 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:15.093822 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xzh9f" event={"ID":"39554817-37b5-4aee-afc9-ec4c204d3d1c","Type":"ContainerStarted","Data":"e65fc9acb0226932899172c0732fc2e4427200a780470a8bf440196e02b7dd82"} Apr 22 18:21:17.394779 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:17.394744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz" Apr 22 18:21:17.395232 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:17.394804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:21:17.395232 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:17.394917 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:17.395232 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:17.394942 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:21:17.395232 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:17.394959 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-646b958546-7n687: secret "image-registry-tls" not found Apr 22 18:21:17.395232 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:17.395002 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls podName:047091e6-d56c-4d8b-8391-f6285a93c154 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:25.394979656 +0000 UTC m=+49.104325301 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls") pod "dns-default-6wjzz" (UID: "047091e6-d56c-4d8b-8391-f6285a93c154") : secret "dns-default-metrics-tls" not found Apr 22 18:21:17.395232 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:17.395086 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls podName:1add8823-8c00-46ae-a8af-828b95cc217f nodeName:}" failed. No retries permitted until 2026-04-22 18:21:25.395050302 +0000 UTC m=+49.104395949 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls") pod "image-registry-646b958546-7n687" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f") : secret "image-registry-tls" not found Apr 22 18:21:17.496278 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:17.496229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf" Apr 22 18:21:17.496447 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:17.496415 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:17.496510 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:17.496501 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert podName:fa8d3113-51fb-4375-ab5a-40c379dabdaa nodeName:}" failed. No retries permitted until 2026-04-22 18:21:25.49647885 +0000 UTC m=+49.205824507 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert") pod "ingress-canary-4kzvf" (UID: "fa8d3113-51fb-4375-ab5a-40c379dabdaa") : secret "canary-serving-cert" not found Apr 22 18:21:19.103867 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:19.103832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xzh9f" event={"ID":"39554817-37b5-4aee-afc9-ec4c204d3d1c","Type":"ContainerStarted","Data":"eda012c62af58a63e6d1a3b4ff7f8adc1403b565a8c2e4163662cf249d239498"} Apr 22 18:21:19.121101 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:19.121053 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xzh9f" podStartSLOduration=33.500415999 podStartE2EDuration="37.121039792s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:21:14.993227264 +0000 UTC m=+38.702572899" lastFinishedPulling="2026-04-22 18:21:18.613851061 +0000 UTC m=+42.323196692" observedRunningTime="2026-04-22 18:21:19.120370568 +0000 UTC m=+42.829716221" watchObservedRunningTime="2026-04-22 18:21:19.121039792 +0000 UTC m=+42.830385445" Apr 22 18:21:25.454444 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:25.454403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz" Apr 22 18:21:25.454444 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:25.454452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " 
pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:21:25.454925 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:25.454566 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:25.454925 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:25.454651 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls podName:047091e6-d56c-4d8b-8391-f6285a93c154 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:41.454632573 +0000 UTC m=+65.163978204 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls") pod "dns-default-6wjzz" (UID: "047091e6-d56c-4d8b-8391-f6285a93c154") : secret "dns-default-metrics-tls" not found Apr 22 18:21:25.454925 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:25.454575 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:21:25.454925 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:25.454674 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-646b958546-7n687: secret "image-registry-tls" not found Apr 22 18:21:25.454925 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:25.454755 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls podName:1add8823-8c00-46ae-a8af-828b95cc217f nodeName:}" failed. No retries permitted until 2026-04-22 18:21:41.454741724 +0000 UTC m=+65.164087355 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls") pod "image-registry-646b958546-7n687" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f") : secret "image-registry-tls" not found Apr 22 18:21:25.555018 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:25.554982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf" Apr 22 18:21:25.555175 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:25.555134 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:25.555219 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:25.555185 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert podName:fa8d3113-51fb-4375-ab5a-40c379dabdaa nodeName:}" failed. No retries permitted until 2026-04-22 18:21:41.555170436 +0000 UTC m=+65.264516079 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert") pod "ingress-canary-4kzvf" (UID: "fa8d3113-51fb-4375-ab5a-40c379dabdaa") : secret "canary-serving-cert" not found Apr 22 18:21:34.070743 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:34.070709 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s8sh5" Apr 22 18:21:41.456507 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:41.456465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:21:41.456886 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:41.456547 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz" Apr 22 18:21:41.456886 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:41.456619 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:21:41.456886 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:41.456635 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:41.456886 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:41.456639 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-646b958546-7n687: secret "image-registry-tls" not found Apr 22 18:21:41.456886 ip-10-0-140-74 kubenswrapper[2575]: E0422 
18:21:41.456685 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls podName:047091e6-d56c-4d8b-8391-f6285a93c154 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:13.456672227 +0000 UTC m=+97.166017858 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls") pod "dns-default-6wjzz" (UID: "047091e6-d56c-4d8b-8391-f6285a93c154") : secret "dns-default-metrics-tls" not found Apr 22 18:21:41.456886 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:41.456700 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls podName:1add8823-8c00-46ae-a8af-828b95cc217f nodeName:}" failed. No retries permitted until 2026-04-22 18:22:13.456692967 +0000 UTC m=+97.166038598 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls") pod "image-registry-646b958546-7n687" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f") : secret "image-registry-tls" not found Apr 22 18:21:41.557721 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:41.557697 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf" Apr 22 18:21:41.557865 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:41.557799 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:41.557904 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:41.557887 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert podName:fa8d3113-51fb-4375-ab5a-40c379dabdaa nodeName:}" failed. No retries permitted until 2026-04-22 18:22:13.557861943 +0000 UTC m=+97.267207574 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert") pod "ingress-canary-4kzvf" (UID: "fa8d3113-51fb-4375-ab5a-40c379dabdaa") : secret "canary-serving-cert" not found Apr 22 18:21:42.566311 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:42.566269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:21:42.569233 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:42.569214 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:21:42.576754 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:42.576736 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:21:42.576846 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:21:42.576808 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs podName:1f085cfa-07bb-457b-85ce-79f190f3ecb1 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:46.576787466 +0000 UTC m=+130.286133097 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs") pod "network-metrics-daemon-dhwbm" (UID: "1f085cfa-07bb-457b-85ce-79f190f3ecb1") : secret "metrics-daemon-secret" not found Apr 22 18:21:42.667190 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:42.667154 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tndgf\" (UniqueName: \"kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf\") pod \"network-check-target-tx2k7\" (UID: \"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd\") " pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:21:42.670017 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:42.669999 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:21:42.680940 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:42.680919 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:21:42.691779 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:42.691758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tndgf\" (UniqueName: \"kubernetes.io/projected/01e67bdc-8f43-4b2e-8cef-8d84eb59aabd-kube-api-access-tndgf\") pod \"network-check-target-tx2k7\" (UID: \"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd\") " pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:21:42.724746 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:42.724718 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cm497\"" Apr 22 18:21:42.732087 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:42.732061 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:21:42.842386 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:42.842319 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tx2k7"] Apr 22 18:21:42.845646 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:21:42.845619 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01e67bdc_8f43_4b2e_8cef_8d84eb59aabd.slice/crio-640983ace67bf522bfe16db82cf1d347e28c8b444e3801836eca1497049b508b WatchSource:0}: Error finding container 640983ace67bf522bfe16db82cf1d347e28c8b444e3801836eca1497049b508b: Status 404 returned error can't find the container with id 640983ace67bf522bfe16db82cf1d347e28c8b444e3801836eca1497049b508b Apr 22 18:21:43.150671 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:43.150591 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tx2k7" event={"ID":"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd","Type":"ContainerStarted","Data":"640983ace67bf522bfe16db82cf1d347e28c8b444e3801836eca1497049b508b"} Apr 22 18:21:46.158597 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:46.158557 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tx2k7" event={"ID":"01e67bdc-8f43-4b2e-8cef-8d84eb59aabd","Type":"ContainerStarted","Data":"686aff12ccd629573b8ba6b10f7585ab0d0c1b5a435a41f0f7be1048c3d330de"} Apr 22 18:21:46.159004 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:21:46.158682 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:22:13.473717 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:13.473620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz" Apr 22 18:22:13.473717 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:13.473684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:22:13.474163 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:13.473778 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:22:13.474163 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:13.473852 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:22:13.474163 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:13.473872 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-646b958546-7n687: secret "image-registry-tls" not found Apr 22 18:22:13.474163 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:13.473885 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls podName:047091e6-d56c-4d8b-8391-f6285a93c154 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:17.47386472 +0000 UTC m=+161.183210358 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls") pod "dns-default-6wjzz" (UID: "047091e6-d56c-4d8b-8391-f6285a93c154") : secret "dns-default-metrics-tls" not found Apr 22 18:22:13.474163 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:13.473955 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls podName:1add8823-8c00-46ae-a8af-828b95cc217f nodeName:}" failed. No retries permitted until 2026-04-22 18:23:17.473937116 +0000 UTC m=+161.183282747 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls") pod "image-registry-646b958546-7n687" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f") : secret "image-registry-tls" not found Apr 22 18:22:13.574097 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:13.574068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf" Apr 22 18:22:13.574230 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:13.574167 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:22:13.574230 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:13.574218 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert podName:fa8d3113-51fb-4375-ab5a-40c379dabdaa nodeName:}" failed. No retries permitted until 2026-04-22 18:23:17.574206834 +0000 UTC m=+161.283552465 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert") pod "ingress-canary-4kzvf" (UID: "fa8d3113-51fb-4375-ab5a-40c379dabdaa") : secret "canary-serving-cert" not found Apr 22 18:22:17.163366 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:17.163334 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-tx2k7" Apr 22 18:22:17.197407 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:17.197200 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-tx2k7" podStartSLOduration=97.616907588 podStartE2EDuration="1m40.197184795s" podCreationTimestamp="2026-04-22 18:20:37 +0000 UTC" firstStartedPulling="2026-04-22 18:21:42.847418391 +0000 UTC m=+66.556764040" lastFinishedPulling="2026-04-22 18:21:45.427695611 +0000 UTC m=+69.137041247" observedRunningTime="2026-04-22 18:21:46.175483425 +0000 UTC m=+69.884829079" watchObservedRunningTime="2026-04-22 18:22:17.197184795 +0000 UTC m=+100.906530443" Apr 22 18:22:46.596363 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:46.596310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:22:46.596866 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:46.596465 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:22:46.596866 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:46.596533 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs podName:1f085cfa-07bb-457b-85ce-79f190f3ecb1 
nodeName:}" failed. No retries permitted until 2026-04-22 18:24:48.596515675 +0000 UTC m=+252.305861306 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs") pod "network-metrics-daemon-dhwbm" (UID: "1f085cfa-07bb-457b-85ce-79f190f3ecb1") : secret "metrics-daemon-secret" not found Apr 22 18:22:49.970801 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.970768 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"] Apr 22 18:22:49.972670 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.972653 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fx8gr"] Apr 22 18:22:49.972813 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.972794 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7" Apr 22 18:22:49.974254 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.974223 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"] Apr 22 18:22:49.974368 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.974343 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr" Apr 22 18:22:49.975195 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.975178 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 18:22:49.975684 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.975668 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 18:22:49.975768 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.975732 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 18:22:49.975824 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.975789 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:22:49.975879 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.975847 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-9tklp\"" Apr 22 18:22:49.976022 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.976004 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4" Apr 22 18:22:49.976645 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.976628 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 18:22:49.976807 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.976790 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 18:22:49.976861 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.976834 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:22:49.976925 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.976911 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 18:22:49.978129 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.978111 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-gtlsf\"" Apr 22 18:22:49.979295 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.979276 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 18:22:49.980781 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.980759 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:22:49.980869 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.980785 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 18:22:49.980869 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.980759 2575 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-kxs9s\"" Apr 22 18:22:49.984508 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.984492 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 18:22:49.991932 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.991908 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"] Apr 22 18:22:49.996153 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.996119 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"] Apr 22 18:22:49.996353 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:49.996322 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fx8gr"] Apr 22 18:22:50.066349 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.066314 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"] Apr 22 18:22:50.068219 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.068203 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg" Apr 22 18:22:50.071821 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.071785 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:22:50.072453 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.072433 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:22:50.073011 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.072993 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 18:22:50.073073 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.073013 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-5gx9k\"" Apr 22 18:22:50.073073 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.073053 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 18:22:50.077046 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.077024 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-67569b586-r5chq"] Apr 22 18:22:50.078807 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.078794 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-67569b586-r5chq" Apr 22 18:22:50.081838 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.081632 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 18:22:50.081988 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.081967 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 18:22:50.082177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.082147 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 18:22:50.082296 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.082202 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 18:22:50.082370 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.082299 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 18:22:50.082427 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.082403 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-7vjlt\"" Apr 22 18:22:50.082486 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.082431 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 18:22:50.082572 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.082553 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"] Apr 22 18:22:50.098419 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.098393 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-67569b586-r5chq"] Apr 22 18:22:50.122057 ip-10-0-140-74 
kubenswrapper[2575]: I0422 18:22:50.122028 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sx84\" (UniqueName: \"kubernetes.io/projected/ad8365e0-e003-4937-9dbd-1989580ac1f4-kube-api-access-5sx84\") pod \"kube-storage-version-migrator-operator-6769c5d45-9ndl7\" (UID: \"ad8365e0-e003-4937-9dbd-1989580ac1f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7" Apr 22 18:22:50.122057 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.122059 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzrvn\" (UniqueName: \"kubernetes.io/projected/2ee4fcb8-c34f-4bf2-9d97-11586231a008-kube-api-access-bzrvn\") pod \"cluster-samples-operator-6dc5bdb6b4-kt4g4\" (UID: \"2ee4fcb8-c34f-4bf2-9d97-11586231a008\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4" Apr 22 18:22:50.122267 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.122087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c400b749-c41a-4dc5-908a-d49ec568c6d6-config\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr" Apr 22 18:22:50.122267 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.122144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c400b749-c41a-4dc5-908a-d49ec568c6d6-trusted-ca\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr" Apr 22 18:22:50.122267 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.122191 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad8365e0-e003-4937-9dbd-1989580ac1f4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9ndl7\" (UID: \"ad8365e0-e003-4937-9dbd-1989580ac1f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"
Apr 22 18:22:50.122267 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.122210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad8365e0-e003-4937-9dbd-1989580ac1f4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9ndl7\" (UID: \"ad8365e0-e003-4937-9dbd-1989580ac1f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"
Apr 22 18:22:50.122267 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.122225 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptljk\" (UniqueName: \"kubernetes.io/projected/c400b749-c41a-4dc5-908a-d49ec568c6d6-kube-api-access-ptljk\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:22:50.122425 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.122273 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c400b749-c41a-4dc5-908a-d49ec568c6d6-serving-cert\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:22:50.122425 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.122292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:50.122425 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.122309 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8scd\" (UniqueName: \"kubernetes.io/projected/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-kube-api-access-j8scd\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:50.122425 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.122326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kt4g4\" (UID: \"2ee4fcb8-c34f-4bf2-9d97-11586231a008\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"
Apr 22 18:22:50.122425 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.122343 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:50.223087 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.222995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad8365e0-e003-4937-9dbd-1989580ac1f4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9ndl7\" (UID: \"ad8365e0-e003-4937-9dbd-1989580ac1f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"
Apr 22 18:22:50.223087 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223033 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad8365e0-e003-4937-9dbd-1989580ac1f4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9ndl7\" (UID: \"ad8365e0-e003-4937-9dbd-1989580ac1f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"
Apr 22 18:22:50.223087 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sx84\" (UniqueName: \"kubernetes.io/projected/ad8365e0-e003-4937-9dbd-1989580ac1f4-kube-api-access-5sx84\") pod \"kube-storage-version-migrator-operator-6769c5d45-9ndl7\" (UID: \"ad8365e0-e003-4937-9dbd-1989580ac1f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"
Apr 22 18:22:50.223399 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptljk\" (UniqueName: \"kubernetes.io/projected/c400b749-c41a-4dc5-908a-d49ec568c6d6-kube-api-access-ptljk\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:22:50.223399 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c400b749-c41a-4dc5-908a-d49ec568c6d6-serving-cert\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:22:50.223399 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223158 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8scd\" (UniqueName: \"kubernetes.io/projected/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-kube-api-access-j8scd\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:50.223399 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzrvn\" (UniqueName: \"kubernetes.io/projected/2ee4fcb8-c34f-4bf2-9d97-11586231a008-kube-api-access-bzrvn\") pod \"cluster-samples-operator-6dc5bdb6b4-kt4g4\" (UID: \"2ee4fcb8-c34f-4bf2-9d97-11586231a008\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"
Apr 22 18:22:50.223399 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-default-certificate\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.223399 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5znm\" (UniqueName: \"kubernetes.io/projected/7852086c-5476-41cd-9f89-8347b77c52a6-kube-api-access-z5znm\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.223399 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c400b749-c41a-4dc5-908a-d49ec568c6d6-config\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:22:50.223399 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223342 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.223399 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.223399 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c400b749-c41a-4dc5-908a-d49ec568c6d6-trusted-ca\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:22:50.223883 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223486 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-stats-auth\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.223883 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:50.223883 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kt4g4\" (UID: \"2ee4fcb8-c34f-4bf2-9d97-11586231a008\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"
Apr 22 18:22:50.223883 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:50.223883 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.223632 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:22:50.223883 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.223640 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:22:50.223883 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.223668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad8365e0-e003-4937-9dbd-1989580ac1f4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-9ndl7\" (UID: \"ad8365e0-e003-4937-9dbd-1989580ac1f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"
Apr 22 18:22:50.223883 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.223691 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls podName:2ee4fcb8-c34f-4bf2-9d97-11586231a008 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:50.723673844 +0000 UTC m=+134.433019481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kt4g4" (UID: "2ee4fcb8-c34f-4bf2-9d97-11586231a008") : secret "samples-operator-tls" not found
Apr 22 18:22:50.223883 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.223712 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls podName:3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:50.723698457 +0000 UTC m=+134.433044094 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lwbgg" (UID: "3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:22:50.224278 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.224238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c400b749-c41a-4dc5-908a-d49ec568c6d6-trusted-ca\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:22:50.224356 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.224334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:50.224639 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.224617 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c400b749-c41a-4dc5-908a-d49ec568c6d6-config\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:22:50.225526 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.225502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad8365e0-e003-4937-9dbd-1989580ac1f4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-9ndl7\" (UID: \"ad8365e0-e003-4937-9dbd-1989580ac1f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"
Apr 22 18:22:50.225890 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.225874 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c400b749-c41a-4dc5-908a-d49ec568c6d6-serving-cert\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:22:50.234116 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.234090 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzrvn\" (UniqueName: \"kubernetes.io/projected/2ee4fcb8-c34f-4bf2-9d97-11586231a008-kube-api-access-bzrvn\") pod \"cluster-samples-operator-6dc5bdb6b4-kt4g4\" (UID: \"2ee4fcb8-c34f-4bf2-9d97-11586231a008\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"
Apr 22 18:22:50.234852 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.234825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8scd\" (UniqueName: \"kubernetes.io/projected/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-kube-api-access-j8scd\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:50.235436 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.235408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sx84\" (UniqueName: \"kubernetes.io/projected/ad8365e0-e003-4937-9dbd-1989580ac1f4-kube-api-access-5sx84\") pod \"kube-storage-version-migrator-operator-6769c5d45-9ndl7\" (UID: \"ad8365e0-e003-4937-9dbd-1989580ac1f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"
Apr 22 18:22:50.236059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.236043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptljk\" (UniqueName: \"kubernetes.io/projected/c400b749-c41a-4dc5-908a-d49ec568c6d6-kube-api-access-ptljk\") pod \"console-operator-9d4b6777b-fx8gr\" (UID: \"c400b749-c41a-4dc5-908a-d49ec568c6d6\") " pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:22:50.282828 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.282798 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"
Apr 22 18:22:50.289481 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.289460 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:22:50.324182 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.324132 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-default-certificate\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.324385 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.324190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5znm\" (UniqueName: \"kubernetes.io/projected/7852086c-5476-41cd-9f89-8347b77c52a6-kube-api-access-z5znm\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.324385 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.324236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.324385 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.324292 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.324385 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.324328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-stats-auth\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.324595 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.324406 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:22:50.324595 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.324411 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle podName:7852086c-5476-41cd-9f89-8347b77c52a6 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:50.824388352 +0000 UTC m=+134.533734005 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle") pod "router-default-67569b586-r5chq" (UID: "7852086c-5476-41cd-9f89-8347b77c52a6") : configmap references non-existent config key: service-ca.crt
Apr 22 18:22:50.324595 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.324470 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs podName:7852086c-5476-41cd-9f89-8347b77c52a6 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:50.824453597 +0000 UTC m=+134.533799238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs") pod "router-default-67569b586-r5chq" (UID: "7852086c-5476-41cd-9f89-8347b77c52a6") : secret "router-metrics-certs-default" not found
Apr 22 18:22:50.328048 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.327626 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-default-certificate\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.329745 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.329694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-stats-auth\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.335972 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.335786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5znm\" (UniqueName: \"kubernetes.io/projected/7852086c-5476-41cd-9f89-8347b77c52a6-kube-api-access-z5znm\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.410496 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.410463 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7"]
Apr 22 18:22:50.415688 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:22:50.415661 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad8365e0_e003_4937_9dbd_1989580ac1f4.slice/crio-10277b35f50c29d2af1c1739920d5e8fb813c4f1c5210e8921bf2d913fef44c2 WatchSource:0}: Error finding container 10277b35f50c29d2af1c1739920d5e8fb813c4f1c5210e8921bf2d913fef44c2: Status 404 returned error can't find the container with id 10277b35f50c29d2af1c1739920d5e8fb813c4f1c5210e8921bf2d913fef44c2
Apr 22 18:22:50.425280 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.425255 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fx8gr"]
Apr 22 18:22:50.427622 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:22:50.427603 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc400b749_c41a_4dc5_908a_d49ec568c6d6.slice/crio-06b99b0c251d049e06b1735b8c21df04de85aea0fb77e38e8c0e980eb124cd96 WatchSource:0}: Error finding container 06b99b0c251d049e06b1735b8c21df04de85aea0fb77e38e8c0e980eb124cd96: Status 404 returned error can't find the container with id 06b99b0c251d049e06b1735b8c21df04de85aea0fb77e38e8c0e980eb124cd96
Apr 22 18:22:50.728462 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.728421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:50.728462 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.728463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kt4g4\" (UID: \"2ee4fcb8-c34f-4bf2-9d97-11586231a008\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"
Apr 22 18:22:50.728672 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.728581 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:22:50.728672 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.728646 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:22:50.728739 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.728649 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls podName:3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:51.728633031 +0000 UTC m=+135.437978662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lwbgg" (UID: "3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:22:50.728739 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.728706 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls podName:2ee4fcb8-c34f-4bf2-9d97-11586231a008 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:51.728692978 +0000 UTC m=+135.438038613 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kt4g4" (UID: "2ee4fcb8-c34f-4bf2-9d97-11586231a008") : secret "samples-operator-tls" not found
Apr 22 18:22:50.829902 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.829861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.830059 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:50.829909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:50.830059 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.830044 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle podName:7852086c-5476-41cd-9f89-8347b77c52a6 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:51.830022916 +0000 UTC m=+135.539368571 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle") pod "router-default-67569b586-r5chq" (UID: "7852086c-5476-41cd-9f89-8347b77c52a6") : configmap references non-existent config key: service-ca.crt
Apr 22 18:22:50.830143 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.830091 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:22:50.830179 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:50.830147 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs podName:7852086c-5476-41cd-9f89-8347b77c52a6 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:51.830133464 +0000 UTC m=+135.539479097 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs") pod "router-default-67569b586-r5chq" (UID: "7852086c-5476-41cd-9f89-8347b77c52a6") : secret "router-metrics-certs-default" not found
Apr 22 18:22:51.282112 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:51.282070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr" event={"ID":"c400b749-c41a-4dc5-908a-d49ec568c6d6","Type":"ContainerStarted","Data":"06b99b0c251d049e06b1735b8c21df04de85aea0fb77e38e8c0e980eb124cd96"}
Apr 22 18:22:51.283439 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:51.283379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7" event={"ID":"ad8365e0-e003-4937-9dbd-1989580ac1f4","Type":"ContainerStarted","Data":"10277b35f50c29d2af1c1739920d5e8fb813c4f1c5210e8921bf2d913fef44c2"}
Apr 22 18:22:51.739365 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:51.739321 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:51.739365 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:51.739374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kt4g4\" (UID: \"2ee4fcb8-c34f-4bf2-9d97-11586231a008\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"
Apr 22 18:22:51.739742 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:51.739500 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:22:51.739742 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:51.739534 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:22:51.739742 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:51.739585 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls podName:2ee4fcb8-c34f-4bf2-9d97-11586231a008 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:53.739565476 +0000 UTC m=+137.448911106 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kt4g4" (UID: "2ee4fcb8-c34f-4bf2-9d97-11586231a008") : secret "samples-operator-tls" not found
Apr 22 18:22:51.739742 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:51.739610 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls podName:3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:53.739599707 +0000 UTC m=+137.448945342 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lwbgg" (UID: "3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:22:51.840297 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:51.840257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:51.840297 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:51.840304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:51.840579 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:51.840440 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:22:51.840579 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:51.840460 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle podName:7852086c-5476-41cd-9f89-8347b77c52a6 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:53.840436621 +0000 UTC m=+137.549782259 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle") pod "router-default-67569b586-r5chq" (UID: "7852086c-5476-41cd-9f89-8347b77c52a6") : configmap references non-existent config key: service-ca.crt
Apr 22 18:22:51.840579 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:51.840489 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs podName:7852086c-5476-41cd-9f89-8347b77c52a6 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:53.84048071 +0000 UTC m=+137.549826344 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs") pod "router-default-67569b586-r5chq" (UID: "7852086c-5476-41cd-9f89-8347b77c52a6") : secret "router-metrics-certs-default" not found
Apr 22 18:22:53.288795 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:53.288746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7" event={"ID":"ad8365e0-e003-4937-9dbd-1989580ac1f4","Type":"ContainerStarted","Data":"f6c435a7c2e7c7b64e45cf7e1ddfddcc6911498ea9b4ee04a34294f9132a765a"}
Apr 22 18:22:53.290204 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:53.290183 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx8gr_c400b749-c41a-4dc5-908a-d49ec568c6d6/console-operator/0.log"
Apr 22 18:22:53.290335 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:53.290219 2575 generic.go:358] "Generic (PLEG): container finished" podID="c400b749-c41a-4dc5-908a-d49ec568c6d6" containerID="9d8b5bd03c157b979d57073ccd9766903458fc4c89dbac7f507bb3733f79926c" exitCode=255
Apr 22 18:22:53.290335 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:53.290269 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr" event={"ID":"c400b749-c41a-4dc5-908a-d49ec568c6d6","Type":"ContainerDied","Data":"9d8b5bd03c157b979d57073ccd9766903458fc4c89dbac7f507bb3733f79926c"}
Apr 22 18:22:53.290490 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:53.290476 2575 scope.go:117] "RemoveContainer" containerID="9d8b5bd03c157b979d57073ccd9766903458fc4c89dbac7f507bb3733f79926c"
Apr 22 18:22:53.315558 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:53.315510 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7" podStartSLOduration=2.204719194 podStartE2EDuration="4.315496535s" podCreationTimestamp="2026-04-22 18:22:49 +0000 UTC" firstStartedPulling="2026-04-22 18:22:50.417431426 +0000 UTC m=+134.126777061" lastFinishedPulling="2026-04-22 18:22:52.528208769 +0000 UTC m=+136.237554402" observedRunningTime="2026-04-22 18:22:53.315399861 +0000 UTC m=+137.024745526" watchObservedRunningTime="2026-04-22 18:22:53.315496535 +0000 UTC m=+137.024842189"
Apr 22 18:22:53.760506 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:53.760467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:53.760506 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:53.760510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kt4g4\" (UID: \"2ee4fcb8-c34f-4bf2-9d97-11586231a008\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"
Apr 22 18:22:53.760736 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:53.760616 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:22:53.760736 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:53.760683 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls podName:3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:57.760667393 +0000 UTC m=+141.470013029 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lwbgg" (UID: "3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:22:53.760736 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:53.760623 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:22:53.760845 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:53.760765 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls podName:2ee4fcb8-c34f-4bf2-9d97-11586231a008 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:57.760747561 +0000 UTC m=+141.470093198 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kt4g4" (UID: "2ee4fcb8-c34f-4bf2-9d97-11586231a008") : secret "samples-operator-tls" not found Apr 22 18:22:53.861836 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:53.861789 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq" Apr 22 18:22:53.861836 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:53.861833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq" Apr 22 18:22:53.862048 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:53.861935 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:22:53.862048 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:53.861972 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle podName:7852086c-5476-41cd-9f89-8347b77c52a6 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:57.861954677 +0000 UTC m=+141.571300308 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle") pod "router-default-67569b586-r5chq" (UID: "7852086c-5476-41cd-9f89-8347b77c52a6") : configmap references non-existent config key: service-ca.crt Apr 22 18:22:53.862048 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:53.861998 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs podName:7852086c-5476-41cd-9f89-8347b77c52a6 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:57.861989713 +0000 UTC m=+141.571335344 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs") pod "router-default-67569b586-r5chq" (UID: "7852086c-5476-41cd-9f89-8347b77c52a6") : secret "router-metrics-certs-default" not found Apr 22 18:22:54.293640 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:54.293611 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx8gr_c400b749-c41a-4dc5-908a-d49ec568c6d6/console-operator/1.log" Apr 22 18:22:54.294086 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:54.294013 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx8gr_c400b749-c41a-4dc5-908a-d49ec568c6d6/console-operator/0.log" Apr 22 18:22:54.294086 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:54.294047 2575 generic.go:358] "Generic (PLEG): container finished" podID="c400b749-c41a-4dc5-908a-d49ec568c6d6" containerID="1d5c10b1c6bf1ea0cf12b7b25f2aba354f6ff3ca3d559fc5515f79388e777d0e" exitCode=255 Apr 22 18:22:54.294169 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:54.294084 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr" 
event={"ID":"c400b749-c41a-4dc5-908a-d49ec568c6d6","Type":"ContainerDied","Data":"1d5c10b1c6bf1ea0cf12b7b25f2aba354f6ff3ca3d559fc5515f79388e777d0e"} Apr 22 18:22:54.294169 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:54.294129 2575 scope.go:117] "RemoveContainer" containerID="9d8b5bd03c157b979d57073ccd9766903458fc4c89dbac7f507bb3733f79926c" Apr 22 18:22:54.294435 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:54.294411 2575 scope.go:117] "RemoveContainer" containerID="1d5c10b1c6bf1ea0cf12b7b25f2aba354f6ff3ca3d559fc5515f79388e777d0e" Apr 22 18:22:54.294645 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:54.294619 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fx8gr_openshift-console-operator(c400b749-c41a-4dc5-908a-d49ec568c6d6)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr" podUID="c400b749-c41a-4dc5-908a-d49ec568c6d6" Apr 22 18:22:55.297688 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:55.297660 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx8gr_c400b749-c41a-4dc5-908a-d49ec568c6d6/console-operator/1.log" Apr 22 18:22:55.298068 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:55.297989 2575 scope.go:117] "RemoveContainer" containerID="1d5c10b1c6bf1ea0cf12b7b25f2aba354f6ff3ca3d559fc5515f79388e777d0e" Apr 22 18:22:55.298168 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:55.298150 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fx8gr_openshift-console-operator(c400b749-c41a-4dc5-908a-d49ec568c6d6)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr" podUID="c400b749-c41a-4dc5-908a-d49ec568c6d6" 
Apr 22 18:22:56.426887 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.426856 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-q425x"]
Apr 22 18:22:56.428751 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.428735 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-q425x"
Apr 22 18:22:56.431492 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.431470 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-p6x94\""
Apr 22 18:22:56.431907 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.431889 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 22 18:22:56.433011 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.432987 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 22 18:22:56.433142 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.433126 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 22 18:22:56.433257 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.433227 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 22 18:22:56.439805 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.439783 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-q425x"]
Apr 22 18:22:56.585117 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.585079 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/36c89fa6-510b-4d64-accf-ae9f90020ee3-signing-cabundle\") pod \"service-ca-865cb79987-q425x\" (UID: \"36c89fa6-510b-4d64-accf-ae9f90020ee3\") " pod="openshift-service-ca/service-ca-865cb79987-q425x"
Apr 22 18:22:56.585117 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.585122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/36c89fa6-510b-4d64-accf-ae9f90020ee3-signing-key\") pod \"service-ca-865cb79987-q425x\" (UID: \"36c89fa6-510b-4d64-accf-ae9f90020ee3\") " pod="openshift-service-ca/service-ca-865cb79987-q425x"
Apr 22 18:22:56.585364 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.585180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4swz4\" (UniqueName: \"kubernetes.io/projected/36c89fa6-510b-4d64-accf-ae9f90020ee3-kube-api-access-4swz4\") pod \"service-ca-865cb79987-q425x\" (UID: \"36c89fa6-510b-4d64-accf-ae9f90020ee3\") " pod="openshift-service-ca/service-ca-865cb79987-q425x"
Apr 22 18:22:56.686309 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.686214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/36c89fa6-510b-4d64-accf-ae9f90020ee3-signing-cabundle\") pod \"service-ca-865cb79987-q425x\" (UID: \"36c89fa6-510b-4d64-accf-ae9f90020ee3\") " pod="openshift-service-ca/service-ca-865cb79987-q425x"
Apr 22 18:22:56.686398 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.686378 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/36c89fa6-510b-4d64-accf-ae9f90020ee3-signing-key\") pod \"service-ca-865cb79987-q425x\" (UID: \"36c89fa6-510b-4d64-accf-ae9f90020ee3\") " pod="openshift-service-ca/service-ca-865cb79987-q425x"
Apr 22 18:22:56.686444 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.686431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4swz4\" (UniqueName: \"kubernetes.io/projected/36c89fa6-510b-4d64-accf-ae9f90020ee3-kube-api-access-4swz4\") pod \"service-ca-865cb79987-q425x\" (UID: \"36c89fa6-510b-4d64-accf-ae9f90020ee3\") " pod="openshift-service-ca/service-ca-865cb79987-q425x"
Apr 22 18:22:56.686853 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.686826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/36c89fa6-510b-4d64-accf-ae9f90020ee3-signing-cabundle\") pod \"service-ca-865cb79987-q425x\" (UID: \"36c89fa6-510b-4d64-accf-ae9f90020ee3\") " pod="openshift-service-ca/service-ca-865cb79987-q425x"
Apr 22 18:22:56.688736 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.688718 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/36c89fa6-510b-4d64-accf-ae9f90020ee3-signing-key\") pod \"service-ca-865cb79987-q425x\" (UID: \"36c89fa6-510b-4d64-accf-ae9f90020ee3\") " pod="openshift-service-ca/service-ca-865cb79987-q425x"
Apr 22 18:22:56.695370 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.695345 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4swz4\" (UniqueName: \"kubernetes.io/projected/36c89fa6-510b-4d64-accf-ae9f90020ee3-kube-api-access-4swz4\") pod \"service-ca-865cb79987-q425x\" (UID: \"36c89fa6-510b-4d64-accf-ae9f90020ee3\") " pod="openshift-service-ca/service-ca-865cb79987-q425x"
Apr 22 18:22:56.737554 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.737517 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-q425x"
Apr 22 18:22:56.858484 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:56.858452 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-q425x"]
Apr 22 18:22:56.861808 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:22:56.861776 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c89fa6_510b_4d64_accf_ae9f90020ee3.slice/crio-8dd309f106c60d6ddc79e0d45c44baea5954188db1fe85f74aaf07cc3af36b9e WatchSource:0}: Error finding container 8dd309f106c60d6ddc79e0d45c44baea5954188db1fe85f74aaf07cc3af36b9e: Status 404 returned error can't find the container with id 8dd309f106c60d6ddc79e0d45c44baea5954188db1fe85f74aaf07cc3af36b9e
Apr 22 18:22:57.302759 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:57.302727 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-q425x" event={"ID":"36c89fa6-510b-4d64-accf-ae9f90020ee3","Type":"ContainerStarted","Data":"8dd309f106c60d6ddc79e0d45c44baea5954188db1fe85f74aaf07cc3af36b9e"}
Apr 22 18:22:57.333187 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:57.333165 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xxh8b_fb638d9e-ea2e-4a2e-979e-308022903fd1/dns-node-resolver/0.log"
Apr 22 18:22:57.797643 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:57.797594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:22:57.797643 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:57.797650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kt4g4\" (UID: \"2ee4fcb8-c34f-4bf2-9d97-11586231a008\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"
Apr 22 18:22:57.798102 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:57.797767 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:22:57.798102 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:57.797834 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:22:57.798102 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:57.797848 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls podName:3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:05.79782446 +0000 UTC m=+149.507170094 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lwbgg" (UID: "3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:22:57.798102 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:57.797915 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls podName:2ee4fcb8-c34f-4bf2-9d97-11586231a008 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:05.79789557 +0000 UTC m=+149.507241213 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kt4g4" (UID: "2ee4fcb8-c34f-4bf2-9d97-11586231a008") : secret "samples-operator-tls" not found
Apr 22 18:22:57.898671 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:57.898631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:57.898861 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:57.898787 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:22:57.898861 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:57.898806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:22:57.898861 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:57.898850 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs podName:7852086c-5476-41cd-9f89-8347b77c52a6 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:05.898835456 +0000 UTC m=+149.608181103 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs") pod "router-default-67569b586-r5chq" (UID: "7852086c-5476-41cd-9f89-8347b77c52a6") : secret "router-metrics-certs-default" not found
Apr 22 18:22:57.899008 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:22:57.898923 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle podName:7852086c-5476-41cd-9f89-8347b77c52a6 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:05.898907203 +0000 UTC m=+149.608252841 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle") pod "router-default-67569b586-r5chq" (UID: "7852086c-5476-41cd-9f89-8347b77c52a6") : configmap references non-existent config key: service-ca.crt
Apr 22 18:22:57.936217 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:57.936184 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4nkxw_073609fd-8186-41f7-860d-4fd136656e3f/node-ca/0.log"
Apr 22 18:22:59.312405 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:59.312369 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-q425x" event={"ID":"36c89fa6-510b-4d64-accf-ae9f90020ee3","Type":"ContainerStarted","Data":"4995bb56010f4662178321554618626eeb88e52efc9c0609603402c187549bea"}
Apr 22 18:22:59.335652 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:59.335585 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-q425x" podStartSLOduration=1.65723492 podStartE2EDuration="3.335568655s" podCreationTimestamp="2026-04-22 18:22:56 +0000 UTC" firstStartedPulling="2026-04-22 18:22:56.863589383 +0000 UTC m=+140.572935014" lastFinishedPulling="2026-04-22 18:22:58.541923118 +0000 UTC m=+142.251268749" observedRunningTime="2026-04-22 18:22:59.334390821 +0000 UTC m=+143.043736474" watchObservedRunningTime="2026-04-22 18:22:59.335568655 +0000 UTC m=+143.044914359"
Apr 22 18:22:59.737683 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:22:59.737645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9ndl7_ad8365e0-e003-4937-9dbd-1989580ac1f4/kube-storage-version-migrator-operator/0.log"
Apr 22 18:23:00.290404 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:00.290352 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:23:00.290404 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:00.290405 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:23:00.290908 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:00.290892 2575 scope.go:117] "RemoveContainer" containerID="1d5c10b1c6bf1ea0cf12b7b25f2aba354f6ff3ca3d559fc5515f79388e777d0e"
Apr 22 18:23:00.291154 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:23:00.291132 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fx8gr_openshift-console-operator(c400b749-c41a-4dc5-908a-d49ec568c6d6)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr" podUID="c400b749-c41a-4dc5-908a-d49ec568c6d6"
Apr 22 18:23:05.867754 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:05.867714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"
Apr 22 18:23:05.867754 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:05.867754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kt4g4\" (UID: \"2ee4fcb8-c34f-4bf2-9d97-11586231a008\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"
Apr 22 18:23:05.868269 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:23:05.867886 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:23:05.868269 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:23:05.867980 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls podName:3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:21.867957463 +0000 UTC m=+165.577303096 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lwbgg" (UID: "3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:23:05.870043 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:05.870013 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ee4fcb8-c34f-4bf2-9d97-11586231a008-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kt4g4\" (UID: \"2ee4fcb8-c34f-4bf2-9d97-11586231a008\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"
Apr 22 18:23:05.899041 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:05.899017 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"
Apr 22 18:23:05.970827 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:05.969711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:23:05.970827 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:05.969766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:23:05.970827 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:05.970761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7852086c-5476-41cd-9f89-8347b77c52a6-service-ca-bundle\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:23:05.976439 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:05.976272 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7852086c-5476-41cd-9f89-8347b77c52a6-metrics-certs\") pod \"router-default-67569b586-r5chq\" (UID: \"7852086c-5476-41cd-9f89-8347b77c52a6\") " pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:23:05.987410 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:05.987377 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:23:06.040235 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:06.039802 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4"]
Apr 22 18:23:06.115832 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:06.115806 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-67569b586-r5chq"]
Apr 22 18:23:06.118813 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:06.118757 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7852086c_5476_41cd_9f89_8347b77c52a6.slice/crio-23a2f25217d4a95bcbf99b11f7caa44f5a9284ce5f03c80344eed59c8f44da9e WatchSource:0}: Error finding container 23a2f25217d4a95bcbf99b11f7caa44f5a9284ce5f03c80344eed59c8f44da9e: Status 404 returned error can't find the container with id 23a2f25217d4a95bcbf99b11f7caa44f5a9284ce5f03c80344eed59c8f44da9e
Apr 22 18:23:06.334228 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:06.334191 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4" event={"ID":"2ee4fcb8-c34f-4bf2-9d97-11586231a008","Type":"ContainerStarted","Data":"b9b392e94b8d07be05c37ea6eb8228e9b05fb85db600e10feaa716a21058728f"}
Apr 22 18:23:06.335497 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:06.335471 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-67569b586-r5chq" event={"ID":"7852086c-5476-41cd-9f89-8347b77c52a6","Type":"ContainerStarted","Data":"08c9915fece04ab9d067365aef70e0a9faacbcd8a1196126768077dc44453017"}
Apr 22 18:23:06.335611 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:06.335501 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-67569b586-r5chq" event={"ID":"7852086c-5476-41cd-9f89-8347b77c52a6","Type":"ContainerStarted","Data":"23a2f25217d4a95bcbf99b11f7caa44f5a9284ce5f03c80344eed59c8f44da9e"}
Apr 22 18:23:06.359842 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:06.359789 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-67569b586-r5chq" podStartSLOduration=16.359775102 podStartE2EDuration="16.359775102s" podCreationTimestamp="2026-04-22 18:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:23:06.359110033 +0000 UTC m=+150.068455684" watchObservedRunningTime="2026-04-22 18:23:06.359775102 +0000 UTC m=+150.069120756"
Apr 22 18:23:06.988303 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:06.988272 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:23:06.991183 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:06.991159 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:23:07.339752 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:07.339672 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:23:07.341009 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:07.340980 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-67569b586-r5chq"
Apr 22 18:23:08.343686 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:08.343648 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4" event={"ID":"2ee4fcb8-c34f-4bf2-9d97-11586231a008","Type":"ContainerStarted","Data":"19ce605698fb4f27afe981d8ca53b21f25fc1b947d795847c5b3656d532a20be"}
Apr 22 18:23:08.343686 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:08.343686 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4" event={"ID":"2ee4fcb8-c34f-4bf2-9d97-11586231a008","Type":"ContainerStarted","Data":"85881e596a24f4a4271e4cd619602b6b209b018ada16ec3c96951273ff9005c7"}
Apr 22 18:23:08.367497 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:08.367450 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kt4g4" podStartSLOduration=17.913234346 podStartE2EDuration="19.367436594s" podCreationTimestamp="2026-04-22 18:22:49 +0000 UTC" firstStartedPulling="2026-04-22 18:23:06.085432843 +0000 UTC m=+149.794778474" lastFinishedPulling="2026-04-22 18:23:07.539635088 +0000 UTC m=+151.248980722" observedRunningTime="2026-04-22 18:23:08.366917709 +0000 UTC m=+152.076263363" watchObservedRunningTime="2026-04-22 18:23:08.367436594 +0000 UTC m=+152.076782292"
Apr 22 18:23:12.647052 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:23:12.647006 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-646b958546-7n687" podUID="1add8823-8c00-46ae-a8af-828b95cc217f"
Apr 22 18:23:12.673026 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:23:12.672987 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6wjzz" podUID="047091e6-d56c-4d8b-8391-f6285a93c154"
Apr 22 18:23:12.699140 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:23:12.699111 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4kzvf" podUID="fa8d3113-51fb-4375-ab5a-40c379dabdaa"
Apr 22 18:23:12.906937 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:12.906858 2575 scope.go:117] "RemoveContainer" containerID="1d5c10b1c6bf1ea0cf12b7b25f2aba354f6ff3ca3d559fc5515f79388e777d0e"
Apr 22 18:23:13.358151 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:13.358131 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx8gr_c400b749-c41a-4dc5-908a-d49ec568c6d6/console-operator/1.log"
Apr 22 18:23:13.358290 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:13.358264 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr" event={"ID":"c400b749-c41a-4dc5-908a-d49ec568c6d6","Type":"ContainerStarted","Data":"dd8a120646ac3a8905013a4865f0d3fbce036671742d1a97318cc1369a2b06c3"}
Apr 22 18:23:13.358359 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:13.358313 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6wjzz"
Apr 22 18:23:13.358359 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:13.358279 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4kzvf"
Apr 22 18:23:13.358655 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:13.358636 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:23:13.379733 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:13.379695 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr" podStartSLOduration=22.282385674 podStartE2EDuration="24.379684647s" podCreationTimestamp="2026-04-22 18:22:49 +0000 UTC" firstStartedPulling="2026-04-22 18:22:50.429262238 +0000 UTC m=+134.138607868" lastFinishedPulling="2026-04-22 18:22:52.526561207 +0000 UTC m=+136.235906841" observedRunningTime="2026-04-22 18:23:13.378451332 +0000 UTC m=+157.087796985" watchObservedRunningTime="2026-04-22 18:23:13.379684647 +0000 UTC m=+157.089030300"
Apr 22 18:23:13.429119 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:13.429098 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-fx8gr"
Apr 22 18:23:13.916460 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:23:13.916414 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-dhwbm" podUID="1f085cfa-07bb-457b-85ce-79f190f3ecb1"
Apr 22 18:23:17.562887 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:17.562850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName:
\"kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz" Apr 22 18:23:17.563453 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:17.562919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:23:17.565431 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:17.565400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod \"image-registry-646b958546-7n687\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:23:17.565557 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:17.565477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/047091e6-d56c-4d8b-8391-f6285a93c154-metrics-tls\") pod \"dns-default-6wjzz\" (UID: \"047091e6-d56c-4d8b-8391-f6285a93c154\") " pod="openshift-dns/dns-default-6wjzz" Apr 22 18:23:17.664213 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:17.664180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf" Apr 22 18:23:17.666318 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:17.666298 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/fa8d3113-51fb-4375-ab5a-40c379dabdaa-cert\") pod \"ingress-canary-4kzvf\" (UID: \"fa8d3113-51fb-4375-ab5a-40c379dabdaa\") " pod="openshift-ingress-canary/ingress-canary-4kzvf" Apr 22 18:23:17.863347 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:17.863289 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pkrg\"" Apr 22 18:23:17.863437 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:17.863386 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9d4f9\"" Apr 22 18:23:17.868697 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:17.868681 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6wjzz" Apr 22 18:23:17.868768 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:17.868755 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4kzvf" Apr 22 18:23:18.000639 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.000609 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6wjzz"] Apr 22 18:23:18.003884 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:18.003856 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod047091e6_d56c_4d8b_8391_f6285a93c154.slice/crio-9a28634fc664d55bff97024f64d49bb083c7b3f134f80fadfc5ed5392df8653b WatchSource:0}: Error finding container 9a28634fc664d55bff97024f64d49bb083c7b3f134f80fadfc5ed5392df8653b: Status 404 returned error can't find the container with id 9a28634fc664d55bff97024f64d49bb083c7b3f134f80fadfc5ed5392df8653b Apr 22 18:23:18.029445 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.029421 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4kzvf"] Apr 22 18:23:18.033479 ip-10-0-140-74 
kubenswrapper[2575]: W0422 18:23:18.033458 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa8d3113_51fb_4375_ab5a_40c379dabdaa.slice/crio-76adb8334d06deb891aab7f83ced4516df2b53ca55650a1f84b434e249fd687c WatchSource:0}: Error finding container 76adb8334d06deb891aab7f83ced4516df2b53ca55650a1f84b434e249fd687c: Status 404 returned error can't find the container with id 76adb8334d06deb891aab7f83ced4516df2b53ca55650a1f84b434e249fd687c Apr 22 18:23:18.100533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.100507 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-56zln"] Apr 22 18:23:18.103174 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.103156 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-56zln" Apr 22 18:23:18.106039 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.105997 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 18:23:18.106123 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.106070 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-4q4kl\"" Apr 22 18:23:18.106342 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.106326 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 18:23:18.111973 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.111954 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-56zln"] Apr 22 18:23:18.168339 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.168280 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9be38d04-fbc0-4977-b082-c0568fd4d108-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-56zln\" (UID: \"9be38d04-fbc0-4977-b082-c0568fd4d108\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-56zln" Apr 22 18:23:18.168339 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.168320 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9be38d04-fbc0-4977-b082-c0568fd4d108-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-56zln\" (UID: \"9be38d04-fbc0-4977-b082-c0568fd4d108\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-56zln" Apr 22 18:23:18.188284 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.188263 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-zgzrx"] Apr 22 18:23:18.190601 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.190585 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.193795 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.193777 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:23:18.193898 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.193777 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:23:18.193898 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.193777 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:23:18.193898 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.193826 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4b6kd\"" Apr 22 18:23:18.193898 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.193882 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:23:18.208954 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.208935 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zgzrx"] Apr 22 18:23:18.268873 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.268845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/290f97ac-a696-4ad0-a131-8cee16862b82-data-volume\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.268984 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.268878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" 
(UniqueName: \"kubernetes.io/configmap/290f97ac-a696-4ad0-a131-8cee16862b82-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.268984 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.268914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9be38d04-fbc0-4977-b082-c0568fd4d108-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-56zln\" (UID: \"9be38d04-fbc0-4977-b082-c0568fd4d108\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-56zln" Apr 22 18:23:18.269046 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.268975 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9be38d04-fbc0-4977-b082-c0568fd4d108-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-56zln\" (UID: \"9be38d04-fbc0-4977-b082-c0568fd4d108\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-56zln" Apr 22 18:23:18.269088 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.269040 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/290f97ac-a696-4ad0-a131-8cee16862b82-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.269088 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.269081 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8ptw\" (UniqueName: \"kubernetes.io/projected/290f97ac-a696-4ad0-a131-8cee16862b82-kube-api-access-b8ptw\") pod \"insights-runtime-extractor-zgzrx\" (UID: 
\"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.269168 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.269136 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/290f97ac-a696-4ad0-a131-8cee16862b82-crio-socket\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.269638 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.269619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9be38d04-fbc0-4977-b082-c0568fd4d108-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-56zln\" (UID: \"9be38d04-fbc0-4977-b082-c0568fd4d108\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-56zln" Apr 22 18:23:18.271421 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.271402 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9be38d04-fbc0-4977-b082-c0568fd4d108-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-56zln\" (UID: \"9be38d04-fbc0-4977-b082-c0568fd4d108\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-56zln" Apr 22 18:23:18.369665 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.369642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/290f97ac-a696-4ad0-a131-8cee16862b82-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.369758 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.369678 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8ptw\" (UniqueName: \"kubernetes.io/projected/290f97ac-a696-4ad0-a131-8cee16862b82-kube-api-access-b8ptw\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.369828 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.369808 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/290f97ac-a696-4ad0-a131-8cee16862b82-crio-socket\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.369872 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.369854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/290f97ac-a696-4ad0-a131-8cee16862b82-data-volume\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.369917 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.369885 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/290f97ac-a696-4ad0-a131-8cee16862b82-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.369965 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.369942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/290f97ac-a696-4ad0-a131-8cee16862b82-crio-socket\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " 
pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.370188 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.370171 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/290f97ac-a696-4ad0-a131-8cee16862b82-data-volume\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.370987 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.370970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/290f97ac-a696-4ad0-a131-8cee16862b82-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.371828 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.371808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/290f97ac-a696-4ad0-a131-8cee16862b82-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.372471 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.372450 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6wjzz" event={"ID":"047091e6-d56c-4d8b-8391-f6285a93c154","Type":"ContainerStarted","Data":"9a28634fc664d55bff97024f64d49bb083c7b3f134f80fadfc5ed5392df8653b"} Apr 22 18:23:18.373380 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.373362 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4kzvf" 
event={"ID":"fa8d3113-51fb-4375-ab5a-40c379dabdaa","Type":"ContainerStarted","Data":"76adb8334d06deb891aab7f83ced4516df2b53ca55650a1f84b434e249fd687c"} Apr 22 18:23:18.379527 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.379510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8ptw\" (UniqueName: \"kubernetes.io/projected/290f97ac-a696-4ad0-a131-8cee16862b82-kube-api-access-b8ptw\") pod \"insights-runtime-extractor-zgzrx\" (UID: \"290f97ac-a696-4ad0-a131-8cee16862b82\") " pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.411964 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.411942 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-56zln" Apr 22 18:23:18.500343 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.500312 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zgzrx" Apr 22 18:23:18.544268 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.543471 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-56zln"] Apr 22 18:23:18.548986 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:18.548954 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9be38d04_fbc0_4977_b082_c0568fd4d108.slice/crio-4d8a3641b7925959a5fdb47d0595a5fd1af9a8e2964be82aeddbc28a7fe10d6a WatchSource:0}: Error finding container 4d8a3641b7925959a5fdb47d0595a5fd1af9a8e2964be82aeddbc28a7fe10d6a: Status 404 returned error can't find the container with id 4d8a3641b7925959a5fdb47d0595a5fd1af9a8e2964be82aeddbc28a7fe10d6a Apr 22 18:23:18.657291 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:18.657207 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zgzrx"] Apr 22 
18:23:19.376989 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:19.376956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-56zln" event={"ID":"9be38d04-fbc0-4977-b082-c0568fd4d108","Type":"ContainerStarted","Data":"4d8a3641b7925959a5fdb47d0595a5fd1af9a8e2964be82aeddbc28a7fe10d6a"} Apr 22 18:23:19.378429 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:19.378400 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zgzrx" event={"ID":"290f97ac-a696-4ad0-a131-8cee16862b82","Type":"ContainerStarted","Data":"fbfbd020656dd8c6e5d8ac83cecd7c82efba85f19276b062f0309f74efe8ce11"} Apr 22 18:23:19.378563 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:19.378431 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zgzrx" event={"ID":"290f97ac-a696-4ad0-a131-8cee16862b82","Type":"ContainerStarted","Data":"2464e73c17e8101536c684c43583f3971b9e9f1304afdb010ea9e10ef5473536"} Apr 22 18:23:20.382379 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:20.382329 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-56zln" event={"ID":"9be38d04-fbc0-4977-b082-c0568fd4d108","Type":"ContainerStarted","Data":"86ca9e7d4e86af3b1bdc23fbfa581b613b93df229f66753cf45eaaa522b15176"} Apr 22 18:23:20.383941 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:20.383904 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6wjzz" event={"ID":"047091e6-d56c-4d8b-8391-f6285a93c154","Type":"ContainerStarted","Data":"a25a47364092b60bd1a5d3e9fe583fac2c45fae52285ef808fb7789288eaf134"} Apr 22 18:23:20.385573 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:20.385525 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4kzvf" 
event={"ID":"fa8d3113-51fb-4375-ab5a-40c379dabdaa","Type":"ContainerStarted","Data":"b2b19788275e40dfc0e306de3e9ef80ddc443b058d4ed9b544e669df16b27141"} Apr 22 18:23:20.389461 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:20.389439 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zgzrx" event={"ID":"290f97ac-a696-4ad0-a131-8cee16862b82","Type":"ContainerStarted","Data":"73f7663876e4573edb1fe4275a1b2e4868c0cea79f178c935e4923f9002b83bb"} Apr 22 18:23:20.402890 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:20.402627 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-56zln" podStartSLOduration=0.731339239 podStartE2EDuration="2.402611899s" podCreationTimestamp="2026-04-22 18:23:18 +0000 UTC" firstStartedPulling="2026-04-22 18:23:18.551535195 +0000 UTC m=+162.260880830" lastFinishedPulling="2026-04-22 18:23:20.222807845 +0000 UTC m=+163.932153490" observedRunningTime="2026-04-22 18:23:20.40176173 +0000 UTC m=+164.111107396" watchObservedRunningTime="2026-04-22 18:23:20.402611899 +0000 UTC m=+164.111957554" Apr 22 18:23:21.393571 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:21.393537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6wjzz" event={"ID":"047091e6-d56c-4d8b-8391-f6285a93c154","Type":"ContainerStarted","Data":"76c9d06b570a3647211602e0e4732363bd9ad561bc095de42362c9c9d4579541"} Apr 22 18:23:21.413680 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:21.413623 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6wjzz" podStartSLOduration=130.196362326 podStartE2EDuration="2m12.413609458s" podCreationTimestamp="2026-04-22 18:21:09 +0000 UTC" firstStartedPulling="2026-04-22 18:23:18.005551972 +0000 UTC m=+161.714897603" lastFinishedPulling="2026-04-22 18:23:20.222799086 +0000 UTC m=+163.932144735" observedRunningTime="2026-04-22 
18:23:21.412207259 +0000 UTC m=+165.121552912" watchObservedRunningTime="2026-04-22 18:23:21.413609458 +0000 UTC m=+165.122955110" Apr 22 18:23:21.414132 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:21.414102 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4kzvf" podStartSLOduration=130.224503348 podStartE2EDuration="2m12.414094386s" podCreationTimestamp="2026-04-22 18:21:09 +0000 UTC" firstStartedPulling="2026-04-22 18:23:18.035186951 +0000 UTC m=+161.744532586" lastFinishedPulling="2026-04-22 18:23:20.224777979 +0000 UTC m=+163.934123624" observedRunningTime="2026-04-22 18:23:20.419460547 +0000 UTC m=+164.128806197" watchObservedRunningTime="2026-04-22 18:23:21.414094386 +0000 UTC m=+165.123440038" Apr 22 18:23:21.896453 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:21.896426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg" Apr 22 18:23:21.898695 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:21.898675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lwbgg\" (UID: \"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg" Apr 22 18:23:22.177845 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.177779 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg" Apr 22 18:23:22.295540 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.295508 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg"] Apr 22 18:23:22.298434 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:22.298405 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fd6a1fe_39a4_4ac5_aa49_e7b33a296fc3.slice/crio-ab77aa25648f0c490a912c01a2d384f8ee8ab65f31f6b3f3b7cf98b6147fce0a WatchSource:0}: Error finding container ab77aa25648f0c490a912c01a2d384f8ee8ab65f31f6b3f3b7cf98b6147fce0a: Status 404 returned error can't find the container with id ab77aa25648f0c490a912c01a2d384f8ee8ab65f31f6b3f3b7cf98b6147fce0a Apr 22 18:23:22.398048 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.398017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zgzrx" event={"ID":"290f97ac-a696-4ad0-a131-8cee16862b82","Type":"ContainerStarted","Data":"24a3348cadc976549e8329f78cee88bf40de3bac5ddbd0e2dced08c15106b7ff"} Apr 22 18:23:22.398964 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.398944 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg" event={"ID":"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3","Type":"ContainerStarted","Data":"ab77aa25648f0c490a912c01a2d384f8ee8ab65f31f6b3f3b7cf98b6147fce0a"} Apr 22 18:23:22.399209 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.399190 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6wjzz" Apr 22 18:23:22.417370 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.417334 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-zgzrx" 
podStartSLOduration=1.633184103 podStartE2EDuration="4.417322567s" podCreationTimestamp="2026-04-22 18:23:18 +0000 UTC" firstStartedPulling="2026-04-22 18:23:18.73540869 +0000 UTC m=+162.444754334" lastFinishedPulling="2026-04-22 18:23:21.519547167 +0000 UTC m=+165.228892798" observedRunningTime="2026-04-22 18:23:22.416653343 +0000 UTC m=+166.125998998" watchObservedRunningTime="2026-04-22 18:23:22.417322567 +0000 UTC m=+166.126668220" Apr 22 18:23:22.522705 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.522682 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74cb66595-98tks"] Apr 22 18:23:22.525489 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.525473 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.527876 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.527857 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 18:23:22.528159 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.528096 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 18:23:22.528159 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.528116 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 18:23:22.528159 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.528140 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 18:23:22.528360 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.528179 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:23:22.528360 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.528196 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:23:22.528494 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.528471 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 18:23:22.528494 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.528477 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-9k7fm\"" Apr 22 18:23:22.533816 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.533795 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74cb66595-98tks"] Apr 22 18:23:22.601415 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.601396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6djvp\" (UniqueName: \"kubernetes.io/projected/6d4435e6-06eb-4d24-b614-d45e57fb704f-kube-api-access-6djvp\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.601515 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.601433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-service-ca\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.601515 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.601455 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-oauth-config\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.601515 ip-10-0-140-74 
kubenswrapper[2575]: I0422 18:23:22.601506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-config\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.601629 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.601576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-serving-cert\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.601629 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.601601 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-oauth-serving-cert\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.702812 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.702786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6djvp\" (UniqueName: \"kubernetes.io/projected/6d4435e6-06eb-4d24-b614-d45e57fb704f-kube-api-access-6djvp\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.702902 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.702831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-service-ca\") pod \"console-74cb66595-98tks\" (UID: 
\"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.702902 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.702853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-oauth-config\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.702902 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.702872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-config\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.702902 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.702891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-serving-cert\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.703053 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.702913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-oauth-serving-cert\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.703546 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.703518 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-service-ca\") 
pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.703672 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.703650 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-oauth-serving-cert\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.703749 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.703650 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-config\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.705382 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.705361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-serving-cert\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.705464 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.705387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-oauth-config\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.711155 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.711134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6djvp\" (UniqueName: 
\"kubernetes.io/projected/6d4435e6-06eb-4d24-b614-d45e57fb704f-kube-api-access-6djvp\") pod \"console-74cb66595-98tks\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.835435 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.835370 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:22.973946 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:22.973862 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74cb66595-98tks"] Apr 22 18:23:22.976769 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:22.976728 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d4435e6_06eb_4d24_b614_d45e57fb704f.slice/crio-f2779a33196b055c070212ca4b4b116974709c27e292fdbd6cae4e1a2c351780 WatchSource:0}: Error finding container f2779a33196b055c070212ca4b4b116974709c27e292fdbd6cae4e1a2c351780: Status 404 returned error can't find the container with id f2779a33196b055c070212ca4b4b116974709c27e292fdbd6cae4e1a2c351780 Apr 22 18:23:23.406148 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:23.405906 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74cb66595-98tks" event={"ID":"6d4435e6-06eb-4d24-b614-d45e57fb704f","Type":"ContainerStarted","Data":"f2779a33196b055c070212ca4b4b116974709c27e292fdbd6cae4e1a2c351780"} Apr 22 18:23:24.411166 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:24.411035 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg" event={"ID":"3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3","Type":"ContainerStarted","Data":"4e5c3167f2f0f94996e98320e23bbc43c89ff9cd8fdc4b9a0a630a9c679c1a89"} Apr 22 18:23:24.428556 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:24.428507 2575 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lwbgg" podStartSLOduration=32.904604625 podStartE2EDuration="34.428491084s" podCreationTimestamp="2026-04-22 18:22:50 +0000 UTC" firstStartedPulling="2026-04-22 18:23:22.300229746 +0000 UTC m=+166.009575377" lastFinishedPulling="2026-04-22 18:23:23.824116192 +0000 UTC m=+167.533461836" observedRunningTime="2026-04-22 18:23:24.428042437 +0000 UTC m=+168.137388091" watchObservedRunningTime="2026-04-22 18:23:24.428491084 +0000 UTC m=+168.137836736" Apr 22 18:23:26.418438 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:26.418394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74cb66595-98tks" event={"ID":"6d4435e6-06eb-4d24-b614-d45e57fb704f","Type":"ContainerStarted","Data":"0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b"} Apr 22 18:23:26.444056 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:26.444004 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74cb66595-98tks" podStartSLOduration=1.843393405 podStartE2EDuration="4.44399194s" podCreationTimestamp="2026-04-22 18:23:22 +0000 UTC" firstStartedPulling="2026-04-22 18:23:22.979076857 +0000 UTC m=+166.688422494" lastFinishedPulling="2026-04-22 18:23:25.579675384 +0000 UTC m=+169.289021029" observedRunningTime="2026-04-22 18:23:26.443716421 +0000 UTC m=+170.153062065" watchObservedRunningTime="2026-04-22 18:23:26.44399194 +0000 UTC m=+170.153337592" Apr 22 18:23:26.906748 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:26.906721 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:23:26.909672 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:26.909651 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r4mpc\"" Apr 22 18:23:26.917005 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:26.916986 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:23:27.033652 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.033623 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-646b958546-7n687"] Apr 22 18:23:27.036587 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:27.036559 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1add8823_8c00_46ae_a8af_828b95cc217f.slice/crio-bc227359faf1ec867e8daf21155bdf2409d6177c1080f2ea68aa268346a078f5 WatchSource:0}: Error finding container bc227359faf1ec867e8daf21155bdf2409d6177c1080f2ea68aa268346a078f5: Status 404 returned error can't find the container with id bc227359faf1ec867e8daf21155bdf2409d6177c1080f2ea68aa268346a078f5 Apr 22 18:23:27.135154 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.135133 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66886898c8-6gkqw"] Apr 22 18:23:27.145376 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.145359 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.150949 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.150924 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66886898c8-6gkqw"] Apr 22 18:23:27.153122 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.153098 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 18:23:27.238641 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.238618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-trusted-ca-bundle\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.238759 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.238649 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-config\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.238759 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.238668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-oauth-serving-cert\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.238759 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.238689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-serving-cert\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.238870 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.238830 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft26h\" (UniqueName: \"kubernetes.io/projected/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-kube-api-access-ft26h\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.238904 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.238871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-service-ca\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.238937 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.238899 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-oauth-config\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.339548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.339519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-trusted-ca-bundle\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.339683 ip-10-0-140-74 kubenswrapper[2575]: I0422 
18:23:27.339563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-config\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.339683 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.339589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-oauth-serving-cert\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.339683 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.339623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-serving-cert\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.339683 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.339668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ft26h\" (UniqueName: \"kubernetes.io/projected/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-kube-api-access-ft26h\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.339883 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.339702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-service-ca\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 
18:23:27.339883 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.339743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-oauth-config\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.340714 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.340447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-config\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.340714 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.340533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-oauth-serving-cert\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.340714 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.340576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-trusted-ca-bundle\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.340714 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.340668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-service-ca\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " 
pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.342285 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.342263 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-serving-cert\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.342371 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.342335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-oauth-config\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.347691 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.347672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft26h\" (UniqueName: \"kubernetes.io/projected/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-kube-api-access-ft26h\") pod \"console-66886898c8-6gkqw\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.426176 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.426148 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-646b958546-7n687" event={"ID":"1add8823-8c00-46ae-a8af-828b95cc217f","Type":"ContainerStarted","Data":"b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1"} Apr 22 18:23:27.426176 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.426180 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-646b958546-7n687" event={"ID":"1add8823-8c00-46ae-a8af-828b95cc217f","Type":"ContainerStarted","Data":"bc227359faf1ec867e8daf21155bdf2409d6177c1080f2ea68aa268346a078f5"} Apr 22 
18:23:27.454689 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.454650 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-646b958546-7n687" podStartSLOduration=170.454639925 podStartE2EDuration="2m50.454639925s" podCreationTimestamp="2026-04-22 18:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:23:27.453919397 +0000 UTC m=+171.163265060" watchObservedRunningTime="2026-04-22 18:23:27.454639925 +0000 UTC m=+171.163985612" Apr 22 18:23:27.455226 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.455206 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:23:27.587835 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:27.587812 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66886898c8-6gkqw"] Apr 22 18:23:27.590286 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:27.590261 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeab83ab1_d8c4_4865_8f47_06a4e0e6fc1d.slice/crio-0cd79a88173f51ce3aead0fe13829b85e69d2e418a474ada92f1173e5d2cfe20 WatchSource:0}: Error finding container 0cd79a88173f51ce3aead0fe13829b85e69d2e418a474ada92f1173e5d2cfe20: Status 404 returned error can't find the container with id 0cd79a88173f51ce3aead0fe13829b85e69d2e418a474ada92f1173e5d2cfe20 Apr 22 18:23:28.429933 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:28.429903 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66886898c8-6gkqw" event={"ID":"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d","Type":"ContainerStarted","Data":"e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857"} Apr 22 18:23:28.430313 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:28.429942 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-66886898c8-6gkqw" event={"ID":"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d","Type":"ContainerStarted","Data":"0cd79a88173f51ce3aead0fe13829b85e69d2e418a474ada92f1173e5d2cfe20"} Apr 22 18:23:28.430313 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:28.430040 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:23:28.451911 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:28.451873 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66886898c8-6gkqw" podStartSLOduration=1.451862499 podStartE2EDuration="1.451862499s" podCreationTimestamp="2026-04-22 18:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:23:28.451040554 +0000 UTC m=+172.160386207" watchObservedRunningTime="2026-04-22 18:23:28.451862499 +0000 UTC m=+172.161208152" Apr 22 18:23:28.906326 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:28.906300 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:23:32.408619 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:32.408583 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6wjzz" Apr 22 18:23:32.836013 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:32.835964 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:32.836201 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:32.836022 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74cb66595-98tks" Apr 22 18:23:32.837123 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:32.837092 2575 patch_prober.go:28] interesting pod/console-74cb66595-98tks container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.17:8443/health\": dial tcp 10.132.0.17:8443: connect: connection refused" start-of-body= Apr 22 18:23:32.837223 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:32.837150 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-74cb66595-98tks" podUID="6d4435e6-06eb-4d24-b614-d45e57fb704f" containerName="console" probeResult="failure" output="Get \"https://10.132.0.17:8443/health\": dial tcp 10.132.0.17:8443: connect: connection refused" Apr 22 18:23:33.878687 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.878659 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-z8qr6"] Apr 22 18:23:33.884296 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.884279 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:33.888785 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.888759 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:23:33.888896 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.888759 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:23:33.890124 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.890102 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:23:33.890337 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.890323 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7268c\"" Apr 22 18:23:33.890459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.890438 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:23:33.987888 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.987865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-tls\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:33.987987 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.987906 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-textfile\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " 
pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:33.987987 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.987925 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rph7c\" (UniqueName: \"kubernetes.io/projected/07a60a53-3eab-4583-b7f4-5a08a4917cbc-kube-api-access-rph7c\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:33.987987 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.987945 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:33.988160 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.988007 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/07a60a53-3eab-4583-b7f4-5a08a4917cbc-root\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:33.988160 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.988026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-accelerators-collector-config\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:33.988160 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.988115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-wtmp\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:33.988324 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.988162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07a60a53-3eab-4583-b7f4-5a08a4917cbc-metrics-client-ca\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:33.988324 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:33.988193 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07a60a53-3eab-4583-b7f4-5a08a4917cbc-sys\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.088590 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/07a60a53-3eab-4583-b7f4-5a08a4917cbc-root\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.088709 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-accelerators-collector-config\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.088709 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088626 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-wtmp\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.088709 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07a60a53-3eab-4583-b7f4-5a08a4917cbc-metrics-client-ca\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.088709 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/07a60a53-3eab-4583-b7f4-5a08a4917cbc-root\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.088905 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07a60a53-3eab-4583-b7f4-5a08a4917cbc-sys\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.088905 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-wtmp\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.088905 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088762 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-tls\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.088905 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07a60a53-3eab-4583-b7f4-5a08a4917cbc-sys\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.088905 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-textfile\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.089139 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rph7c\" (UniqueName: \"kubernetes.io/projected/07a60a53-3eab-4583-b7f4-5a08a4917cbc-kube-api-access-rph7c\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.089139 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.088960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.089265 ip-10-0-140-74 kubenswrapper[2575]: I0422 
18:23:34.089157 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07a60a53-3eab-4583-b7f4-5a08a4917cbc-metrics-client-ca\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.089265 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.089230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-accelerators-collector-config\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.089641 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.089623 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-textfile\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.090837 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.090812 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-tls\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.091368 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.091351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07a60a53-3eab-4583-b7f4-5a08a4917cbc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " 
pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.097290 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.097271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rph7c\" (UniqueName: \"kubernetes.io/projected/07a60a53-3eab-4583-b7f4-5a08a4917cbc-kube-api-access-rph7c\") pod \"node-exporter-z8qr6\" (UID: \"07a60a53-3eab-4583-b7f4-5a08a4917cbc\") " pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.193666 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.193610 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-z8qr6" Apr 22 18:23:34.204450 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:34.204425 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a60a53_3eab_4583_b7f4_5a08a4917cbc.slice/crio-6f0e1ea402c0d24b8c2b213244002a0e15d836eb170a1a5b57355bec05a5e4d3 WatchSource:0}: Error finding container 6f0e1ea402c0d24b8c2b213244002a0e15d836eb170a1a5b57355bec05a5e4d3: Status 404 returned error can't find the container with id 6f0e1ea402c0d24b8c2b213244002a0e15d836eb170a1a5b57355bec05a5e4d3 Apr 22 18:23:34.448897 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.448826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z8qr6" event={"ID":"07a60a53-3eab-4583-b7f4-5a08a4917cbc","Type":"ContainerStarted","Data":"6f0e1ea402c0d24b8c2b213244002a0e15d836eb170a1a5b57355bec05a5e4d3"} Apr 22 18:23:34.837263 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.837214 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:23:34.841145 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.841129 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.843905 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.843884 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:23:34.843905 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.843895 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:23:34.844523 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.844497 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:23:34.845142 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.844899 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:23:34.845142 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.844932 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:23:34.845142 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.844961 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:23:34.845142 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.844975 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:23:34.845142 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.844932 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mxpds\"" Apr 22 18:23:34.845142 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.844899 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:23:34.845533 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.845300 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:23:34.857433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.857412 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:23:34.894055 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894070 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w294\" (UniqueName: \"kubernetes.io/projected/7ece467a-cc96-4960-a9e1-03c625e246be-kube-api-access-6w294\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7ece467a-cc96-4960-a9e1-03c625e246be-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894175 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ece467a-cc96-4960-a9e1-03c625e246be-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894274 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7ece467a-cc96-4960-a9e1-03c625e246be-config-out\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894349 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894371 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-web-config\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-config-volume\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894433 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7ece467a-cc96-4960-a9e1-03c625e246be-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.894946 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.894457 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7ece467a-cc96-4960-a9e1-03c625e246be-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995172 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995172 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995404 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6w294\" (UniqueName: \"kubernetes.io/projected/7ece467a-cc96-4960-a9e1-03c625e246be-kube-api-access-6w294\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995404 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7ece467a-cc96-4960-a9e1-03c625e246be-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995404 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995372 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ece467a-cc96-4960-a9e1-03c625e246be-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995603 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995603 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995603 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995476 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7ece467a-cc96-4960-a9e1-03c625e246be-config-out\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995603 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995507 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995603 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-web-config\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995603 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-config-volume\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995603 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7ece467a-cc96-4960-a9e1-03c625e246be-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.995946 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.995626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ece467a-cc96-4960-a9e1-03c625e246be-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.996736 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:23:34.996184 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7ece467a-cc96-4960-a9e1-03c625e246be-alertmanager-trusted-ca-bundle podName:7ece467a-cc96-4960-a9e1-03c625e246be nodeName:}" failed. 
No retries permitted until 2026-04-22 18:23:35.496159751 +0000 UTC m=+179.205505408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/7ece467a-cc96-4960-a9e1-03c625e246be-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "7ece467a-cc96-4960-a9e1-03c625e246be") : configmap references non-existent config key: ca-bundle.crt Apr 22 18:23:34.996736 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.996363 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7ece467a-cc96-4960-a9e1-03c625e246be-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.996736 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.996412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ece467a-cc96-4960-a9e1-03c625e246be-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.999475 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.999448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7ece467a-cc96-4960-a9e1-03c625e246be-config-out\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.999662 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.999602 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-config-volume\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:34.999662 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:34.999602 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:35.000064 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.000039 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:35.000284 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.000263 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:35.000683 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.000661 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:35.000768 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.000685 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-web-config\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:35.002221 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.002195 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7ece467a-cc96-4960-a9e1-03c625e246be-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:35.002598 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.002573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7ece467a-cc96-4960-a9e1-03c625e246be-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:35.003035 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.003015 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w294\" (UniqueName: \"kubernetes.io/projected/7ece467a-cc96-4960-a9e1-03c625e246be-kube-api-access-6w294\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:35.452749 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.452688 2575 generic.go:358] "Generic (PLEG): container finished" podID="07a60a53-3eab-4583-b7f4-5a08a4917cbc" containerID="b6dfba93e8fce725330376730c00d4b7457693de6d0264092ae31004388712c3" exitCode=0 Apr 22 18:23:35.452856 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.452769 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z8qr6" 
event={"ID":"07a60a53-3eab-4583-b7f4-5a08a4917cbc","Type":"ContainerDied","Data":"b6dfba93e8fce725330376730c00d4b7457693de6d0264092ae31004388712c3"} Apr 22 18:23:35.501444 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.501134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ece467a-cc96-4960-a9e1-03c625e246be-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:35.502582 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.502557 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ece467a-cc96-4960-a9e1-03c625e246be-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7ece467a-cc96-4960-a9e1-03c625e246be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:23:35.753554 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.753525 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:23:35.890086 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:35.890050 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:23:35.895798 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:35.895766 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ece467a_cc96_4960_a9e1_03c625e246be.slice/crio-e6a9d893fa6ce61332d4e3053053fe1f583367cf7c7987bdc1b86ba349490226 WatchSource:0}: Error finding container e6a9d893fa6ce61332d4e3053053fe1f583367cf7c7987bdc1b86ba349490226: Status 404 returned error can't find the container with id e6a9d893fa6ce61332d4e3053053fe1f583367cf7c7987bdc1b86ba349490226
Apr 22 18:23:36.456681 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.456646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7ece467a-cc96-4960-a9e1-03c625e246be","Type":"ContainerStarted","Data":"e6a9d893fa6ce61332d4e3053053fe1f583367cf7c7987bdc1b86ba349490226"}
Apr 22 18:23:36.458917 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.458887 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z8qr6" event={"ID":"07a60a53-3eab-4583-b7f4-5a08a4917cbc","Type":"ContainerStarted","Data":"b6e2265bcf94feb47859d571837593da06875e7db911e10d6443aa299b70c18b"}
Apr 22 18:23:36.459016 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.458927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z8qr6" event={"ID":"07a60a53-3eab-4583-b7f4-5a08a4917cbc","Type":"ContainerStarted","Data":"de2607c2cc37cdcfb5640a94bd94ea6ceb7af80baa0344aabe4cf7f8893c21fe"}
Apr 22 18:23:36.481624 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.481580 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-z8qr6" podStartSLOduration=2.517107476 podStartE2EDuration="3.481565626s" podCreationTimestamp="2026-04-22 18:23:33 +0000 UTC" firstStartedPulling="2026-04-22 18:23:34.206120672 +0000 UTC m=+177.915466310" lastFinishedPulling="2026-04-22 18:23:35.170578815 +0000 UTC m=+178.879924460" observedRunningTime="2026-04-22 18:23:36.479456052 +0000 UTC m=+180.188801704" watchObservedRunningTime="2026-04-22 18:23:36.481565626 +0000 UTC m=+180.190911276"
Apr 22 18:23:36.847877 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.847675 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"]
Apr 22 18:23:36.851801 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.851779 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:36.858021 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.858002 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 18:23:36.859382 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.859361 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-mv4sc\""
Apr 22 18:23:36.860286 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.860266 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6r36trc967vr\""
Apr 22 18:23:36.860379 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.860300 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 18:23:36.860636 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.860619 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 18:23:36.860699 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.860685 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 18:23:36.861835 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.861809 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 18:23:36.873164 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.873138 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"]
Apr 22 18:23:36.913063 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.913035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:36.913344 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.913076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-tls\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:36.913344 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.913096 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-metrics-client-ca\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:36.913344 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.913130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-grpc-tls\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:36.913344 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.913183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:36.913344 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.913222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:36.913344 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.913239 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgqml\" (UniqueName: \"kubernetes.io/projected/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-kube-api-access-fgqml\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:36.913344 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:36.913281 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.013997 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.013932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.013997 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.013972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-tls\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.014170 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.014009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-metrics-client-ca\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.014170 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.014031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-grpc-tls\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.014170 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.014051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.014170 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.014070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.014170 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.014088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgqml\" (UniqueName: \"kubernetes.io/projected/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-kube-api-access-fgqml\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.014170 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.014104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.014938 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.014887 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-metrics-client-ca\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.016622 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.016577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.016708 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.016623 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.016873 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.016846 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-tls\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.016979 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.016965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.017086 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.017065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.017119 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.017072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-secret-grpc-tls\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.023553 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.023535 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgqml\" (UniqueName: \"kubernetes.io/projected/a76b1b55-58f0-4b4a-b1f5-24d7623a8c92-kube-api-access-fgqml\") pod \"thanos-querier-6dc466d4b4-2dxrf\" (UID: \"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92\") " pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.206574 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.206551 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"
Apr 22 18:23:37.333829 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.333795 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf"]
Apr 22 18:23:37.337517 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:37.337486 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda76b1b55_58f0_4b4a_b1f5_24d7623a8c92.slice/crio-9d12d0e3a68934a257bc1d8a1f28854f6568f8923d1f744b40080c00df917833 WatchSource:0}: Error finding container 9d12d0e3a68934a257bc1d8a1f28854f6568f8923d1f744b40080c00df917833: Status 404 returned error can't find the container with id 9d12d0e3a68934a257bc1d8a1f28854f6568f8923d1f744b40080c00df917833
Apr 22 18:23:37.455966 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.455938 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66886898c8-6gkqw"
Apr 22 18:23:37.456080 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.455978 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66886898c8-6gkqw"
Apr 22 18:23:37.460608 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.460585 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66886898c8-6gkqw"
Apr 22 18:23:37.462761 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.462737 2575 generic.go:358] "Generic (PLEG): container finished" podID="7ece467a-cc96-4960-a9e1-03c625e246be" containerID="4a0a14eaaaeb47ca77574f239de7f1cd33d28200c5edfec5402bc70dfc7c5c9f" exitCode=0
Apr 22 18:23:37.462846 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.462816 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7ece467a-cc96-4960-a9e1-03c625e246be","Type":"ContainerDied","Data":"4a0a14eaaaeb47ca77574f239de7f1cd33d28200c5edfec5402bc70dfc7c5c9f"}
Apr 22 18:23:37.463902 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.463872 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf" event={"ID":"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92","Type":"ContainerStarted","Data":"9d12d0e3a68934a257bc1d8a1f28854f6568f8923d1f744b40080c00df917833"}
Apr 22 18:23:37.467649 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.467630 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66886898c8-6gkqw"
Apr 22 18:23:37.551012 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:37.550937 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74cb66595-98tks"]
Apr 22 18:23:38.265147 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.265103 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-56c6fdc49-tnpfq"]
Apr 22 18:23:38.271288 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.271264 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.273975 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.273952 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 22 18:23:38.276040 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.275100 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 18:23:38.276040 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.275713 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-152nn2tp5hem9\""
Apr 22 18:23:38.276040 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.275821 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 22 18:23:38.276040 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.275940 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-wsjfc\""
Apr 22 18:23:38.276346 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.276297 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 22 18:23:38.278100 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.278065 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56c6fdc49-tnpfq"]
Apr 22 18:23:38.326149 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.326112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a7691d3c-03b7-43d0-81bc-ac0093b41925-secret-metrics-server-client-certs\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.326330 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.326168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a7691d3c-03b7-43d0-81bc-ac0093b41925-secret-metrics-server-tls\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.326330 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.326208 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj7lx\" (UniqueName: \"kubernetes.io/projected/a7691d3c-03b7-43d0-81bc-ac0093b41925-kube-api-access-gj7lx\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.326330 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.326267 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a7691d3c-03b7-43d0-81bc-ac0093b41925-audit-log\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.326330 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.326299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7691d3c-03b7-43d0-81bc-ac0093b41925-client-ca-bundle\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.326553 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.326385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7691d3c-03b7-43d0-81bc-ac0093b41925-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.326553 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.326419 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a7691d3c-03b7-43d0-81bc-ac0093b41925-metrics-server-audit-profiles\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.427547 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.427505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a7691d3c-03b7-43d0-81bc-ac0093b41925-secret-metrics-server-client-certs\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.427704 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.427559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a7691d3c-03b7-43d0-81bc-ac0093b41925-secret-metrics-server-tls\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.427704 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.427607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gj7lx\" (UniqueName: \"kubernetes.io/projected/a7691d3c-03b7-43d0-81bc-ac0093b41925-kube-api-access-gj7lx\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.427704 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.427640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a7691d3c-03b7-43d0-81bc-ac0093b41925-audit-log\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.427704 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.427670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7691d3c-03b7-43d0-81bc-ac0093b41925-client-ca-bundle\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.427920 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.427735 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7691d3c-03b7-43d0-81bc-ac0093b41925-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.427920 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.427769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a7691d3c-03b7-43d0-81bc-ac0093b41925-metrics-server-audit-profiles\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.428362 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.428305 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a7691d3c-03b7-43d0-81bc-ac0093b41925-audit-log\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.429063 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.429022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a7691d3c-03b7-43d0-81bc-ac0093b41925-metrics-server-audit-profiles\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.429572 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.429538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7691d3c-03b7-43d0-81bc-ac0093b41925-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.430750 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.430730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7691d3c-03b7-43d0-81bc-ac0093b41925-client-ca-bundle\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.430925 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.430906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a7691d3c-03b7-43d0-81bc-ac0093b41925-secret-metrics-server-client-certs\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.431493 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.431470 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a7691d3c-03b7-43d0-81bc-ac0093b41925-secret-metrics-server-tls\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.436329 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.436304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj7lx\" (UniqueName: \"kubernetes.io/projected/a7691d3c-03b7-43d0-81bc-ac0093b41925-kube-api-access-gj7lx\") pod \"metrics-server-56c6fdc49-tnpfq\" (UID: \"a7691d3c-03b7-43d0-81bc-ac0093b41925\") " pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.554523 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.554442 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6"]
Apr 22 18:23:38.558284 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.558263 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6"
Apr 22 18:23:38.561216 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.561140 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 22 18:23:38.561216 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.561185 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-tmpq4\""
Apr 22 18:23:38.566598 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.566576 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6"]
Apr 22 18:23:38.585149 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.585085 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq"
Apr 22 18:23:38.630277 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.630236 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kfnt6\" (UID: \"fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6"
Apr 22 18:23:38.731152 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:38.731115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kfnt6\" (UID: \"fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6"
Apr 22 18:23:38.731351 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:23:38.731301 2575 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 22 18:23:38.731425 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:23:38.731379 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1-monitoring-plugin-cert podName:fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:39.231358928 +0000 UTC m=+182.940704583 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-kfnt6" (UID: "fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1") : secret "monitoring-plugin-cert" not found
Apr 22 18:23:39.090021 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.089985 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"]
Apr 22 18:23:39.093672 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.093651 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.096627 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.096602 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 22 18:23:39.096734 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.096604 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 22 18:23:39.096734 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.096726 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-p2mck\""
Apr 22 18:23:39.096836 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.096780 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 22 18:23:39.096919 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.096898 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 22 18:23:39.097024 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.096905 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 22 18:23:39.103520 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.103501 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 22 18:23:39.110509 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.110473 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"]
Apr 22 18:23:39.236415 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.236375 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-secret-telemeter-client\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.236574 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.236441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/13720c60-d2a8-4710-949a-93943bbd1473-metrics-client-ca\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.236574 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.236482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13720c60-d2a8-4710-949a-93943bbd1473-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.236574 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.236519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13720c60-d2a8-4710-949a-93943bbd1473-serving-certs-ca-bundle\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.236574 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.236554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-federate-client-tls\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.236702 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.236660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kfnt6\" (UID: \"fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6"
Apr 22 18:23:39.236737 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.236699 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vcsb\" (UniqueName: \"kubernetes.io/projected/13720c60-d2a8-4710-949a-93943bbd1473-kube-api-access-2vcsb\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.236800 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.236783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-telemeter-client-tls\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.236873 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.236857 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.241724 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.241696 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kfnt6\" (UID: \"fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6"
Apr 22 18:23:39.337684 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.337647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vcsb\" (UniqueName: \"kubernetes.io/projected/13720c60-d2a8-4710-949a-93943bbd1473-kube-api-access-2vcsb\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.338073 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.337719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-telemeter-client-tls\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.338073 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.337771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"
Apr 22 18:23:39.338073 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.337820 2575 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-secret-telemeter-client\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.338073 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.337842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/13720c60-d2a8-4710-949a-93943bbd1473-metrics-client-ca\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.338073 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.338044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13720c60-d2a8-4710-949a-93943bbd1473-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.338373 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.338093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13720c60-d2a8-4710-949a-93943bbd1473-serving-certs-ca-bundle\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.338373 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.338128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-federate-client-tls\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: 
\"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.338636 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.338612 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/13720c60-d2a8-4710-949a-93943bbd1473-metrics-client-ca\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.338848 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.338822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13720c60-d2a8-4710-949a-93943bbd1473-serving-certs-ca-bundle\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.338937 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.338844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13720c60-d2a8-4710-949a-93943bbd1473-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.340884 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.340830 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.340996 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.340976 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-secret-telemeter-client\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.341056 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.340997 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-federate-client-tls\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.341116 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.341052 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/13720c60-d2a8-4710-949a-93943bbd1473-telemeter-client-tls\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.346104 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.346084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vcsb\" (UniqueName: \"kubernetes.io/projected/13720c60-d2a8-4710-949a-93943bbd1473-kube-api-access-2vcsb\") pod \"telemeter-client-74cccdf67f-vn6dv\" (UID: \"13720c60-d2a8-4710-949a-93943bbd1473\") " pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.406027 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.405995 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" Apr 22 18:23:39.473540 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.471745 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6" Apr 22 18:23:39.475715 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.475672 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf" event={"ID":"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92","Type":"ContainerStarted","Data":"e707214a82440ddecb8fb8ceeaf72a6e5ff97959fac149922ccc65f2e8573263"} Apr 22 18:23:39.477744 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.477702 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7ece467a-cc96-4960-a9e1-03c625e246be","Type":"ContainerStarted","Data":"0fd2f6059ca9427ad9ec66d5274e65c1c8bc9d2ba725c8e722fee205cfd4c77f"} Apr 22 18:23:39.497334 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.494733 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56c6fdc49-tnpfq"] Apr 22 18:23:39.504697 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:39.504665 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7691d3c_03b7_43d0_81bc_ac0093b41925.slice/crio-e06b6f6d66f2019375344e90066547a9ce26c824105bb73832aa96cd7ed3317f WatchSource:0}: Error finding container e06b6f6d66f2019375344e90066547a9ce26c824105bb73832aa96cd7ed3317f: Status 404 returned error can't find the container with id e06b6f6d66f2019375344e90066547a9ce26c824105bb73832aa96cd7ed3317f Apr 22 18:23:39.594559 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.593904 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-74cccdf67f-vn6dv"] Apr 22 18:23:39.600185 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:39.600139 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13720c60_d2a8_4710_949a_93943bbd1473.slice/crio-ec12d6b33db7dc0e105bcd3f537c2ad3c6bcde5f3467579b3214ff9b00f2b943 WatchSource:0}: Error finding container ec12d6b33db7dc0e105bcd3f537c2ad3c6bcde5f3467579b3214ff9b00f2b943: Status 404 returned error can't find the container with id ec12d6b33db7dc0e105bcd3f537c2ad3c6bcde5f3467579b3214ff9b00f2b943 Apr 22 18:23:39.633977 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.633956 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6"] Apr 22 18:23:39.637277 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:39.637224 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa69e6f8_c5fc_4048_8b7f_2c2fb289e4f1.slice/crio-147edf66843103c4204b85169ec12ae16f8dbc6ca66dcc01af6ee81ecced5a26 WatchSource:0}: Error finding container 147edf66843103c4204b85169ec12ae16f8dbc6ca66dcc01af6ee81ecced5a26: Status 404 returned error can't find the container with id 147edf66843103c4204b85169ec12ae16f8dbc6ca66dcc01af6ee81ecced5a26 Apr 22 18:23:39.939770 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.939707 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8b48b7f86-5vvkc"] Apr 22 18:23:39.942994 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.942979 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:39.953313 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:39.953290 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8b48b7f86-5vvkc"] Apr 22 18:23:40.044330 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.044299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-oauth-config\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.044484 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.044337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-serving-cert\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.044484 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.044388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-oauth-serving-cert\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.044484 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.044442 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-trusted-ca-bundle\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 
18:23:40.044484 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.044476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-config\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.044661 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.044509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvn82\" (UniqueName: \"kubernetes.io/projected/ddcaac36-8156-41ee-896a-399a1eb08f6c-kube-api-access-gvn82\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.044661 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.044554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-service-ca\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.145675 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.145642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-config\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.145848 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.145707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvn82\" (UniqueName: \"kubernetes.io/projected/ddcaac36-8156-41ee-896a-399a1eb08f6c-kube-api-access-gvn82\") pod \"console-8b48b7f86-5vvkc\" 
(UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.145848 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.145747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-service-ca\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.145848 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.145817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-oauth-config\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.145848 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.145846 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-serving-cert\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.146060 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.145878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-oauth-serving-cert\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.146060 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.145950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-trusted-ca-bundle\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.146569 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.146527 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-config\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.146712 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.146577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-service-ca\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.147112 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.147085 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-trusted-ca-bundle\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.147506 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.147480 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-oauth-serving-cert\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.149325 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.149284 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-oauth-config\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.149541 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.149519 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-serving-cert\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.153741 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.153703 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvn82\" (UniqueName: \"kubernetes.io/projected/ddcaac36-8156-41ee-896a-399a1eb08f6c-kube-api-access-gvn82\") pod \"console-8b48b7f86-5vvkc\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.252561 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.252099 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:40.449406 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.448864 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8b48b7f86-5vvkc"] Apr 22 18:23:40.451471 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:40.451446 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddcaac36_8156_41ee_896a_399a1eb08f6c.slice/crio-3373d39fc1afc8a466ef7ae18b35cae4252b3c503bf8dc87320b20e05b1afc00 WatchSource:0}: Error finding container 3373d39fc1afc8a466ef7ae18b35cae4252b3c503bf8dc87320b20e05b1afc00: Status 404 returned error can't find the container with id 3373d39fc1afc8a466ef7ae18b35cae4252b3c503bf8dc87320b20e05b1afc00 Apr 22 18:23:40.486651 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.486619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf" event={"ID":"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92","Type":"ContainerStarted","Data":"0a9e156d098f2f6a884a7824b60390f8e1bf9629c19393f23d430a436363a001"} Apr 22 18:23:40.486764 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.486655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf" event={"ID":"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92","Type":"ContainerStarted","Data":"1251a39c2c2c81044c5dfc21f9d9359dbfbf1dc40b639434803011a31746b991"} Apr 22 18:23:40.486764 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.486671 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf" event={"ID":"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92","Type":"ContainerStarted","Data":"dfbad84b20a00726aa4dfc30a365e32fe4353823c9bf44d933f1a9bb3fdbeac7"} Apr 22 18:23:40.488404 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.488345 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-8b48b7f86-5vvkc" event={"ID":"ddcaac36-8156-41ee-896a-399a1eb08f6c","Type":"ContainerStarted","Data":"3373d39fc1afc8a466ef7ae18b35cae4252b3c503bf8dc87320b20e05b1afc00"} Apr 22 18:23:40.489785 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.489730 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6" event={"ID":"fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1","Type":"ContainerStarted","Data":"147edf66843103c4204b85169ec12ae16f8dbc6ca66dcc01af6ee81ecced5a26"} Apr 22 18:23:40.492014 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.491747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" event={"ID":"13720c60-d2a8-4710-949a-93943bbd1473","Type":"ContainerStarted","Data":"ec12d6b33db7dc0e105bcd3f537c2ad3c6bcde5f3467579b3214ff9b00f2b943"} Apr 22 18:23:40.492988 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.492961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq" event={"ID":"a7691d3c-03b7-43d0-81bc-ac0093b41925","Type":"ContainerStarted","Data":"e06b6f6d66f2019375344e90066547a9ce26c824105bb73832aa96cd7ed3317f"} Apr 22 18:23:40.497041 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.496299 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7ece467a-cc96-4960-a9e1-03c625e246be","Type":"ContainerStarted","Data":"e849aa55033dc5607f5061154c9ec46f6b6be9277acd9b0c3e7825ea641fb13b"} Apr 22 18:23:40.497041 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.496325 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7ece467a-cc96-4960-a9e1-03c625e246be","Type":"ContainerStarted","Data":"89d7a902ccb1e420ffe48b14a5e3948266597b8446af9ac45540f7e8e09a5f25"} Apr 22 18:23:40.497041 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.496340 
2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7ece467a-cc96-4960-a9e1-03c625e246be","Type":"ContainerStarted","Data":"980ffe7e331fe9f8483cc72b03f5d791defcadc5ab298e004a207d9c45946971"} Apr 22 18:23:40.497041 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.496352 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7ece467a-cc96-4960-a9e1-03c625e246be","Type":"ContainerStarted","Data":"e018341e4e921b1f9c33d18d4e3cc3b4ee4f892150ff9085be5e25b800f85ed5"} Apr 22 18:23:40.497041 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.496364 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7ece467a-cc96-4960-a9e1-03c625e246be","Type":"ContainerStarted","Data":"ee429057edaecdb9519e4c28af035f6a7a45cbefbc1c5efd1d70c281af58a06a"} Apr 22 18:23:40.526228 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.526134 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.117855106 podStartE2EDuration="6.526118362s" podCreationTimestamp="2026-04-22 18:23:34 +0000 UTC" firstStartedPulling="2026-04-22 18:23:35.897776764 +0000 UTC m=+179.607122396" lastFinishedPulling="2026-04-22 18:23:40.306040008 +0000 UTC m=+184.015385652" observedRunningTime="2026-04-22 18:23:40.52292061 +0000 UTC m=+184.232266298" watchObservedRunningTime="2026-04-22 18:23:40.526118362 +0000 UTC m=+184.235464017" Apr 22 18:23:40.819226 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:40.815268 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-646b958546-7n687"] Apr 22 18:23:41.502526 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:41.502486 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf" 
event={"ID":"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92","Type":"ContainerStarted","Data":"60286ac46d2b68b2321d65de0468e7556f442bfd743072c26a3d66c81bcc5d8b"} Apr 22 18:23:41.502526 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:41.502528 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf" event={"ID":"a76b1b55-58f0-4b4a-b1f5-24d7623a8c92","Type":"ContainerStarted","Data":"32bba4d86dd2fb85600950df94447e4213cd11df502dbfb0c88cfa4d9f90f165"} Apr 22 18:23:41.502967 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:41.502677 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf" Apr 22 18:23:41.504022 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:41.503997 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8b48b7f86-5vvkc" event={"ID":"ddcaac36-8156-41ee-896a-399a1eb08f6c","Type":"ContainerStarted","Data":"e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0"} Apr 22 18:23:41.551469 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:41.551416 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf" podStartSLOduration=2.585887237 podStartE2EDuration="5.55139724s" podCreationTimestamp="2026-04-22 18:23:36 +0000 UTC" firstStartedPulling="2026-04-22 18:23:37.339608351 +0000 UTC m=+181.048953981" lastFinishedPulling="2026-04-22 18:23:40.305118337 +0000 UTC m=+184.014463984" observedRunningTime="2026-04-22 18:23:41.549102108 +0000 UTC m=+185.258447773" watchObservedRunningTime="2026-04-22 18:23:41.55139724 +0000 UTC m=+185.260742917" Apr 22 18:23:41.596739 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:41.596690 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8b48b7f86-5vvkc" podStartSLOduration=2.596670945 podStartE2EDuration="2.596670945s" 
podCreationTimestamp="2026-04-22 18:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:23:41.594720597 +0000 UTC m=+185.304066249" watchObservedRunningTime="2026-04-22 18:23:41.596670945 +0000 UTC m=+185.306016599" Apr 22 18:23:42.508123 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:42.508084 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6" event={"ID":"fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1","Type":"ContainerStarted","Data":"16a5eab3f1201986e2d7bbc4007deb60716189694a6dfe59e4bd13a831267c9a"} Apr 22 18:23:42.508586 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:42.508288 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6" Apr 22 18:23:42.510270 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:42.510225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" event={"ID":"13720c60-d2a8-4710-949a-93943bbd1473","Type":"ContainerStarted","Data":"ae5e5aca0b9f7281e77b6e9bff9cb6bc6d318d52c86180b8ff90dc2a964fc2ce"} Apr 22 18:23:42.510410 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:42.510272 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" event={"ID":"13720c60-d2a8-4710-949a-93943bbd1473","Type":"ContainerStarted","Data":"3eddb5bb5a64820dae4578bfb06046b7104ac440f8d655f2a004fd94bc899bb6"} Apr 22 18:23:42.510410 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:42.510288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" event={"ID":"13720c60-d2a8-4710-949a-93943bbd1473","Type":"ContainerStarted","Data":"a3fb29ca78c32504be1b59b4151baf1dcd71710ad15ca31f53baeb3c74a23954"} Apr 22 18:23:42.511727 ip-10-0-140-74 kubenswrapper[2575]: 
I0422 18:23:42.511701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq" event={"ID":"a7691d3c-03b7-43d0-81bc-ac0093b41925","Type":"ContainerStarted","Data":"3844740b9477ed01b6eb47716644bfe550620a3953521bdae838e2a13fbcc68d"} Apr 22 18:23:42.513703 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:42.513667 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6" Apr 22 18:23:42.527021 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:42.526982 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kfnt6" podStartSLOduration=2.237661976 podStartE2EDuration="4.5269701s" podCreationTimestamp="2026-04-22 18:23:38 +0000 UTC" firstStartedPulling="2026-04-22 18:23:39.639487626 +0000 UTC m=+183.348833257" lastFinishedPulling="2026-04-22 18:23:41.928795738 +0000 UTC m=+185.638141381" observedRunningTime="2026-04-22 18:23:42.525817569 +0000 UTC m=+186.235163223" watchObservedRunningTime="2026-04-22 18:23:42.5269701 +0000 UTC m=+186.236315786" Apr 22 18:23:42.547930 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:42.547891 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-74cccdf67f-vn6dv" podStartSLOduration=1.218104847 podStartE2EDuration="3.54787823s" podCreationTimestamp="2026-04-22 18:23:39 +0000 UTC" firstStartedPulling="2026-04-22 18:23:39.603751546 +0000 UTC m=+183.313097182" lastFinishedPulling="2026-04-22 18:23:41.933524928 +0000 UTC m=+185.642870565" observedRunningTime="2026-04-22 18:23:42.546225621 +0000 UTC m=+186.255571275" watchObservedRunningTime="2026-04-22 18:23:42.54787823 +0000 UTC m=+186.257223883" Apr 22 18:23:42.571459 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:42.571424 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq" podStartSLOduration=2.150246897 podStartE2EDuration="4.571413298s" podCreationTimestamp="2026-04-22 18:23:38 +0000 UTC" firstStartedPulling="2026-04-22 18:23:39.507390168 +0000 UTC m=+183.216735818" lastFinishedPulling="2026-04-22 18:23:41.928556573 +0000 UTC m=+185.637902219" observedRunningTime="2026-04-22 18:23:42.569935355 +0000 UTC m=+186.279281010" watchObservedRunningTime="2026-04-22 18:23:42.571413298 +0000 UTC m=+186.280758951" Apr 22 18:23:43.794108 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:43.794075 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8b48b7f86-5vvkc"] Apr 22 18:23:43.825920 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:43.825889 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-778487bcb7-k78hf"] Apr 22 18:23:43.829473 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:43.829450 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:43.838855 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:43.838833 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-778487bcb7-k78hf"] Apr 22 18:23:43.989447 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:43.989411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-console-config\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:43.989613 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:43.989453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzgt8\" (UniqueName: \"kubernetes.io/projected/58bc5e1c-4346-4218-87a2-932fb1944c43-kube-api-access-mzgt8\") pod 
\"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:43.989613 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:43.989477 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-trusted-ca-bundle\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:43.989613 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:43.989559 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-oauth-serving-cert\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:43.989613 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:43.989608 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-service-ca\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:43.989769 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:43.989627 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-oauth-config\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:43.989769 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:43.989672 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-serving-cert\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.090663 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.090604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-console-config\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.090663 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.090634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzgt8\" (UniqueName: \"kubernetes.io/projected/58bc5e1c-4346-4218-87a2-932fb1944c43-kube-api-access-mzgt8\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.090663 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.090656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-trusted-ca-bundle\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.090871 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.090678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-oauth-serving-cert\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 
18:23:44.090871 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.090706 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-service-ca\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.090871 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.090817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-oauth-config\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.090871 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.090869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-serving-cert\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.091478 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.091455 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-console-config\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.091548 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.091516 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-oauth-serving-cert\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " 
pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.091612 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.091556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-service-ca\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.091612 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.091558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-trusted-ca-bundle\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.093370 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.093351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-serving-cert\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.093438 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.093383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-oauth-config\") pod \"console-778487bcb7-k78hf\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.098572 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.098552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzgt8\" (UniqueName: \"kubernetes.io/projected/58bc5e1c-4346-4218-87a2-932fb1944c43-kube-api-access-mzgt8\") pod \"console-778487bcb7-k78hf\" (UID: 
\"58bc5e1c-4346-4218-87a2-932fb1944c43\") " pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.138568 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.138541 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:44.276612 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.276592 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-778487bcb7-k78hf"] Apr 22 18:23:44.278710 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:23:44.278672 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58bc5e1c_4346_4218_87a2_932fb1944c43.slice/crio-0f785bd627b9cadc930c3a6d6502490ad99cab6b6d6940258c8e6180840340c3 WatchSource:0}: Error finding container 0f785bd627b9cadc930c3a6d6502490ad99cab6b6d6940258c8e6180840340c3: Status 404 returned error can't find the container with id 0f785bd627b9cadc930c3a6d6502490ad99cab6b6d6940258c8e6180840340c3 Apr 22 18:23:44.519761 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.519725 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-778487bcb7-k78hf" event={"ID":"58bc5e1c-4346-4218-87a2-932fb1944c43","Type":"ContainerStarted","Data":"a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a"} Apr 22 18:23:44.519897 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.519770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-778487bcb7-k78hf" event={"ID":"58bc5e1c-4346-4218-87a2-932fb1944c43","Type":"ContainerStarted","Data":"0f785bd627b9cadc930c3a6d6502490ad99cab6b6d6940258c8e6180840340c3"} Apr 22 18:23:44.549261 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:44.549211 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-778487bcb7-k78hf" podStartSLOduration=1.549197923 podStartE2EDuration="1.549197923s" 
podCreationTimestamp="2026-04-22 18:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:23:44.54760892 +0000 UTC m=+188.256954572" watchObservedRunningTime="2026-04-22 18:23:44.549197923 +0000 UTC m=+188.258543576" Apr 22 18:23:47.518420 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:47.518388 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6dc466d4b4-2dxrf" Apr 22 18:23:50.253222 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:50.253185 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:23:50.824695 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:50.824666 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:23:54.139210 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:54.139174 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:54.139614 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:54.139344 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:54.144013 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:54.143989 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:54.559150 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:54.559122 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:23:54.609118 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:54.609090 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66886898c8-6gkqw"] Apr 22 18:23:58.585757 
ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:58.585725 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq" Apr 22 18:23:58.586206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:23:58.585767 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq" Apr 22 18:24:02.574982 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.574943 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74cb66595-98tks" podUID="6d4435e6-06eb-4d24-b614-d45e57fb704f" containerName="console" containerID="cri-o://0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b" gracePeriod=15 Apr 22 18:24:02.810584 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.810563 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74cb66595-98tks_6d4435e6-06eb-4d24-b614-d45e57fb704f/console/0.log" Apr 22 18:24:02.810687 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.810639 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74cb66595-98tks" Apr 22 18:24:02.853502 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.853479 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-config\") pod \"6d4435e6-06eb-4d24-b614-d45e57fb704f\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " Apr 22 18:24:02.853624 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.853524 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-oauth-config\") pod \"6d4435e6-06eb-4d24-b614-d45e57fb704f\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " Apr 22 18:24:02.853624 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.853560 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-oauth-serving-cert\") pod \"6d4435e6-06eb-4d24-b614-d45e57fb704f\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " Apr 22 18:24:02.853624 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.853586 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-service-ca\") pod \"6d4435e6-06eb-4d24-b614-d45e57fb704f\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " Apr 22 18:24:02.853624 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.853604 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-serving-cert\") pod \"6d4435e6-06eb-4d24-b614-d45e57fb704f\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " Apr 22 18:24:02.853828 ip-10-0-140-74 
kubenswrapper[2575]: I0422 18:24:02.853649 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6djvp\" (UniqueName: \"kubernetes.io/projected/6d4435e6-06eb-4d24-b614-d45e57fb704f-kube-api-access-6djvp\") pod \"6d4435e6-06eb-4d24-b614-d45e57fb704f\" (UID: \"6d4435e6-06eb-4d24-b614-d45e57fb704f\") " Apr 22 18:24:02.853979 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.853950 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-config" (OuterVolumeSpecName: "console-config") pod "6d4435e6-06eb-4d24-b614-d45e57fb704f" (UID: "6d4435e6-06eb-4d24-b614-d45e57fb704f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:02.854095 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.854076 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-service-ca" (OuterVolumeSpecName: "service-ca") pod "6d4435e6-06eb-4d24-b614-d45e57fb704f" (UID: "6d4435e6-06eb-4d24-b614-d45e57fb704f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:02.854290 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.854264 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6d4435e6-06eb-4d24-b614-d45e57fb704f" (UID: "6d4435e6-06eb-4d24-b614-d45e57fb704f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:02.867489 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.867463 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6d4435e6-06eb-4d24-b614-d45e57fb704f" (UID: "6d4435e6-06eb-4d24-b614-d45e57fb704f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:24:02.880545 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.880516 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6d4435e6-06eb-4d24-b614-d45e57fb704f" (UID: "6d4435e6-06eb-4d24-b614-d45e57fb704f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:24:02.880627 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.880544 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d4435e6-06eb-4d24-b614-d45e57fb704f-kube-api-access-6djvp" (OuterVolumeSpecName: "kube-api-access-6djvp") pod "6d4435e6-06eb-4d24-b614-d45e57fb704f" (UID: "6d4435e6-06eb-4d24-b614-d45e57fb704f"). InnerVolumeSpecName "kube-api-access-6djvp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:24:02.954283 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.954232 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-serving-cert\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:02.954283 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.954278 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6djvp\" (UniqueName: \"kubernetes.io/projected/6d4435e6-06eb-4d24-b614-d45e57fb704f-kube-api-access-6djvp\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:02.954396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.954290 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-config\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:02.954396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.954298 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d4435e6-06eb-4d24-b614-d45e57fb704f-console-oauth-config\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:02.954396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.954309 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-oauth-serving-cert\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:02.954396 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:02.954318 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d4435e6-06eb-4d24-b614-d45e57fb704f-service-ca\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:03.586591 ip-10-0-140-74 
kubenswrapper[2575]: I0422 18:24:03.586565 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74cb66595-98tks_6d4435e6-06eb-4d24-b614-d45e57fb704f/console/0.log" Apr 22 18:24:03.587051 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.586606 2575 generic.go:358] "Generic (PLEG): container finished" podID="6d4435e6-06eb-4d24-b614-d45e57fb704f" containerID="0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b" exitCode=2 Apr 22 18:24:03.587051 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.586644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74cb66595-98tks" event={"ID":"6d4435e6-06eb-4d24-b614-d45e57fb704f","Type":"ContainerDied","Data":"0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b"} Apr 22 18:24:03.587051 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.586682 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74cb66595-98tks" event={"ID":"6d4435e6-06eb-4d24-b614-d45e57fb704f","Type":"ContainerDied","Data":"f2779a33196b055c070212ca4b4b116974709c27e292fdbd6cae4e1a2c351780"} Apr 22 18:24:03.587051 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.586703 2575 scope.go:117] "RemoveContainer" containerID="0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b" Apr 22 18:24:03.587051 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.586722 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74cb66595-98tks" Apr 22 18:24:03.588567 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.588534 2575 generic.go:358] "Generic (PLEG): container finished" podID="ad8365e0-e003-4937-9dbd-1989580ac1f4" containerID="f6c435a7c2e7c7b64e45cf7e1ddfddcc6911498ea9b4ee04a34294f9132a765a" exitCode=0 Apr 22 18:24:03.588676 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.588587 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7" event={"ID":"ad8365e0-e003-4937-9dbd-1989580ac1f4","Type":"ContainerDied","Data":"f6c435a7c2e7c7b64e45cf7e1ddfddcc6911498ea9b4ee04a34294f9132a765a"} Apr 22 18:24:03.589019 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.588990 2575 scope.go:117] "RemoveContainer" containerID="f6c435a7c2e7c7b64e45cf7e1ddfddcc6911498ea9b4ee04a34294f9132a765a" Apr 22 18:24:03.596362 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.596340 2575 scope.go:117] "RemoveContainer" containerID="0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b" Apr 22 18:24:03.596710 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:24:03.596684 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b\": container with ID starting with 0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b not found: ID does not exist" containerID="0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b" Apr 22 18:24:03.596783 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.596719 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b"} err="failed to get container status \"0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b\": rpc error: code = 
NotFound desc = could not find container \"0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b\": container with ID starting with 0cc98a9ba866705750b4e952607cf07baef1a0a7b3b7d3e8914ba269f1fb0f0b not found: ID does not exist" Apr 22 18:24:03.627436 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.627413 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74cb66595-98tks"] Apr 22 18:24:03.632908 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:03.632888 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74cb66595-98tks"] Apr 22 18:24:04.593197 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:04.593162 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-9ndl7" event={"ID":"ad8365e0-e003-4937-9dbd-1989580ac1f4","Type":"ContainerStarted","Data":"3ec0f1c37ea26e901e463d81d84ce5050772c2fa52bc9a5b32e831529756b406"} Apr 22 18:24:04.910923 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:04.910825 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d4435e6-06eb-4d24-b614-d45e57fb704f" path="/var/lib/kubelet/pods/6d4435e6-06eb-4d24-b614-d45e57fb704f/volumes" Apr 22 18:24:05.853923 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:05.853880 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-646b958546-7n687" podUID="1add8823-8c00-46ae-a8af-828b95cc217f" containerName="registry" containerID="cri-o://b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1" gracePeriod=30 Apr 22 18:24:06.096366 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.096346 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:24:06.178781 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.178723 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpg9l\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-kube-api-access-rpg9l\") pod \"1add8823-8c00-46ae-a8af-828b95cc217f\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " Apr 22 18:24:06.178781 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.178759 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-registry-certificates\") pod \"1add8823-8c00-46ae-a8af-828b95cc217f\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " Apr 22 18:24:06.178781 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.178778 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-image-registry-private-configuration\") pod \"1add8823-8c00-46ae-a8af-828b95cc217f\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " Apr 22 18:24:06.178968 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.178816 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-bound-sa-token\") pod \"1add8823-8c00-46ae-a8af-828b95cc217f\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " Apr 22 18:24:06.178968 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.178851 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-installation-pull-secrets\") pod \"1add8823-8c00-46ae-a8af-828b95cc217f\" (UID: 
\"1add8823-8c00-46ae-a8af-828b95cc217f\") " Apr 22 18:24:06.178968 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.178908 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1add8823-8c00-46ae-a8af-828b95cc217f-ca-trust-extracted\") pod \"1add8823-8c00-46ae-a8af-828b95cc217f\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " Apr 22 18:24:06.178968 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.178944 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-trusted-ca\") pod \"1add8823-8c00-46ae-a8af-828b95cc217f\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " Apr 22 18:24:06.179163 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.178977 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") pod \"1add8823-8c00-46ae-a8af-828b95cc217f\" (UID: \"1add8823-8c00-46ae-a8af-828b95cc217f\") " Apr 22 18:24:06.179372 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.179201 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1add8823-8c00-46ae-a8af-828b95cc217f" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:06.179717 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.179476 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1add8823-8c00-46ae-a8af-828b95cc217f" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:06.181299 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.181233 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1add8823-8c00-46ae-a8af-828b95cc217f" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:24:06.181404 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.181307 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-kube-api-access-rpg9l" (OuterVolumeSpecName: "kube-api-access-rpg9l") pod "1add8823-8c00-46ae-a8af-828b95cc217f" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f"). InnerVolumeSpecName "kube-api-access-rpg9l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:24:06.181478 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.181456 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1add8823-8c00-46ae-a8af-828b95cc217f" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:24:06.181605 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.181575 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1add8823-8c00-46ae-a8af-828b95cc217f" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:24:06.181723 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.181678 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1add8823-8c00-46ae-a8af-828b95cc217f" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:24:06.187695 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.187672 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1add8823-8c00-46ae-a8af-828b95cc217f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1add8823-8c00-46ae-a8af-828b95cc217f" (UID: "1add8823-8c00-46ae-a8af-828b95cc217f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:06.280224 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.280202 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1add8823-8c00-46ae-a8af-828b95cc217f-ca-trust-extracted\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:06.280224 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.280225 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-trusted-ca\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:06.280357 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.280236 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-registry-tls\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:06.280357 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.280256 2575 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rpg9l\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-kube-api-access-rpg9l\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:06.280357 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.280267 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1add8823-8c00-46ae-a8af-828b95cc217f-registry-certificates\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:06.280357 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.280278 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-image-registry-private-configuration\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:06.280357 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.280288 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1add8823-8c00-46ae-a8af-828b95cc217f-bound-sa-token\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:06.280357 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.280296 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1add8823-8c00-46ae-a8af-828b95cc217f-installation-pull-secrets\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:06.602136 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.602107 2575 generic.go:358] "Generic (PLEG): container finished" podID="1add8823-8c00-46ae-a8af-828b95cc217f" containerID="b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1" exitCode=0 Apr 22 18:24:06.602294 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.602196 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-646b958546-7n687" Apr 22 18:24:06.602294 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.602221 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-646b958546-7n687" event={"ID":"1add8823-8c00-46ae-a8af-828b95cc217f","Type":"ContainerDied","Data":"b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1"} Apr 22 18:24:06.602294 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.602283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-646b958546-7n687" event={"ID":"1add8823-8c00-46ae-a8af-828b95cc217f","Type":"ContainerDied","Data":"bc227359faf1ec867e8daf21155bdf2409d6177c1080f2ea68aa268346a078f5"} Apr 22 18:24:06.602456 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.602306 2575 scope.go:117] "RemoveContainer" containerID="b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1" Apr 22 18:24:06.612020 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.611993 2575 scope.go:117] "RemoveContainer" containerID="b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1" Apr 22 18:24:06.614017 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:24:06.613750 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1\": container with ID starting with b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1 not found: ID does not exist" containerID="b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1" Apr 22 18:24:06.614095 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.614027 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1"} err="failed to get container status 
\"b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1\": rpc error: code = NotFound desc = could not find container \"b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1\": container with ID starting with b5253358703b175f4be5c76571aa35992fcc46eee2708a882a590134b8d688f1 not found: ID does not exist" Apr 22 18:24:06.630643 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.630620 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-646b958546-7n687"] Apr 22 18:24:06.638568 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.638546 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-646b958546-7n687"] Apr 22 18:24:06.910018 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:06.909956 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1add8823-8c00-46ae-a8af-828b95cc217f" path="/var/lib/kubelet/pods/1add8823-8c00-46ae-a8af-828b95cc217f/volumes" Apr 22 18:24:08.812422 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:08.812381 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8b48b7f86-5vvkc" podUID="ddcaac36-8156-41ee-896a-399a1eb08f6c" containerName="console" containerID="cri-o://e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0" gracePeriod=15 Apr 22 18:24:09.045896 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.045874 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8b48b7f86-5vvkc_ddcaac36-8156-41ee-896a-399a1eb08f6c/console/0.log" Apr 22 18:24:09.045993 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.045934 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:24:09.100878 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.100817 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-serving-cert\") pod \"ddcaac36-8156-41ee-896a-399a1eb08f6c\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " Apr 22 18:24:09.100878 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.100854 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-service-ca\") pod \"ddcaac36-8156-41ee-896a-399a1eb08f6c\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " Apr 22 18:24:09.101033 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.100909 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-oauth-config\") pod \"ddcaac36-8156-41ee-896a-399a1eb08f6c\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " Apr 22 18:24:09.101033 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.100934 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-oauth-serving-cert\") pod \"ddcaac36-8156-41ee-896a-399a1eb08f6c\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " Apr 22 18:24:09.101033 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.101017 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvn82\" (UniqueName: \"kubernetes.io/projected/ddcaac36-8156-41ee-896a-399a1eb08f6c-kube-api-access-gvn82\") pod \"ddcaac36-8156-41ee-896a-399a1eb08f6c\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " Apr 22 18:24:09.101180 
ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.101056 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-trusted-ca-bundle\") pod \"ddcaac36-8156-41ee-896a-399a1eb08f6c\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " Apr 22 18:24:09.101180 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.101152 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-config\") pod \"ddcaac36-8156-41ee-896a-399a1eb08f6c\" (UID: \"ddcaac36-8156-41ee-896a-399a1eb08f6c\") " Apr 22 18:24:09.101385 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.101359 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-service-ca" (OuterVolumeSpecName: "service-ca") pod "ddcaac36-8156-41ee-896a-399a1eb08f6c" (UID: "ddcaac36-8156-41ee-896a-399a1eb08f6c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:09.101449 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.101396 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ddcaac36-8156-41ee-896a-399a1eb08f6c" (UID: "ddcaac36-8156-41ee-896a-399a1eb08f6c"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:09.101507 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.101450 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ddcaac36-8156-41ee-896a-399a1eb08f6c" (UID: "ddcaac36-8156-41ee-896a-399a1eb08f6c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:09.101617 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.101601 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-trusted-ca-bundle\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:09.101675 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.101624 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-service-ca\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:09.101675 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.101638 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-oauth-serving-cert\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:09.101742 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.101678 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-config" (OuterVolumeSpecName: "console-config") pod "ddcaac36-8156-41ee-896a-399a1eb08f6c" (UID: "ddcaac36-8156-41ee-896a-399a1eb08f6c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:09.103079 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.103057 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddcaac36-8156-41ee-896a-399a1eb08f6c-kube-api-access-gvn82" (OuterVolumeSpecName: "kube-api-access-gvn82") pod "ddcaac36-8156-41ee-896a-399a1eb08f6c" (UID: "ddcaac36-8156-41ee-896a-399a1eb08f6c"). InnerVolumeSpecName "kube-api-access-gvn82". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:24:09.103374 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.103353 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ddcaac36-8156-41ee-896a-399a1eb08f6c" (UID: "ddcaac36-8156-41ee-896a-399a1eb08f6c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:24:09.103441 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.103379 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ddcaac36-8156-41ee-896a-399a1eb08f6c" (UID: "ddcaac36-8156-41ee-896a-399a1eb08f6c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:24:09.202724 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.202697 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gvn82\" (UniqueName: \"kubernetes.io/projected/ddcaac36-8156-41ee-896a-399a1eb08f6c-kube-api-access-gvn82\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:09.202724 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.202720 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-config\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:09.202838 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.202730 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-serving-cert\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:09.202838 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.202739 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ddcaac36-8156-41ee-896a-399a1eb08f6c-console-oauth-config\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:09.613448 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.613424 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8b48b7f86-5vvkc_ddcaac36-8156-41ee-896a-399a1eb08f6c/console/0.log" Apr 22 18:24:09.613578 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.613469 2575 generic.go:358] "Generic (PLEG): container finished" podID="ddcaac36-8156-41ee-896a-399a1eb08f6c" containerID="e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0" exitCode=2 Apr 22 18:24:09.613578 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.613518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-8b48b7f86-5vvkc" event={"ID":"ddcaac36-8156-41ee-896a-399a1eb08f6c","Type":"ContainerDied","Data":"e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0"} Apr 22 18:24:09.613578 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.613533 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8b48b7f86-5vvkc" Apr 22 18:24:09.613578 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.613554 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8b48b7f86-5vvkc" event={"ID":"ddcaac36-8156-41ee-896a-399a1eb08f6c","Type":"ContainerDied","Data":"3373d39fc1afc8a466ef7ae18b35cae4252b3c503bf8dc87320b20e05b1afc00"} Apr 22 18:24:09.613768 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.613578 2575 scope.go:117] "RemoveContainer" containerID="e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0" Apr 22 18:24:09.622179 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.622163 2575 scope.go:117] "RemoveContainer" containerID="e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0" Apr 22 18:24:09.622471 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:24:09.622450 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0\": container with ID starting with e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0 not found: ID does not exist" containerID="e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0" Apr 22 18:24:09.622522 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.622480 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0"} err="failed to get container status \"e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0\": rpc error: code = NotFound 
desc = could not find container \"e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0\": container with ID starting with e8cdba613ca7440f16cd6385e51976bf02678573c488f77e0c313e8123a157a0 not found: ID does not exist" Apr 22 18:24:09.635732 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.635712 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8b48b7f86-5vvkc"] Apr 22 18:24:09.639519 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:09.639497 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8b48b7f86-5vvkc"] Apr 22 18:24:10.910462 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:10.910425 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddcaac36-8156-41ee-896a-399a1eb08f6c" path="/var/lib/kubelet/pods/ddcaac36-8156-41ee-896a-399a1eb08f6c/volumes" Apr 22 18:24:18.591003 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:18.590974 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq" Apr 22 18:24:18.594992 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:18.594974 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-56c6fdc49-tnpfq" Apr 22 18:24:19.628518 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.628454 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66886898c8-6gkqw" podUID="eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" containerName="console" containerID="cri-o://e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857" gracePeriod=15 Apr 22 18:24:19.866960 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.866939 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66886898c8-6gkqw_eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d/console/0.log" Apr 22 18:24:19.867067 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.866996 2575 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:24:19.980667 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.980609 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-oauth-config\") pod \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " Apr 22 18:24:19.980667 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.980649 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-trusted-ca-bundle\") pod \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " Apr 22 18:24:19.980667 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.980667 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-config\") pod \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " Apr 22 18:24:19.980890 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.980711 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-service-ca\") pod \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " Apr 22 18:24:19.980890 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.980760 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-oauth-serving-cert\") pod \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") 
" Apr 22 18:24:19.980890 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.980782 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-serving-cert\") pod \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " Apr 22 18:24:19.980890 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.980802 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft26h\" (UniqueName: \"kubernetes.io/projected/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-kube-api-access-ft26h\") pod \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\" (UID: \"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d\") " Apr 22 18:24:19.981114 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.981089 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" (UID: "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:19.981177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.981117 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-service-ca" (OuterVolumeSpecName: "service-ca") pod "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" (UID: "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:19.981177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.981153 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-config" (OuterVolumeSpecName: "console-config") pod "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" (UID: "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:19.981177 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.981162 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" (UID: "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:24:19.982802 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.982780 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" (UID: "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:24:19.982898 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.982826 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" (UID: "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:24:19.982898 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:19.982880 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-kube-api-access-ft26h" (OuterVolumeSpecName: "kube-api-access-ft26h") pod "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" (UID: "eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d"). InnerVolumeSpecName "kube-api-access-ft26h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:24:20.082084 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.082059 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-oauth-serving-cert\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:20.082084 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.082080 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-serving-cert\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:20.082206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.082090 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ft26h\" (UniqueName: \"kubernetes.io/projected/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-kube-api-access-ft26h\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:20.082206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.082100 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-oauth-config\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:20.082206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.082110 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-trusted-ca-bundle\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:20.082206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.082119 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-console-config\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:20.082206 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.082129 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d-service-ca\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:24:20.649222 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.649193 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66886898c8-6gkqw_eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d/console/0.log" Apr 22 18:24:20.649748 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.649234 2575 generic.go:358] "Generic (PLEG): container finished" podID="eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" containerID="e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857" exitCode=2 Apr 22 18:24:20.649748 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.649354 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66886898c8-6gkqw" event={"ID":"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d","Type":"ContainerDied","Data":"e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857"} Apr 22 18:24:20.649748 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.649378 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66886898c8-6gkqw" Apr 22 18:24:20.649748 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.649395 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66886898c8-6gkqw" event={"ID":"eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d","Type":"ContainerDied","Data":"0cd79a88173f51ce3aead0fe13829b85e69d2e418a474ada92f1173e5d2cfe20"} Apr 22 18:24:20.649748 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.649413 2575 scope.go:117] "RemoveContainer" containerID="e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857" Apr 22 18:24:20.659392 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.659375 2575 scope.go:117] "RemoveContainer" containerID="e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857" Apr 22 18:24:20.659680 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:24:20.659659 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857\": container with ID starting with e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857 not found: ID does not exist" containerID="e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857" Apr 22 18:24:20.659733 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.659688 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857"} err="failed to get container status \"e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857\": rpc error: code = NotFound desc = could not find container \"e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857\": container with ID starting with e5198dcad07dc5f713dda6c061a6df5b49a8d60437b72cd94ec1255f0b056857 not found: ID does not exist" Apr 22 18:24:20.671988 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.671964 2575 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-66886898c8-6gkqw"] Apr 22 18:24:20.674700 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.674674 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66886898c8-6gkqw"] Apr 22 18:24:20.910268 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:20.910182 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" path="/var/lib/kubelet/pods/eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d/volumes" Apr 22 18:24:48.635375 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:48.635329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:24:48.637865 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:48.637843 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1f085cfa-07bb-457b-85ce-79f190f3ecb1-metrics-certs\") pod \"network-metrics-daemon-dhwbm\" (UID: \"1f085cfa-07bb-457b-85ce-79f190f3ecb1\") " pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:24:48.709647 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:48.709608 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fll8t\"" Apr 22 18:24:48.717810 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:48.717782 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dhwbm" Apr 22 18:24:48.852871 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:48.852834 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dhwbm"] Apr 22 18:24:48.856540 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:24:48.856502 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f085cfa_07bb_457b_85ce_79f190f3ecb1.slice/crio-96c13eb9399a206c1d2688ce7a0e8309d39bcdd22c04154d60a533bbb899224d WatchSource:0}: Error finding container 96c13eb9399a206c1d2688ce7a0e8309d39bcdd22c04154d60a533bbb899224d: Status 404 returned error can't find the container with id 96c13eb9399a206c1d2688ce7a0e8309d39bcdd22c04154d60a533bbb899224d Apr 22 18:24:49.743573 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:49.743525 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dhwbm" event={"ID":"1f085cfa-07bb-457b-85ce-79f190f3ecb1","Type":"ContainerStarted","Data":"96c13eb9399a206c1d2688ce7a0e8309d39bcdd22c04154d60a533bbb899224d"} Apr 22 18:24:50.748521 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:50.748478 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dhwbm" event={"ID":"1f085cfa-07bb-457b-85ce-79f190f3ecb1","Type":"ContainerStarted","Data":"ba3fa865febe2cac509bf65ee22d0cf55d0d5bb436c04706983102402e609947"} Apr 22 18:24:50.748521 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:50.748524 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dhwbm" event={"ID":"1f085cfa-07bb-457b-85ce-79f190f3ecb1","Type":"ContainerStarted","Data":"1536dfcaf5ddb23115bc8a1bae7917510ed5a41b48f55d70c591a3786f20f833"} Apr 22 18:24:50.767424 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:24:50.767360 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-dhwbm" podStartSLOduration=252.814692701 podStartE2EDuration="4m13.767342403s" podCreationTimestamp="2026-04-22 18:20:37 +0000 UTC" firstStartedPulling="2026-04-22 18:24:48.858316725 +0000 UTC m=+252.567662357" lastFinishedPulling="2026-04-22 18:24:49.810966416 +0000 UTC m=+253.520312059" observedRunningTime="2026-04-22 18:24:50.766545674 +0000 UTC m=+254.475891328" watchObservedRunningTime="2026-04-22 18:24:50.767342403 +0000 UTC m=+254.476688055" Apr 22 18:25:13.456017 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.455936 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b7df59464-7p7pt"] Apr 22 18:25:13.456969 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.456943 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1add8823-8c00-46ae-a8af-828b95cc217f" containerName="registry" Apr 22 18:25:13.457144 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.457103 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1add8823-8c00-46ae-a8af-828b95cc217f" containerName="registry" Apr 22 18:25:13.457144 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.457133 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" containerName="console" Apr 22 18:25:13.457144 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.457143 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" containerName="console" Apr 22 18:25:13.457384 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.457179 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddcaac36-8156-41ee-896a-399a1eb08f6c" containerName="console" Apr 22 18:25:13.457384 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.457188 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddcaac36-8156-41ee-896a-399a1eb08f6c" containerName="console" Apr 22 18:25:13.457384 
ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.457211 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d4435e6-06eb-4d24-b614-d45e57fb704f" containerName="console" Apr 22 18:25:13.457384 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.457220 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4435e6-06eb-4d24-b614-d45e57fb704f" containerName="console" Apr 22 18:25:13.457384 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.457304 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddcaac36-8156-41ee-896a-399a1eb08f6c" containerName="console" Apr 22 18:25:13.457384 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.457314 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d4435e6-06eb-4d24-b614-d45e57fb704f" containerName="console" Apr 22 18:25:13.457384 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.457324 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1add8823-8c00-46ae-a8af-828b95cc217f" containerName="registry" Apr 22 18:25:13.457384 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.457330 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="eab83ab1-d8c4-4865-8f47-06a4e0e6fc1d" containerName="console" Apr 22 18:25:13.460299 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.460274 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.472586 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.472558 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b7df59464-7p7pt"] Apr 22 18:25:13.561453 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.561424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-console-config\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.561592 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.561482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-service-ca\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.561592 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.561555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-console-oauth-config\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.561704 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.561608 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-console-serving-cert\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.561704 
ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.561652 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-trusted-ca-bundle\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.561704 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.561679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-oauth-serving-cert\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.561802 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.561705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j2rr\" (UniqueName: \"kubernetes.io/projected/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-kube-api-access-7j2rr\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.662621 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.662592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-service-ca\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.662734 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.662641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-console-oauth-config\") pod \"console-5b7df59464-7p7pt\" 
(UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.662734 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.662670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-console-serving-cert\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.662734 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.662707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-trusted-ca-bundle\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.662851 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.662733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-oauth-serving-cert\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.662851 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.662758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7j2rr\" (UniqueName: \"kubernetes.io/projected/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-kube-api-access-7j2rr\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.662851 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.662806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-console-config\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.663434 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.663404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-service-ca\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.663565 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.663445 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-oauth-serving-cert\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.663565 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.663505 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-console-config\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.663672 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.663574 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-trusted-ca-bundle\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.665111 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.665080 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-console-oauth-config\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.665202 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.665186 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-console-serving-cert\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.670789 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.670767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j2rr\" (UniqueName: \"kubernetes.io/projected/3f33e8ba-7723-4eea-91eb-a4ef03bf4f08-kube-api-access-7j2rr\") pod \"console-5b7df59464-7p7pt\" (UID: \"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08\") " pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.770131 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.770109 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:13.897725 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:13.897696 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b7df59464-7p7pt"] Apr 22 18:25:13.899936 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:25:13.899914 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f33e8ba_7723_4eea_91eb_a4ef03bf4f08.slice/crio-4ab8b39003f09884e3b6538803c44974cf6857d5f6983fe8691c832064b33100 WatchSource:0}: Error finding container 4ab8b39003f09884e3b6538803c44974cf6857d5f6983fe8691c832064b33100: Status 404 returned error can't find the container with id 4ab8b39003f09884e3b6538803c44974cf6857d5f6983fe8691c832064b33100 Apr 22 18:25:14.823616 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:14.823576 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b7df59464-7p7pt" event={"ID":"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08","Type":"ContainerStarted","Data":"cbf8058cf6939af365e810e189cd049b014de28a68d65f861a2c03e77c8b11f9"} Apr 22 18:25:14.824001 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:14.823615 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b7df59464-7p7pt" event={"ID":"3f33e8ba-7723-4eea-91eb-a4ef03bf4f08","Type":"ContainerStarted","Data":"4ab8b39003f09884e3b6538803c44974cf6857d5f6983fe8691c832064b33100"} Apr 22 18:25:14.843136 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:14.843083 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b7df59464-7p7pt" podStartSLOduration=1.8430668479999999 podStartE2EDuration="1.843066848s" podCreationTimestamp="2026-04-22 18:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:25:14.841642684 +0000 UTC 
m=+278.550988337" watchObservedRunningTime="2026-04-22 18:25:14.843066848 +0000 UTC m=+278.552412501" Apr 22 18:25:23.770766 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:23.770709 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:23.770766 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:23.770774 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:23.776603 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:23.776569 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:23.856174 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:23.856116 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b7df59464-7p7pt" Apr 22 18:25:23.905444 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:23.905384 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-778487bcb7-k78hf"] Apr 22 18:25:36.789610 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:36.789580 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx8gr_c400b749-c41a-4dc5-908a-d49ec568c6d6/console-operator/1.log" Apr 22 18:25:36.790095 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:36.789580 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx8gr_c400b749-c41a-4dc5-908a-d49ec568c6d6/console-operator/1.log" Apr 22 18:25:36.793112 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:36.793083 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/ovn-acl-logging/0.log" Apr 22 18:25:36.793320 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:36.793083 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/ovn-acl-logging/0.log" Apr 22 18:25:36.800281 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:36.800255 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:25:48.931219 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:48.931152 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-778487bcb7-k78hf" podUID="58bc5e1c-4346-4218-87a2-932fb1944c43" containerName="console" containerID="cri-o://a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a" gracePeriod=15 Apr 22 18:25:49.202411 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.202369 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-778487bcb7-k78hf_58bc5e1c-4346-4218-87a2-932fb1944c43/console/0.log" Apr 22 18:25:49.202657 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.202472 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:25:49.311355 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.311233 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-console-config\") pod \"58bc5e1c-4346-4218-87a2-932fb1944c43\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " Apr 22 18:25:49.311355 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.311356 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-oauth-serving-cert\") pod \"58bc5e1c-4346-4218-87a2-932fb1944c43\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " Apr 22 18:25:49.311820 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.311380 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-service-ca\") pod \"58bc5e1c-4346-4218-87a2-932fb1944c43\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " Apr 22 18:25:49.311820 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.311411 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzgt8\" (UniqueName: \"kubernetes.io/projected/58bc5e1c-4346-4218-87a2-932fb1944c43-kube-api-access-mzgt8\") pod \"58bc5e1c-4346-4218-87a2-932fb1944c43\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " Apr 22 18:25:49.311820 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.311467 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-trusted-ca-bundle\") pod \"58bc5e1c-4346-4218-87a2-932fb1944c43\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " Apr 22 18:25:49.311820 ip-10-0-140-74 
kubenswrapper[2575]: I0422 18:25:49.311509 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-oauth-config\") pod \"58bc5e1c-4346-4218-87a2-932fb1944c43\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " Apr 22 18:25:49.311820 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.311575 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-serving-cert\") pod \"58bc5e1c-4346-4218-87a2-932fb1944c43\" (UID: \"58bc5e1c-4346-4218-87a2-932fb1944c43\") " Apr 22 18:25:49.312091 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.311899 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-console-config" (OuterVolumeSpecName: "console-config") pod "58bc5e1c-4346-4218-87a2-932fb1944c43" (UID: "58bc5e1c-4346-4218-87a2-932fb1944c43"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:25:49.312091 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.311976 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "58bc5e1c-4346-4218-87a2-932fb1944c43" (UID: "58bc5e1c-4346-4218-87a2-932fb1944c43"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:25:49.312091 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.312024 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-service-ca" (OuterVolumeSpecName: "service-ca") pod "58bc5e1c-4346-4218-87a2-932fb1944c43" (UID: "58bc5e1c-4346-4218-87a2-932fb1944c43"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:25:49.312314 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.312280 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "58bc5e1c-4346-4218-87a2-932fb1944c43" (UID: "58bc5e1c-4346-4218-87a2-932fb1944c43"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:25:49.314917 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.314849 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58bc5e1c-4346-4218-87a2-932fb1944c43-kube-api-access-mzgt8" (OuterVolumeSpecName: "kube-api-access-mzgt8") pod "58bc5e1c-4346-4218-87a2-932fb1944c43" (UID: "58bc5e1c-4346-4218-87a2-932fb1944c43"). InnerVolumeSpecName "kube-api-access-mzgt8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:25:49.314917 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.314843 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "58bc5e1c-4346-4218-87a2-932fb1944c43" (UID: "58bc5e1c-4346-4218-87a2-932fb1944c43"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:25:49.314917 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.314862 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "58bc5e1c-4346-4218-87a2-932fb1944c43" (UID: "58bc5e1c-4346-4218-87a2-932fb1944c43"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:25:49.412723 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.412650 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mzgt8\" (UniqueName: \"kubernetes.io/projected/58bc5e1c-4346-4218-87a2-932fb1944c43-kube-api-access-mzgt8\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:25:49.412723 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.412706 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-trusted-ca-bundle\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:25:49.412723 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.412717 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-oauth-config\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:25:49.412723 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.412727 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/58bc5e1c-4346-4218-87a2-932fb1944c43-console-serving-cert\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:25:49.413078 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.412757 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-console-config\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:25:49.413078 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.412768 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-oauth-serving-cert\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:25:49.413078 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.412779 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58bc5e1c-4346-4218-87a2-932fb1944c43-service-ca\") on node \"ip-10-0-140-74.ec2.internal\" DevicePath \"\"" Apr 22 18:25:49.955899 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.955832 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-778487bcb7-k78hf_58bc5e1c-4346-4218-87a2-932fb1944c43/console/0.log" Apr 22 18:25:49.956785 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.955928 2575 generic.go:358] "Generic (PLEG): container finished" podID="58bc5e1c-4346-4218-87a2-932fb1944c43" containerID="a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a" exitCode=2 Apr 22 18:25:49.956785 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.956052 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-778487bcb7-k78hf" event={"ID":"58bc5e1c-4346-4218-87a2-932fb1944c43","Type":"ContainerDied","Data":"a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a"} Apr 22 18:25:49.956785 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.956133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-778487bcb7-k78hf" event={"ID":"58bc5e1c-4346-4218-87a2-932fb1944c43","Type":"ContainerDied","Data":"0f785bd627b9cadc930c3a6d6502490ad99cab6b6d6940258c8e6180840340c3"} Apr 22 18:25:49.956785 ip-10-0-140-74 kubenswrapper[2575]: 
I0422 18:25:49.956159 2575 scope.go:117] "RemoveContainer" containerID="a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a" Apr 22 18:25:49.956785 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.956075 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-778487bcb7-k78hf" Apr 22 18:25:49.966190 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.966169 2575 scope.go:117] "RemoveContainer" containerID="a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a" Apr 22 18:25:49.966523 ip-10-0-140-74 kubenswrapper[2575]: E0422 18:25:49.966503 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a\": container with ID starting with a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a not found: ID does not exist" containerID="a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a" Apr 22 18:25:49.966583 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.966534 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a"} err="failed to get container status \"a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a\": rpc error: code = NotFound desc = could not find container \"a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a\": container with ID starting with a0ee5b48a15d0b92ed057558bacd8b0b71c12823caa0e1007cc136b3f4abdb3a not found: ID does not exist" Apr 22 18:25:49.977112 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.977084 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-778487bcb7-k78hf"] Apr 22 18:25:49.980605 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:49.980581 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-778487bcb7-k78hf"] Apr 22 18:25:50.910826 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:50.910785 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58bc5e1c-4346-4218-87a2-932fb1944c43" path="/var/lib/kubelet/pods/58bc5e1c-4346-4218-87a2-932fb1944c43/volumes" Apr 22 18:25:59.546742 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.546707 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5v9l6/must-gather-mm28q"] Apr 22 18:25:59.547158 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.547035 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58bc5e1c-4346-4218-87a2-932fb1944c43" containerName="console" Apr 22 18:25:59.547158 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.547045 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="58bc5e1c-4346-4218-87a2-932fb1944c43" containerName="console" Apr 22 18:25:59.547158 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.547092 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="58bc5e1c-4346-4218-87a2-932fb1944c43" containerName="console" Apr 22 18:25:59.550058 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.550038 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5v9l6/must-gather-mm28q" Apr 22 18:25:59.552594 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.552567 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5v9l6\"/\"default-dockercfg-p2l25\"" Apr 22 18:25:59.553690 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.553664 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5v9l6\"/\"kube-root-ca.crt\"" Apr 22 18:25:59.553818 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.553803 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5v9l6\"/\"openshift-service-ca.crt\"" Apr 22 18:25:59.562712 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.562685 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5v9l6/must-gather-mm28q"] Apr 22 18:25:59.602471 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.602441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2vnf\" (UniqueName: \"kubernetes.io/projected/5057743e-a6d6-4452-97e0-5d9fbdceebac-kube-api-access-t2vnf\") pod \"must-gather-mm28q\" (UID: \"5057743e-a6d6-4452-97e0-5d9fbdceebac\") " pod="openshift-must-gather-5v9l6/must-gather-mm28q" Apr 22 18:25:59.602658 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.602514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5057743e-a6d6-4452-97e0-5d9fbdceebac-must-gather-output\") pod \"must-gather-mm28q\" (UID: \"5057743e-a6d6-4452-97e0-5d9fbdceebac\") " pod="openshift-must-gather-5v9l6/must-gather-mm28q" Apr 22 18:25:59.703693 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.703645 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/5057743e-a6d6-4452-97e0-5d9fbdceebac-must-gather-output\") pod \"must-gather-mm28q\" (UID: \"5057743e-a6d6-4452-97e0-5d9fbdceebac\") " pod="openshift-must-gather-5v9l6/must-gather-mm28q" Apr 22 18:25:59.703693 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.703703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2vnf\" (UniqueName: \"kubernetes.io/projected/5057743e-a6d6-4452-97e0-5d9fbdceebac-kube-api-access-t2vnf\") pod \"must-gather-mm28q\" (UID: \"5057743e-a6d6-4452-97e0-5d9fbdceebac\") " pod="openshift-must-gather-5v9l6/must-gather-mm28q" Apr 22 18:25:59.704013 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.703992 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5057743e-a6d6-4452-97e0-5d9fbdceebac-must-gather-output\") pod \"must-gather-mm28q\" (UID: \"5057743e-a6d6-4452-97e0-5d9fbdceebac\") " pod="openshift-must-gather-5v9l6/must-gather-mm28q" Apr 22 18:25:59.712285 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.712259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2vnf\" (UniqueName: \"kubernetes.io/projected/5057743e-a6d6-4452-97e0-5d9fbdceebac-kube-api-access-t2vnf\") pod \"must-gather-mm28q\" (UID: \"5057743e-a6d6-4452-97e0-5d9fbdceebac\") " pod="openshift-must-gather-5v9l6/must-gather-mm28q" Apr 22 18:25:59.873423 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.873320 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5v9l6/must-gather-mm28q" Apr 22 18:25:59.988615 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.988583 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5v9l6/must-gather-mm28q"] Apr 22 18:25:59.991470 ip-10-0-140-74 kubenswrapper[2575]: W0422 18:25:59.991443 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5057743e_a6d6_4452_97e0_5d9fbdceebac.slice/crio-7e8898c362195e3eca389c70555f347f727eca685a3aeaaae79bf0d21c02f370 WatchSource:0}: Error finding container 7e8898c362195e3eca389c70555f347f727eca685a3aeaaae79bf0d21c02f370: Status 404 returned error can't find the container with id 7e8898c362195e3eca389c70555f347f727eca685a3aeaaae79bf0d21c02f370 Apr 22 18:25:59.993300 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:25:59.993236 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:26:00.991932 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:00.991857 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v9l6/must-gather-mm28q" event={"ID":"5057743e-a6d6-4452-97e0-5d9fbdceebac","Type":"ContainerStarted","Data":"9c0ad85f8f5aae38dcc31909b3013db8852826a00825c7d120eec6cbb40d6fb6"} Apr 22 18:26:00.991932 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:00.991904 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v9l6/must-gather-mm28q" event={"ID":"5057743e-a6d6-4452-97e0-5d9fbdceebac","Type":"ContainerStarted","Data":"7e8898c362195e3eca389c70555f347f727eca685a3aeaaae79bf0d21c02f370"} Apr 22 18:26:01.997905 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:01.997865 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v9l6/must-gather-mm28q" 
event={"ID":"5057743e-a6d6-4452-97e0-5d9fbdceebac","Type":"ContainerStarted","Data":"5adad055080916a7ab334eb9494b83da08d6a7924d97282d8ba79ae05c272ed2"} Apr 22 18:26:02.016032 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:02.015973 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5v9l6/must-gather-mm28q" podStartSLOduration=2.169695912 podStartE2EDuration="3.015956413s" podCreationTimestamp="2026-04-22 18:25:59 +0000 UTC" firstStartedPulling="2026-04-22 18:25:59.99347268 +0000 UTC m=+323.702818319" lastFinishedPulling="2026-04-22 18:26:00.839733185 +0000 UTC m=+324.549078820" observedRunningTime="2026-04-22 18:26:02.014057238 +0000 UTC m=+325.723402893" watchObservedRunningTime="2026-04-22 18:26:02.015956413 +0000 UTC m=+325.725302066" Apr 22 18:26:02.316736 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:02.316646 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xzh9f_39554817-37b5-4aee-afc9-ec4c204d3d1c/global-pull-secret-syncer/0.log" Apr 22 18:26:02.396645 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:02.396610 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fklgp_7d36f1e9-9321-4fbf-935b-023f00adbb68/konnectivity-agent/0.log" Apr 22 18:26:02.501504 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:02.501476 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-74.ec2.internal_075634087ab8a39514ea4cf278614518/haproxy/0.log" Apr 22 18:26:05.685344 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:05.685262 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7ece467a-cc96-4960-a9e1-03c625e246be/alertmanager/0.log" Apr 22 18:26:05.711820 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:05.711779 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7ece467a-cc96-4960-a9e1-03c625e246be/config-reloader/0.log" Apr 22 18:26:05.739528 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:05.739487 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7ece467a-cc96-4960-a9e1-03c625e246be/kube-rbac-proxy-web/0.log" Apr 22 18:26:05.767703 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:05.767669 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7ece467a-cc96-4960-a9e1-03c625e246be/kube-rbac-proxy/0.log" Apr 22 18:26:05.801329 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:05.801298 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7ece467a-cc96-4960-a9e1-03c625e246be/kube-rbac-proxy-metric/0.log" Apr 22 18:26:05.828098 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:05.828071 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7ece467a-cc96-4960-a9e1-03c625e246be/prom-label-proxy/0.log" Apr 22 18:26:05.854049 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:05.853981 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7ece467a-cc96-4960-a9e1-03c625e246be/init-config-reloader/0.log" Apr 22 18:26:05.899046 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:05.899012 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-lwbgg_3fd6a1fe-39a4-4ac5-aa49-e7b33a296fc3/cluster-monitoring-operator/0.log" Apr 22 18:26:06.009043 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:06.009013 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-56c6fdc49-tnpfq_a7691d3c-03b7-43d0-81bc-ac0093b41925/metrics-server/0.log" Apr 22 18:26:06.048519 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:06.048490 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-kfnt6_fa69e6f8-c5fc-4048-8b7f-2c2fb289e4f1/monitoring-plugin/0.log" Apr 22 18:26:06.328364 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:06.328269 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z8qr6_07a60a53-3eab-4583-b7f4-5a08a4917cbc/node-exporter/0.log" Apr 22 18:26:06.367234 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:06.367177 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z8qr6_07a60a53-3eab-4583-b7f4-5a08a4917cbc/kube-rbac-proxy/0.log" Apr 22 18:26:06.408673 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:06.408641 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z8qr6_07a60a53-3eab-4583-b7f4-5a08a4917cbc/init-textfile/0.log" Apr 22 18:26:06.840679 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:06.840646 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74cccdf67f-vn6dv_13720c60-d2a8-4710-949a-93943bbd1473/telemeter-client/0.log" Apr 22 18:26:06.877038 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:06.876849 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74cccdf67f-vn6dv_13720c60-d2a8-4710-949a-93943bbd1473/reload/0.log" Apr 22 18:26:06.914797 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:06.914732 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74cccdf67f-vn6dv_13720c60-d2a8-4710-949a-93943bbd1473/kube-rbac-proxy/0.log" Apr 22 18:26:06.959597 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:06.959570 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dc466d4b4-2dxrf_a76b1b55-58f0-4b4a-b1f5-24d7623a8c92/thanos-query/0.log" Apr 22 18:26:06.987688 ip-10-0-140-74 kubenswrapper[2575]: 
I0422 18:26:06.987660 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dc466d4b4-2dxrf_a76b1b55-58f0-4b4a-b1f5-24d7623a8c92/kube-rbac-proxy-web/0.log" Apr 22 18:26:07.015257 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:07.015216 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dc466d4b4-2dxrf_a76b1b55-58f0-4b4a-b1f5-24d7623a8c92/kube-rbac-proxy/0.log" Apr 22 18:26:07.048873 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:07.048831 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dc466d4b4-2dxrf_a76b1b55-58f0-4b4a-b1f5-24d7623a8c92/prom-label-proxy/0.log" Apr 22 18:26:07.078401 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:07.078376 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dc466d4b4-2dxrf_a76b1b55-58f0-4b4a-b1f5-24d7623a8c92/kube-rbac-proxy-rules/0.log" Apr 22 18:26:07.109109 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:07.109079 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6dc466d4b4-2dxrf_a76b1b55-58f0-4b4a-b1f5-24d7623a8c92/kube-rbac-proxy-metrics/0.log" Apr 22 18:26:07.999027 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:07.998998 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-56zln_9be38d04-fbc0-4977-b082-c0568fd4d108/networking-console-plugin/0.log" Apr 22 18:26:08.371764 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.371736 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx8gr_c400b749-c41a-4dc5-908a-d49ec568c6d6/console-operator/1.log" Apr 22 18:26:08.381594 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.381563 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fx8gr_c400b749-c41a-4dc5-908a-d49ec568c6d6/console-operator/2.log" Apr 22 18:26:08.500973 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.500933 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6"] Apr 22 18:26:08.504790 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.504765 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.511309 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.511269 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6"] Apr 22 18:26:08.584914 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.584882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-proc\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.584914 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.584919 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-podres\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.585111 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.584957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-lib-modules\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: 
\"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.585111 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.585010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzp5m\" (UniqueName: \"kubernetes.io/projected/c67555a3-e51c-4c24-8f71-2a5b6d91630d-kube-api-access-zzp5m\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.585111 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.585061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-sys\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.686090 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.685999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-sys\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.686090 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.686084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-proc\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.686360 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.686114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"podres\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-podres\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.686360 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.686155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-proc\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.686360 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.686160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-lib-modules\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.686360 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.686113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-sys\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.686360 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.686289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-podres\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.686360 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.686291 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c67555a3-e51c-4c24-8f71-2a5b6d91630d-lib-modules\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.686668 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.686368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzp5m\" (UniqueName: \"kubernetes.io/projected/c67555a3-e51c-4c24-8f71-2a5b6d91630d-kube-api-access-zzp5m\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.695352 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.695279 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzp5m\" (UniqueName: \"kubernetes.io/projected/c67555a3-e51c-4c24-8f71-2a5b6d91630d-kube-api-access-zzp5m\") pod \"perf-node-gather-daemonset-9k2j6\" (UID: \"c67555a3-e51c-4c24-8f71-2a5b6d91630d\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.757434 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.757402 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b7df59464-7p7pt_3f33e8ba-7723-4eea-91eb-a4ef03bf4f08/console/0.log" Apr 22 18:26:08.816815 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.816778 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:08.952332 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:08.952005 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6"] Apr 22 18:26:09.025582 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:09.025553 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" event={"ID":"c67555a3-e51c-4c24-8f71-2a5b6d91630d","Type":"ContainerStarted","Data":"8223914c321820629055ea884ec16821375820bf73319c6a6dc6f2f7d068c04e"} Apr 22 18:26:09.794960 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:09.794926 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6wjzz_047091e6-d56c-4d8b-8391-f6285a93c154/dns/0.log" Apr 22 18:26:09.817921 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:09.817894 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6wjzz_047091e6-d56c-4d8b-8391-f6285a93c154/kube-rbac-proxy/0.log" Apr 22 18:26:09.985975 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:09.985947 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xxh8b_fb638d9e-ea2e-4a2e-979e-308022903fd1/dns-node-resolver/0.log" Apr 22 18:26:10.030697 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:10.030655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" event={"ID":"c67555a3-e51c-4c24-8f71-2a5b6d91630d","Type":"ContainerStarted","Data":"8b785db5facb28f005be0fadbc662152bd9da4ec92ba644b2e62977397e27675"} Apr 22 18:26:10.031518 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:10.031491 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:10.050153 ip-10-0-140-74 kubenswrapper[2575]: I0422 
18:26:10.050033 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" podStartSLOduration=2.050013569 podStartE2EDuration="2.050013569s" podCreationTimestamp="2026-04-22 18:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:26:10.048405235 +0000 UTC m=+333.757750887" watchObservedRunningTime="2026-04-22 18:26:10.050013569 +0000 UTC m=+333.759359223" Apr 22 18:26:10.391124 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:10.391055 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4nkxw_073609fd-8186-41f7-860d-4fd136656e3f/node-ca/0.log" Apr 22 18:26:11.115487 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:11.115456 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-67569b586-r5chq_7852086c-5476-41cd-9f89-8347b77c52a6/router/0.log" Apr 22 18:26:11.437110 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:11.437023 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4kzvf_fa8d3113-51fb-4375-ab5a-40c379dabdaa/serve-healthcheck-canary/0.log" Apr 22 18:26:12.001234 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:12.001205 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zgzrx_290f97ac-a696-4ad0-a131-8cee16862b82/kube-rbac-proxy/0.log" Apr 22 18:26:12.025331 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:12.025299 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zgzrx_290f97ac-a696-4ad0-a131-8cee16862b82/exporter/0.log" Apr 22 18:26:12.048518 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:12.048496 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-zgzrx_290f97ac-a696-4ad0-a131-8cee16862b82/extractor/0.log" Apr 22 18:26:16.371905 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:16.371874 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9ndl7_ad8365e0-e003-4937-9dbd-1989580ac1f4/kube-storage-version-migrator-operator/1.log" Apr 22 18:26:16.373634 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:16.373606 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-9ndl7_ad8365e0-e003-4937-9dbd-1989580ac1f4/kube-storage-version-migrator-operator/0.log" Apr 22 18:26:17.047546 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:17.047516 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-9k2j6" Apr 22 18:26:17.358411 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:17.358343 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9sjx_a9f82cba-e68b-4160-a7cd-232b0875487d/kube-multus-additional-cni-plugins/0.log" Apr 22 18:26:17.381196 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:17.381170 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9sjx_a9f82cba-e68b-4160-a7cd-232b0875487d/egress-router-binary-copy/0.log" Apr 22 18:26:17.403208 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:17.403184 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9sjx_a9f82cba-e68b-4160-a7cd-232b0875487d/cni-plugins/0.log" Apr 22 18:26:17.425977 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:17.425946 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9sjx_a9f82cba-e68b-4160-a7cd-232b0875487d/bond-cni-plugin/0.log" Apr 22 18:26:17.449545 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:17.449518 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9sjx_a9f82cba-e68b-4160-a7cd-232b0875487d/routeoverride-cni/0.log" Apr 22 18:26:17.471852 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:17.471824 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9sjx_a9f82cba-e68b-4160-a7cd-232b0875487d/whereabouts-cni-bincopy/0.log" Apr 22 18:26:17.494631 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:17.494607 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s9sjx_a9f82cba-e68b-4160-a7cd-232b0875487d/whereabouts-cni/0.log" Apr 22 18:26:17.750000 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:17.749968 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-slx9s_2a55ce37-0ff0-47e0-92a1-9f75d384f77e/kube-multus/0.log" Apr 22 18:26:17.774805 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:17.774770 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dhwbm_1f085cfa-07bb-457b-85ce-79f190f3ecb1/network-metrics-daemon/0.log" Apr 22 18:26:17.797940 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:17.797889 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dhwbm_1f085cfa-07bb-457b-85ce-79f190f3ecb1/kube-rbac-proxy/0.log" Apr 22 18:26:19.371036 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:19.370949 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/ovn-controller/0.log" Apr 22 18:26:19.390462 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:19.390431 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/ovn-acl-logging/0.log" Apr 22 18:26:19.393594 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:19.393563 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/ovn-acl-logging/1.log" Apr 22 18:26:19.417662 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:19.417637 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/kube-rbac-proxy-node/0.log" Apr 22 18:26:19.447563 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:19.447529 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 18:26:19.466930 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:19.466900 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/northd/0.log" Apr 22 18:26:19.490262 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:19.490222 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/nbdb/0.log" Apr 22 18:26:19.515355 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:19.515323 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/sbdb/0.log" Apr 22 18:26:19.681410 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:19.681330 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s8sh5_874d96b9-6b68-4589-95c9-72b4b398e980/ovnkube-controller/0.log" Apr 22 18:26:20.827697 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:20.827672 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-tx2k7_01e67bdc-8f43-4b2e-8cef-8d84eb59aabd/network-check-target-container/0.log" Apr 22 18:26:21.675836 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:21.675812 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tjcwf_f219549d-a221-4934-9036-877b25fa0d00/iptables-alerter/0.log" Apr 22 18:26:22.409985 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:22.409956 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-5r97b_1a492888-e464-4422-ac7a-f260e0cd42aa/tuned/0.log" Apr 22 18:26:24.191298 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:24.191267 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-kt4g4_2ee4fcb8-c34f-4bf2-9d97-11586231a008/cluster-samples-operator/0.log" Apr 22 18:26:24.210682 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:24.210656 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-kt4g4_2ee4fcb8-c34f-4bf2-9d97-11586231a008/cluster-samples-operator-watch/0.log" Apr 22 18:26:25.419479 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:25.419449 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-q425x_36c89fa6-510b-4d64-accf-ae9f90020ee3/service-ca-controller/0.log" Apr 22 18:26:25.744098 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:25.744069 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-bzgjb_8d08824f-8af6-46ef-b54e-409f85817ae0/csi-driver/0.log" Apr 22 18:26:25.766000 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:25.765970 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-bzgjb_8d08824f-8af6-46ef-b54e-409f85817ae0/csi-node-driver-registrar/0.log" Apr 22 18:26:25.787236 ip-10-0-140-74 kubenswrapper[2575]: I0422 18:26:25.787218 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-bzgjb_8d08824f-8af6-46ef-b54e-409f85817ae0/csi-liveness-probe/0.log"