Apr 21 15:10:35.466150 ip-10-0-137-168 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 15:10:35.931604 ip-10-0-137-168 kubenswrapper[2544]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:10:35.931604 ip-10-0-137-168 kubenswrapper[2544]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 15:10:35.931604 ip-10-0-137-168 kubenswrapper[2544]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:10:35.931604 ip-10-0-137-168 kubenswrapper[2544]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 15:10:35.931604 ip-10-0-137-168 kubenswrapper[2544]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:10:35.933165 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.933067    2544 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 15:10:35.941981 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.941949    2544 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:10:35.941981 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.941972    2544 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:10:35.941981 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.941976    2544 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:10:35.941981 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.941980    2544 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:10:35.941981 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.941984    2544 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:10:35.941981 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.941988    2544 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.941993    2544 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.941997    2544 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942001    2544 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942004    2544 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942007    2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942012    2544 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942016    2544 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942019    2544 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942021    2544 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942024    2544 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942027    2544 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942029    2544 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942032    2544 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942035    2544 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942037    2544 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942040    2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942042    2544 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942045    2544 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942048    2544 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:10:35.942229 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942050    2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942053    2544 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942055    2544 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942058    2544 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942061    2544 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942066    2544 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942069    2544 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942072    2544 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942074    2544 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942077    2544 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942080    2544 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942082    2544 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942084    2544 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942087    2544 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942091    2544 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942094    2544 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942097    2544 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942099    2544 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942102    2544 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942104    2544 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:10:35.942709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942107    2544 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942110    2544 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942112    2544 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942116    2544 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942119    2544 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942122    2544 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942124    2544 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942127    2544 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942129    2544 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942132    2544 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942135    2544 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942138    2544 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942140    2544 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942143    2544 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942146    2544 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942149    2544 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942151    2544 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942154    2544 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942156    2544 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942159    2544 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:10:35.943220 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942161    2544 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942164    2544 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942167    2544 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942169    2544 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942172    2544 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942174    2544 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942179    2544 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942182    2544 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942185    2544 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942188    2544 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942191    2544 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942194    2544 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942196    2544 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942200    2544 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942203    2544 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942206    2544 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942209    2544 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942211    2544 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942214    2544 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:10:35.943695 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942217    2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942219    2544 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942672    2544 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942678    2544 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942680    2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942683    2544 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942686    2544 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942688    2544 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942692    2544 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942696    2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942699    2544 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942702    2544 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942704    2544 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942707    2544 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942710    2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942713    2544 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942716    2544 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942718    2544 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942721    2544 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942725    2544 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:10:35.944200 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942728    2544 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942731    2544 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942734    2544 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942736    2544 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942739    2544 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942742    2544 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942745    2544 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942765    2544 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942771    2544 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942774    2544 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942777    2544 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942779    2544 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942782    2544 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942784    2544 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942787    2544 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942789    2544 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942792    2544 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942795    2544 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942797    2544 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942801    2544 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:10:35.944689 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942803    2544 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942806    2544 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942809    2544 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942811    2544 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942814    2544 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942817    2544 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942819    2544 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942821    2544 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942824    2544 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942827    2544 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942829    2544 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942832    2544 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942836    2544 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942838    2544 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942841    2544 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942844    2544 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942847    2544 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942849    2544 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942852    2544 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942855    2544 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:10:35.945299 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942858    2544 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942860    2544 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942864    2544 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942868    2544 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942872    2544 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942874    2544 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942877    2544 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942880    2544 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942882    2544 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942885    2544 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942888    2544 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942890    2544 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942892    2544 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942895    2544 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942897    2544 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942900    2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942902    2544 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942905    2544 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942907    2544 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942910    2544 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:10:35.945847 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942912    2544 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942915    2544 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942917    2544 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942921    2544 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942923    2544 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942926    2544 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942929    2544 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.942932    2544 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943008    2544 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943016    2544 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943023    2544 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943028    2544 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943033    2544 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943037    2544 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943043    2544 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943050    2544 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943053    2544 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943056    2544 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943060    2544 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943063    2544 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943066    2544 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943069    2544 flags.go:64] FLAG: --cgroup-root=""
Apr 21 15:10:35.946328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943072    2544 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943075    2544 flags.go:64] FLAG: --client-ca-file=""
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943078    2544 flags.go:64] FLAG: --cloud-config=""
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943081    2544 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943084    2544 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943088    2544 flags.go:64] FLAG: --cluster-domain=""
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943091    2544 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943095    2544 flags.go:64] FLAG: --config-dir=""
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943098    2544 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943101    2544 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943105    2544 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943108    2544 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943111    2544 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943116    2544 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943119    2544 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943122    2544 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943125    2544 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943129    2544 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943132    2544 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943136    2544 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943140    2544 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943143    2544 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943146    2544 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943150    2544 flags.go:64] FLAG: --enable-server="true"
Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943154    2544 flags.go:64] FLAG:
--enforce-node-allocatable="[pods]" Apr 21 15:10:35.946925 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943159 2544 flags.go:64] FLAG: --event-burst="100" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943162 2544 flags.go:64] FLAG: --event-qps="50" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943165 2544 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943168 2544 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943172 2544 flags.go:64] FLAG: --eviction-hard="" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943176 2544 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943179 2544 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943182 2544 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943186 2544 flags.go:64] FLAG: --eviction-soft="" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943189 2544 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943192 2544 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943195 2544 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943198 2544 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943201 2544 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 15:10:35.947542 ip-10-0-137-168 
kubenswrapper[2544]: I0421 15:10:35.943204 2544 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943207 2544 flags.go:64] FLAG: --feature-gates="" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943211 2544 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943214 2544 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943217 2544 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943220 2544 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943224 2544 flags.go:64] FLAG: --healthz-port="10248" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943227 2544 flags.go:64] FLAG: --help="false" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943230 2544 flags.go:64] FLAG: --hostname-override="ip-10-0-137-168.ec2.internal" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943234 2544 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 15:10:35.947542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943237 2544 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943240 2544 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943243 2544 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943247 2544 flags.go:64] FLAG: --image-gc-high-threshold="85" 
Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943250 2544 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943252 2544 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943255 2544 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943258 2544 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943261 2544 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943265 2544 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943267 2544 flags.go:64] FLAG: --kube-reserved="" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943270 2544 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943273 2544 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943277 2544 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943280 2544 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943282 2544 flags.go:64] FLAG: --lock-file="" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943285 2544 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943288 2544 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943291 2544 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943297 2544 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943300 2544 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943302 2544 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943305 2544 flags.go:64] FLAG: --logging-format="text" Apr 21 15:10:35.948151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943308 2544 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943312 2544 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943315 2544 flags.go:64] FLAG: --manifest-url="" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943318 2544 flags.go:64] FLAG: --manifest-url-header="" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943322 2544 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943325 2544 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943330 2544 flags.go:64] FLAG: --max-pods="110" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943333 2544 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943336 2544 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943339 2544 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 
15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943342 2544 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943345 2544 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943348 2544 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943351 2544 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943359 2544 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943362 2544 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943365 2544 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943368 2544 flags.go:64] FLAG: --pod-cidr="" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943372 2544 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943377 2544 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943380 2544 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943383 2544 flags.go:64] FLAG: --pods-per-core="0" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943386 2544 flags.go:64] FLAG: --port="10250" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943390 2544 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 21 15:10:35.948713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943393 2544 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-052de07ae2e8ecda4" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943396 2544 flags.go:64] FLAG: --qos-reserved="" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943399 2544 flags.go:64] FLAG: --read-only-port="10255" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943402 2544 flags.go:64] FLAG: --register-node="true" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943405 2544 flags.go:64] FLAG: --register-schedulable="true" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943408 2544 flags.go:64] FLAG: --register-with-taints="" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943411 2544 flags.go:64] FLAG: --registry-burst="10" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943414 2544 flags.go:64] FLAG: --registry-qps="5" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943417 2544 flags.go:64] FLAG: --reserved-cpus="" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943420 2544 flags.go:64] FLAG: --reserved-memory="" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943424 2544 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943427 2544 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943435 2544 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943438 2544 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943441 2544 
flags.go:64] FLAG: --runonce="false" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943444 2544 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943448 2544 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943451 2544 flags.go:64] FLAG: --seccomp-default="false" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943454 2544 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943456 2544 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943459 2544 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943463 2544 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943466 2544 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943468 2544 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943471 2544 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943474 2544 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 15:10:35.949327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943478 2544 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943482 2544 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943485 2544 flags.go:64] FLAG: --system-cgroups="" Apr 21 
15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943488 2544 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943494 2544 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943497 2544 flags.go:64] FLAG: --tls-cert-file="" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943500 2544 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943504 2544 flags.go:64] FLAG: --tls-min-version="" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943507 2544 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943513 2544 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943516 2544 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943519 2544 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943522 2544 flags.go:64] FLAG: --v="2" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943527 2544 flags.go:64] FLAG: --version="false" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943530 2544 flags.go:64] FLAG: --vmodule="" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943535 2544 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943538 2544 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943631 2544 
feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943637 2544 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943639 2544 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943642 2544 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943645 2544 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943648 2544 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943650 2544 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:10:35.949999 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943653 2544 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943655 2544 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943658 2544 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943661 2544 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943663 2544 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943666 2544 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 
15:10:35.943669 2544 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943671 2544 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943674 2544 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943679 2544 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943682 2544 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943685 2544 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943690 2544 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943693 2544 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943696 2544 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943699 2544 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943702 2544 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943706 2544 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943709 2544 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 15:10:35.950602 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943713 2544 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943716 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943719 2544 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943722 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943724 2544 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943727 2544 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943730 2544 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943733 2544 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943736 2544 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943739 2544 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943741 2544 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943744 2544 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:10:35.951138 
ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943746 2544 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943765 2544 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943769 2544 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943772 2544 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943775 2544 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943778 2544 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943780 2544 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943783 2544 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:10:35.951138 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943786 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943789 2544 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943791 2544 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943795 2544 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943798 2544 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943802 2544 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943805 2544 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943807 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943810 2544 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943813 2544 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943815 2544 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943818 2544 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943821 2544 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943824 2544 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943827 2544 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943829 2544 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943832 2544 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943835 2544 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943837 2544 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:10:35.951636 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943841 2544 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943844 2544 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943846 2544 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943849 2544 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943852 2544 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943855 2544 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943857 2544 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943860 2544 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943862 2544 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943865 2544 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 
15:10:35.943867 2544 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943870 2544 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943873 2544 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943875 2544 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943878 2544 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943881 2544 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943883 2544 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943886 2544 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943890 2544 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943892 2544 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:10:35.952122 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.943895 2544 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.943904 2544 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.950699 2544 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.950720 2544 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950804 2544 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950810 2544 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950813 2544 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950816 2544 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950820 2544 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950823 2544 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950826 2544 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950829 2544 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950831 2544 feature_gate.go:328] unrecognized feature gate: PinnedImages 
Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950834 2544 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950837 2544 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:10:35.952622 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950839 2544 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950842 2544 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950845 2544 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950847 2544 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950851 2544 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950854 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950857 2544 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950861 2544 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950863 2544 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950866 2544 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950868 2544 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950871 2544 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950874 2544 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950877 2544 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950880 2544 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950883 2544 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950885 2544 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950888 2544 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950891 2544 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950894 2544 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:10:35.953059 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950898 2544 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950900 2544 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950903 2544 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950905 2544 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950908 2544 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950911 2544 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950914 2544 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950916 2544 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950919 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950921 2544 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950924 2544 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950927 2544 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950929 2544 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950933 2544 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950935 2544 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950938 2544 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950940 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950943 2544 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950946 2544 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950949 2544 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:10:35.953551 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950952 2544 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950956 2544 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950962 2544 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950966 2544 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950969 2544 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950972 2544 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950974 2544 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950977 2544 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950980 2544 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950982 2544 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950985 2544 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950987 2544 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950990 2544 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950993 2544 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950996 2544 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.950999 2544 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951001 2544 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951004 2544 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951006 2544 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:10:35.954052 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951009 2544 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951011 2544 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951014 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951016 2544 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951019 2544 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951022 2544 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951025 2544 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951029 2544 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951032 2544 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951035 2544 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951038 2544 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951042 2544 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951045 2544 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951048 2544 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951051 2544 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:10:35.954519 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951053 2544 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.951059 2544 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951177 2544 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951182 2544 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951185 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951188 2544 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951191 2544 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951194 2544 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951196 2544 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951200 2544 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951203 2544 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951205 2544 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951209 2544 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951212 2544 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951214 2544 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:10:35.954910 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951217 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951220 2544 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951222 2544 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951224 2544 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951227 2544 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951230 2544 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951232 2544 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951235 2544 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951237 2544 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951240 2544 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951242 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951245 2544 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951247 2544 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951250 2544 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951253 2544 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951256 2544 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951259 2544 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951261 2544 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951264 2544 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951266 2544 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:10:35.955292 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951269 2544 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951272 2544 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951274 2544 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951277 2544 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951279 2544 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951282 2544 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951284 2544 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951287 2544 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951289 2544 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951292 2544 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951295 2544 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951298 2544 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951301 2544 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951303 2544 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951305 2544 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951308 2544 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951310 2544 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951313 2544 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951315 2544 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:10:35.955796 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951318 2544 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951320 2544 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951323 2544 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951325 2544 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951328 2544 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951331 2544 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951333 2544 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951336 2544 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951338 2544 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951342 2544 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951345 2544 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951349 2544 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951352 2544 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951355 2544 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951358 2544 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951361 2544 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951363 2544 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951366 2544 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951369 2544 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951372 2544 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:10:35.956288 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951374 2544 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951377 2544 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951379 2544 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951381 2544 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951385 2544 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951387 2544 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951390 2544 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951392 2544 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951395 2544 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951397 2544 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951399 2544 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951403 2544 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951406 2544 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:35.951409 2544 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.951414 2544 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 15:10:35.956957 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.952206 2544 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 15:10:35.957358 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.956011 2544 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 15:10:35.957358 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.957044 2544 server.go:1019] "Starting client certificate rotation"
Apr 21 15:10:35.957358 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.957141 2544 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:10:35.957358 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.957175 2544 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:10:35.987800 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.987780 2544 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:10:35.989867 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:35.989849 2544 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:10:36.005584 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.005557 2544 log.go:25] "Validated CRI v1 runtime API"
Apr 21 15:10:36.011361 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.011343 2544 log.go:25] "Validated CRI v1 image API"
Apr 21 15:10:36.014593 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.014568 2544 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 15:10:36.014734 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.014684 2544 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 15:10:36.017673 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.017643 2544 fs.go:135] Filesystem UUIDs: map[0d8251f2-e3b3-4052-8f74-cd495fa522e2:/dev/nvme0n1p3 600ab70d-bcb0-4741-8ecf-2f3eea83f963:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 21 15:10:36.017785 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.017671 2544 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 15:10:36.022917 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.022789 2544 manager.go:217] Machine: {Timestamp:2026-04-21 15:10:36.021548984 +0000 UTC m=+0.427957239 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3105567 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20aabe93b18479c50c570b0bf0c53d SystemUUID:ec20aabe-93b1-8479-c50c-570b0bf0c53d BootID:4aca274e-c607-4735-bcaf-5ccc96af5659 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e5:de:a2:8f:45 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e5:de:a2:8f:45 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:ad:4f:53:64:65 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 15:10:36.022917 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.022903 2544 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 15:10:36.023094 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.023026 2544 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 15:10:36.024833 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.024800 2544 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 15:10:36.025018 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.024833 2544 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-168.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 15:10:36.025096 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.025037 2544 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 15:10:36.025096 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.025049 2544 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 15:10:36.025096 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.025068 2544 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:10:36.025963 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.025950 2544 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:10:36.028131 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.028117 2544 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 15:10:36.028262 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.028251 2544 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 15:10:36.030501 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.030490 2544 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 15:10:36.031227 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.031217 2544 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 15:10:36.031290 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.031244 2544 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 15:10:36.031290 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.031259 2544 kubelet.go:397] "Adding apiserver pod source"
Apr 21 15:10:36.031290 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.031272 2544 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 15:10:36.032264 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.032250 2544 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:10:36.032329 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.032273 2544 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:10:36.035067 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.035043 2544 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 15:10:36.036636 ip-10-0-137-168
kubenswrapper[2544]: I0421 15:10:36.036621 2544 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 15:10:36.038621 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038605 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 15:10:36.038680 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038627 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 15:10:36.038680 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038637 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 15:10:36.038680 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038646 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 15:10:36.038680 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038654 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 15:10:36.038680 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038662 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 15:10:36.038680 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038672 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 15:10:36.038680 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038680 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 15:10:36.038894 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038691 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 15:10:36.038894 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038700 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 15:10:36.038894 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038712 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 
15:10:36.038894 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.038723 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 15:10:36.039572 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.039561 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 15:10:36.039607 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.039574 2544 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 15:10:36.042079 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.042059 2544 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4cppx" Apr 21 15:10:36.043073 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.043059 2544 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 15:10:36.043155 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.043102 2544 server.go:1295] "Started kubelet" Apr 21 15:10:36.044516 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.044473 2544 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 15:10:36.044653 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.044620 2544 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-168.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 15:10:36.045208 ip-10-0-137-168 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 15:10:36.046527 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.046508 2544 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-168.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 15:10:36.046873 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.046685 2544 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 15:10:36.050379 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.048018 2544 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 15:10:36.050578 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.050561 2544 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 15:10:36.051984 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.051960 2544 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4cppx" Apr 21 15:10:36.052200 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.052176 2544 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 15:10:36.052761 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.052732 2544 server.go:317] "Adding debug handlers to kubelet server" Apr 21 15:10:36.055154 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.054153 2544 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-168.ec2.internal.18a867d94fdc74d4 default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-168.ec2.internal,UID:ip-10-0-137-168.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-168.ec2.internal,},FirstTimestamp:2026-04-21 15:10:36.043072724 +0000 UTC m=+0.449480977,LastTimestamp:2026-04-21 15:10:36.043072724 +0000 UTC m=+0.449480977,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-168.ec2.internal,}" Apr 21 15:10:36.058020 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.057998 2544 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 15:10:36.058427 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.058409 2544 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 15:10:36.058995 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.058971 2544 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 15:10:36.059224 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.059191 2544 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-168.ec2.internal\" not found" Apr 21 15:10:36.059347 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059314 2544 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 15:10:36.059347 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059314 2544 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 15:10:36.059347 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059340 2544 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 15:10:36.059481 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059394 2544 reconstruct.go:97] "Volume reconstruction finished" Apr 21 15:10:36.059481 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059404 2544 reconciler.go:26] "Reconciler: start to sync state" Apr 21 15:10:36.059567 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059560 2544 factory.go:153] Registering CRI-O factory Apr 21 15:10:36.059615 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059601 2544 factory.go:223] Registration of the crio container factory successfully Apr 21 15:10:36.059703 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059645 2544 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 15:10:36.059703 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059652 2544 factory.go:55] Registering systemd factory Apr 21 15:10:36.059703 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059657 2544 factory.go:223] Registration of the systemd container factory successfully Apr 21 
15:10:36.059703 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059673 2544 factory.go:103] Registering Raw factory Apr 21 15:10:36.059703 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.059681 2544 manager.go:1196] Started watching for new ooms in manager Apr 21 15:10:36.060052 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.060040 2544 manager.go:319] Starting recovery of all containers Apr 21 15:10:36.061531 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.061512 2544 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:36.064309 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.064285 2544 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-168.ec2.internal\" not found" node="ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.068573 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.068555 2544 manager.go:324] Recovery completed Apr 21 15:10:36.072588 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.072574 2544 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:36.075223 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.075206 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:36.075298 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.075241 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:36.075298 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.075252 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:36.075744 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.075729 2544 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 15:10:36.075744 
ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.075742 2544 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 15:10:36.075903 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.075778 2544 state_mem.go:36] "Initialized new in-memory state store" Apr 21 15:10:36.078166 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.078151 2544 policy_none.go:49] "None policy: Start" Apr 21 15:10:36.078238 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.078170 2544 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 15:10:36.078238 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.078183 2544 state_mem.go:35] "Initializing new in-memory state store" Apr 21 15:10:36.124399 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.124380 2544 manager.go:341] "Starting Device Plugin manager" Apr 21 15:10:36.127103 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.124417 2544 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 15:10:36.127103 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.124429 2544 server.go:85] "Starting device plugin registration server" Apr 21 15:10:36.127103 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.124700 2544 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 15:10:36.127103 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.124711 2544 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 15:10:36.127103 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.124831 2544 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 15:10:36.127103 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.124893 2544 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 15:10:36.127103 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.124901 2544 plugin_manager.go:118] "Starting Kubelet 
Plugin Manager" Apr 21 15:10:36.127103 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.125808 2544 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 15:10:36.127103 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.125841 2544 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-168.ec2.internal\" not found" Apr 21 15:10:36.200049 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.199957 2544 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 15:10:36.201229 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.201212 2544 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 15:10:36.201322 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.201242 2544 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 15:10:36.201322 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.201270 2544 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 15:10:36.201322 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.201279 2544 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 15:10:36.201431 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.201321 2544 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 15:10:36.204131 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.204111 2544 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:36.224832 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.224806 2544 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:36.225953 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.225936 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:36.226055 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.225965 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:36.226055 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.225975 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:36.226055 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.226003 2544 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.235625 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.235588 2544 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.235625 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.235622 2544 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-168.ec2.internal\": node \"ip-10-0-137-168.ec2.internal\" not found" Apr 21 
15:10:36.256872 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.256851 2544 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-168.ec2.internal\" not found" Apr 21 15:10:36.301919 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.301888 2544 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-168.ec2.internal"] Apr 21 15:10:36.301999 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.301967 2544 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:36.304669 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.304643 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:36.304834 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.304687 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:36.304834 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.304703 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:36.306232 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.306214 2544 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:36.306371 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.306348 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.306481 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.306403 2544 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:36.307091 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.307073 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:36.307191 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.307098 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:36.307191 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.307109 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:36.307191 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.307081 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:36.307191 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.307179 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:36.307381 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.307196 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:36.308451 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.308436 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.308507 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.308467 2544 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:36.309196 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.309180 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:36.309277 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.309203 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:36.309277 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.309214 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:36.335706 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.335676 2544 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-168.ec2.internal\" not found" node="ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.340241 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.340221 2544 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-168.ec2.internal\" not found" node="ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.357157 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.357125 2544 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-168.ec2.internal\" not found" Apr 21 15:10:36.457839 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.457724 2544 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-168.ec2.internal\" not found" Apr 21 15:10:36.461034 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.461014 2544 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6fce942d3fd65daddcc4345d05bd271a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal\" (UID: \"6fce942d3fd65daddcc4345d05bd271a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.461132 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.461048 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fce942d3fd65daddcc4345d05bd271a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal\" (UID: \"6fce942d3fd65daddcc4345d05bd271a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.461132 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.461076 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d8e31eb06d4e26e4e1d995160527f9b1-config\") pod \"kube-apiserver-proxy-ip-10-0-137-168.ec2.internal\" (UID: \"d8e31eb06d4e26e4e1d995160527f9b1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.558476 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.558438 2544 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-168.ec2.internal\" not found" Apr 21 15:10:36.561731 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.561709 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6fce942d3fd65daddcc4345d05bd271a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal\" (UID: \"6fce942d3fd65daddcc4345d05bd271a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal" Apr 21 
15:10:36.561819 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.561742 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fce942d3fd65daddcc4345d05bd271a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal\" (UID: \"6fce942d3fd65daddcc4345d05bd271a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.561819 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.561784 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d8e31eb06d4e26e4e1d995160527f9b1-config\") pod \"kube-apiserver-proxy-ip-10-0-137-168.ec2.internal\" (UID: \"d8e31eb06d4e26e4e1d995160527f9b1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.561886 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.561827 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d8e31eb06d4e26e4e1d995160527f9b1-config\") pod \"kube-apiserver-proxy-ip-10-0-137-168.ec2.internal\" (UID: \"d8e31eb06d4e26e4e1d995160527f9b1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.561886 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.561838 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fce942d3fd65daddcc4345d05bd271a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal\" (UID: \"6fce942d3fd65daddcc4345d05bd271a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal" Apr 21 15:10:36.561886 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.561829 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/6fce942d3fd65daddcc4345d05bd271a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal\" (UID: \"6fce942d3fd65daddcc4345d05bd271a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal"
Apr 21 15:10:36.638923 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.638893 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal"
Apr 21 15:10:36.642839 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.642817 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-168.ec2.internal"
Apr 21 15:10:36.659492 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.659461 2544 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-168.ec2.internal\" not found"
Apr 21 15:10:36.759975 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.759885 2544 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-168.ec2.internal\" not found"
Apr 21 15:10:36.860352 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.860322 2544 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-168.ec2.internal\" not found"
Apr 21 15:10:36.956733 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.956692 2544 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 15:10:36.957451 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.956879 2544 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 15:10:36.957451 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.956915 2544 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 15:10:36.960992 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:36.960971 2544 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-168.ec2.internal\" not found"
Apr 21 15:10:36.975856 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:36.975829 2544 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:10:37.032190 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.032099 2544 apiserver.go:52] "Watching apiserver"
Apr 21 15:10:37.041528 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.041498 2544 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 15:10:37.041963 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.041942 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-sm5q9","openshift-network-operator/iptables-alerter-hmf4l","openshift-image-registry/node-ca-9lsjs","openshift-multus/multus-additional-cni-plugins-r8gk4","openshift-multus/network-metrics-daemon-84hkv","openshift-ovn-kubernetes/ovnkube-node-hj57q","kube-system/konnectivity-agent-5s2h5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s","openshift-cluster-node-tuning-operator/tuned-nv5tm","openshift-dns/node-resolver-tbnxq","openshift-multus/multus-rwxvz"]
Apr 21 15:10:37.043881 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.043858 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:37.044001 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.043942 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88"
Apr 21 15:10:37.047396 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.047373 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hmf4l"
Apr 21 15:10:37.047495 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.047453 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9lsjs"
Apr 21 15:10:37.049413 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.049392 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r8gk4"
Apr 21 15:10:37.050213 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.050187 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 15:10:37.050569 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.050287 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 15:10:37.050569 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.050307 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 15:10:37.050569 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.050442 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 15:10:37.050725 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.050602 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:10:37.051181 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.051159 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 15:10:37.051830 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.051558 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4dxv9\""
Apr 21 15:10:37.051830 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.051619 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-596kz\""
Apr 21 15:10:37.051830 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.051677 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 15:10:37.052884 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.051877 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wpfg2\""
Apr 21 15:10:37.052884 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.052199 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:37.052884 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.052318 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428"
Apr 21 15:10:37.053086 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.053070 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 15:10:37.053678 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.053654 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 15:10:37.053826 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.053660 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 15:10:37.053826 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.053706 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 15:10:37.054845 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.054813 2544 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:05:36 +0000 UTC" deadline="2027-12-30 18:16:27.303506728 +0000 UTC"
Apr 21 15:10:37.054845 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.054844 2544 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14835h5m50.248666518s"
Apr 21 15:10:37.054990 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.054976 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.056992 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.056972 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5s2h5"
Apr 21 15:10:37.057633 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.057614 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 15:10:37.057737 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.057633 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 15:10:37.057737 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.057678 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 15:10:37.057737 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.057682 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 15:10:37.057924 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.057682 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 15:10:37.057924 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.057684 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h92dl\""
Apr 21 15:10:37.058096 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.058081 2544 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 15:10:37.058282 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.058260 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 15:10:37.058576 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.058560 2544 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal"
Apr 21 15:10:37.059300 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.059276 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s"
Apr 21 15:10:37.059434 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.059299 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9rlwd\""
Apr 21 15:10:37.059434 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.059338 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 15:10:37.060207 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.060162 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 15:10:37.061889 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.061873 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.062032 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.062014 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 15:10:37.062180 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.062020 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 15:10:37.062180 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.062162 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kb2k2\""
Apr 21 15:10:37.062293 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.062266 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 15:10:37.064301 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064279 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-system-cni-dir\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4"
Apr 21 15:10:37.064399 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064315 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-os-release\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4"
Apr 21 15:10:37.064399 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064359 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tbnxq"
Apr 21 15:10:37.064487 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064354 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4"
Apr 21 15:10:37.064536 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064522 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw7vr\" (UniqueName: \"kubernetes.io/projected/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-kube-api-access-lw7vr\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:37.064584 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064556 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-run-netns\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.064661 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064596 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-sysctl-d\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.064661 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064621 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d048a9ec-b8d0-42a8-9384-5fe347a8873b-ovnkube-script-lib\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.064839 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064675 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-registration-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s"
Apr 21 15:10:37.064839 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064707 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-node-log\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.064839 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064734 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/013ca451-549d-478c-941e-0e9994b24c34-iptables-alerter-script\") pod \"iptables-alerter-hmf4l\" (UID: \"013ca451-549d-478c-941e-0e9994b24c34\") " pod="openshift-network-operator/iptables-alerter-hmf4l"
Apr 21 15:10:37.064839 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064805 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/67d06525-c590-4bd2-8237-2f7fa8b1d779-agent-certs\") pod \"konnectivity-agent-5s2h5\" (UID: \"67d06525-c590-4bd2-8237-2f7fa8b1d779\") " pod="kube-system/konnectivity-agent-5s2h5"
Apr 21 15:10:37.064839 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064831 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/67d06525-c590-4bd2-8237-2f7fa8b1d779-konnectivity-ca\") pod \"konnectivity-agent-5s2h5\" (UID: \"67d06525-c590-4bd2-8237-2f7fa8b1d779\") " pod="kube-system/konnectivity-agent-5s2h5"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064876 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-socket-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064899 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a30a0d0-d747-4359-a316-b1d4215f71f3-tmp\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064925 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh8cf\" (UniqueName: \"kubernetes.io/projected/8a30a0d0-d747-4359-a316-b1d4215f71f3-kube-api-access-jh8cf\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064934 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064956 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064958 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-cni-netd\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.064989 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-g5fts\""
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065008 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d048a9ec-b8d0-42a8-9384-5fe347a8873b-ovn-node-metrics-cert\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065032 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065052 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-systemd\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065076 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065091 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-sys-fs\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065106 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065126 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwbln\" (UniqueName: \"kubernetes.io/projected/c23d088b-b54c-4874-aac7-248c3a09117a-kube-api-access-dwbln\") pod \"node-ca-9lsjs\" (UID: \"c23d088b-b54c-4874-aac7-248c3a09117a\") " pod="openshift-image-registry/node-ca-9lsjs"
Apr 21 15:10:37.065163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065163 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49309d1e-9f62-4e92-8114-acfac3171dc5-cni-binary-copy\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065185 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/49309d1e-9f62-4e92-8114-acfac3171dc5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065211 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-kubernetes\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065234 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-lib-modules\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065251 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-systemd-units\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065273 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-etc-openvswitch\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065296 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-run-ovn\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065311 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-run-ovn-kubernetes\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065325 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c23d088b-b54c-4874-aac7-248c3a09117a-host\") pod \"node-ca-9lsjs\" (UID: \"c23d088b-b54c-4874-aac7-248c3a09117a\") " pod="openshift-image-registry/node-ca-9lsjs"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065348 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c23d088b-b54c-4874-aac7-248c3a09117a-serviceca\") pod \"node-ca-9lsjs\" (UID: \"c23d088b-b54c-4874-aac7-248c3a09117a\") " pod="openshift-image-registry/node-ca-9lsjs"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065367 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-sys\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065382 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-slash\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065400 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-var-lib-openvswitch\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065420 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-log-socket\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065443 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch6qb\" (UniqueName: \"kubernetes.io/projected/49309d1e-9f62-4e92-8114-acfac3171dc5-kube-api-access-ch6qb\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065489 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-tuned\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.065746 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065522 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-kubelet\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065555 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-cni-bin\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065610 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065629 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqmjl\" (UniqueName: \"kubernetes.io/projected/013ca451-549d-478c-941e-0e9994b24c34-kube-api-access-wqmjl\") pod \"iptables-alerter-hmf4l\" (UID: \"013ca451-549d-478c-941e-0e9994b24c34\") " pod="openshift-network-operator/iptables-alerter-hmf4l"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065651 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-var-lib-kubelet\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065673 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-run\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065690 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/013ca451-549d-478c-941e-0e9994b24c34-host-slash\") pod \"iptables-alerter-hmf4l\" (UID: \"013ca451-549d-478c-941e-0e9994b24c34\") " pod="openshift-network-operator/iptables-alerter-hmf4l"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065705 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-cnibin\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065731 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6tfr\" (UniqueName: \"kubernetes.io/projected/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-kube-api-access-d6tfr\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065770 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-host\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065790 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-modprobe-d\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065808 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-device-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065832 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d048a9ec-b8d0-42a8-9384-5fe347a8873b-ovnkube-config\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065875 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d048a9ec-b8d0-42a8-9384-5fe347a8873b-env-overrides\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065921 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j9lm\" (UniqueName: \"kubernetes.io/projected/d048a9ec-b8d0-42a8-9384-5fe347a8873b-kube-api-access-6j9lm\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.065966 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-run-openvswitch\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.066692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.066001 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-sysconfig\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.067324 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.066025 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-sysctl-conf\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm"
Apr 21 15:10:37.067324 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.066048 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gx6f\" (UniqueName: \"kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f\") pod \"network-check-target-sm5q9\" (UID: \"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88\") " pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:37.067324 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.066071 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-run-systemd\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:10:37.067324 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.066095 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/49309d1e-9f62-4e92-8114-acfac3171dc5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4"
Apr 21 15:10:37.067324 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.066630 2544
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 15:10:37.067324 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.066714 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 15:10:37.067324 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.066886 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-kf7kv\"" Apr 21 15:10:37.067324 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.066967 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.068193 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.068175 2544 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 15:10:37.069178 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.069162 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 15:10:37.069252 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.069234 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j4nnx\"" Apr 21 15:10:37.072539 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.072519 2544 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:10:37.072666 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.072595 2544 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-168.ec2.internal" Apr 21 15:10:37.072666 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.072609 2544 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal"] Apr 21 15:10:37.079489 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.079470 2544 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:10:37.079683 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.079667 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-137-168.ec2.internal"] Apr 21 15:10:37.089645 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.089445 2544 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7n95j" Apr 21 15:10:37.099023 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.099000 2544 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7n95j" Apr 21 15:10:37.160832 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.160802 2544 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 15:10:37.166574 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166544 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-system-cni-dir\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.166715 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166588 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-conf-dir\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " 
pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.166715 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166620 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-os-release\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.166715 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166649 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.166715 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166659 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-system-cni-dir\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.166715 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166675 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw7vr\" (UniqueName: \"kubernetes.io/projected/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-kube-api-access-lw7vr\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:10:37.166897 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166702 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-run-netns\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.166897 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166721 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-os-release\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.166897 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166747 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-sysctl-d\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.166897 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166797 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.166897 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166808 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-run-netns\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.166897 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166830 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d048a9ec-b8d0-42a8-9384-5fe347a8873b-ovnkube-script-lib\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.166897 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166850 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-registration-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.166897 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166876 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-node-log\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.166897 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166894 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/013ca451-549d-478c-941e-0e9994b24c34-iptables-alerter-script\") pod \"iptables-alerter-hmf4l\" (UID: \"013ca451-549d-478c-941e-0e9994b24c34\") " pod="openshift-network-operator/iptables-alerter-hmf4l" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166908 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-sysctl-d\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166909 2544 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/67d06525-c590-4bd2-8237-2f7fa8b1d779-agent-certs\") pod \"konnectivity-agent-5s2h5\" (UID: \"67d06525-c590-4bd2-8237-2f7fa8b1d779\") " pod="kube-system/konnectivity-agent-5s2h5" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166949 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-registration-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166959 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-node-log\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.166958 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/67d06525-c590-4bd2-8237-2f7fa8b1d779-konnectivity-ca\") pod \"konnectivity-agent-5s2h5\" (UID: \"67d06525-c590-4bd2-8237-2f7fa8b1d779\") " pod="kube-system/konnectivity-agent-5s2h5" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167019 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-socket-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.167302 ip-10-0-137-168 
kubenswrapper[2544]: I0421 15:10:37.167056 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c5594cf9-4b09-4077-9a0a-c6e1e4145792-hosts-file\") pod \"node-resolver-tbnxq\" (UID: \"c5594cf9-4b09-4077-9a0a-c6e1e4145792\") " pod="openshift-dns/node-resolver-tbnxq" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167084 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-system-cni-dir\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167112 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-cnibin\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167137 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a30a0d0-d747-4359-a316-b1d4215f71f3-tmp\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167163 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jh8cf\" (UniqueName: \"kubernetes.io/projected/8a30a0d0-d747-4359-a316-b1d4215f71f3-kube-api-access-jh8cf\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.167302 ip-10-0-137-168 
kubenswrapper[2544]: I0421 15:10:37.167164 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-socket-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167204 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-cni-netd\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167230 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d048a9ec-b8d0-42a8-9384-5fe347a8873b-ovn-node-metrics-cert\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167229 2544 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167253 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167279 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-cni-dir\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.167302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167306 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-run-k8s-cni-cncf-io\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167354 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-var-lib-cni-bin\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167381 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-systemd\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167417 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167427 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-cni-netd\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167449 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-sys-fs\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167476 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-var-lib-kubelet\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167505 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-n7scx\" (UniqueName: \"kubernetes.io/projected/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-kube-api-access-n7scx\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167534 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/013ca451-549d-478c-941e-0e9994b24c34-iptables-alerter-script\") pod \"iptables-alerter-hmf4l\" (UID: \"013ca451-549d-478c-941e-0e9994b24c34\") " pod="openshift-network-operator/iptables-alerter-hmf4l" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167540 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167588 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/67d06525-c590-4bd2-8237-2f7fa8b1d779-konnectivity-ca\") pod \"konnectivity-agent-5s2h5\" (UID: \"67d06525-c590-4bd2-8237-2f7fa8b1d779\") " pod="kube-system/konnectivity-agent-5s2h5" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167608 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwbln\" (UniqueName: \"kubernetes.io/projected/c23d088b-b54c-4874-aac7-248c3a09117a-kube-api-access-dwbln\") pod \"node-ca-9lsjs\" (UID: \"c23d088b-b54c-4874-aac7-248c3a09117a\") " pod="openshift-image-registry/node-ca-9lsjs" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 
15:10:37.167635 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49309d1e-9f62-4e92-8114-acfac3171dc5-cni-binary-copy\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167664 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/49309d1e-9f62-4e92-8114-acfac3171dc5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167668 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167684 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d048a9ec-b8d0-42a8-9384-5fe347a8873b-ovnkube-script-lib\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167695 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-daemon-config\") pod \"multus-rwxvz\" (UID: 
\"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.168146 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167719 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-kubernetes\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167786 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-lib-modules\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167794 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-systemd\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167828 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-kubernetes\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.167883 2544 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.167943 2544 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs podName:3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:37.667923517 +0000 UTC m=+2.074331779 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs") pod "network-metrics-daemon-84hkv" (UID: "3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.167970 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168241 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-sys-fs\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168250 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-lib-modules\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168299 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-systemd-units\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168337 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-etc-openvswitch\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168349 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/49309d1e-9f62-4e92-8114-acfac3171dc5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168361 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-run-ovn\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168387 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49309d1e-9f62-4e92-8114-acfac3171dc5-cni-binary-copy\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168393 2544 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-etc-openvswitch\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168392 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-systemd-units\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168440 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-run-ovn-kubernetes\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.168986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168463 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-run-ovn\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168479 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c23d088b-b54c-4874-aac7-248c3a09117a-host\") pod \"node-ca-9lsjs\" (UID: \"c23d088b-b54c-4874-aac7-248c3a09117a\") " pod="openshift-image-registry/node-ca-9lsjs" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168494 2544 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-run-ovn-kubernetes\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168505 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c23d088b-b54c-4874-aac7-248c3a09117a-serviceca\") pod \"node-ca-9lsjs\" (UID: \"c23d088b-b54c-4874-aac7-248c3a09117a\") " pod="openshift-image-registry/node-ca-9lsjs" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168529 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c23d088b-b54c-4874-aac7-248c3a09117a-host\") pod \"node-ca-9lsjs\" (UID: \"c23d088b-b54c-4874-aac7-248c3a09117a\") " pod="openshift-image-registry/node-ca-9lsjs" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168529 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-sys\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168569 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-sys\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168588 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-slash\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168615 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-var-lib-openvswitch\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168640 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-log-socket\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168656 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-slash\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168666 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ch6qb\" (UniqueName: \"kubernetes.io/projected/49309d1e-9f62-4e92-8114-acfac3171dc5-kube-api-access-ch6qb\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168695 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-var-lib-openvswitch\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168697 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpcfn\" (UniqueName: \"kubernetes.io/projected/c5594cf9-4b09-4077-9a0a-c6e1e4145792-kube-api-access-zpcfn\") pod \"node-resolver-tbnxq\" (UID: \"c5594cf9-4b09-4077-9a0a-c6e1e4145792\") " pod="openshift-dns/node-resolver-tbnxq" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168729 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-os-release\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168739 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-log-socket\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168771 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-tuned\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168801 2544 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-kubelet\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.169789 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168827 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-cni-bin\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168852 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168878 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqmjl\" (UniqueName: \"kubernetes.io/projected/013ca451-549d-478c-941e-0e9994b24c34-kube-api-access-wqmjl\") pod \"iptables-alerter-hmf4l\" (UID: \"013ca451-549d-478c-941e-0e9994b24c34\") " pod="openshift-network-operator/iptables-alerter-hmf4l" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168927 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c23d088b-b54c-4874-aac7-248c3a09117a-serviceca\") pod \"node-ca-9lsjs\" (UID: \"c23d088b-b54c-4874-aac7-248c3a09117a\") " pod="openshift-image-registry/node-ca-9lsjs" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168902 2544 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-var-lib-kubelet\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168993 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-run\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.168988 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-var-lib-kubelet\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169050 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-kubelet\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169091 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-host-cni-bin\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169099 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169120 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/013ca451-549d-478c-941e-0e9994b24c34-host-slash\") pod \"iptables-alerter-hmf4l\" (UID: \"013ca451-549d-478c-941e-0e9994b24c34\") " pod="openshift-network-operator/iptables-alerter-hmf4l" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169148 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-cnibin\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169173 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6tfr\" (UniqueName: \"kubernetes.io/projected/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-kube-api-access-d6tfr\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169217 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-run-multus-certs\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.170595 ip-10-0-137-168 
kubenswrapper[2544]: I0421 15:10:37.169222 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/013ca451-549d-478c-941e-0e9994b24c34-host-slash\") pod \"iptables-alerter-hmf4l\" (UID: \"013ca451-549d-478c-941e-0e9994b24c34\") " pod="openshift-network-operator/iptables-alerter-hmf4l" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169245 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-host\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169269 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-modprobe-d\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.170595 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169297 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-device-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169337 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49309d1e-9f62-4e92-8114-acfac3171dc5-cnibin\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.171137 
ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169350 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-device-dir\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169370 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-socket-dir-parent\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169397 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-host\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169401 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-etc-kubernetes\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169430 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d048a9ec-b8d0-42a8-9384-5fe347a8873b-ovnkube-config\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 
15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169052 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-run\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169456 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d048a9ec-b8d0-42a8-9384-5fe347a8873b-env-overrides\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169495 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j9lm\" (UniqueName: \"kubernetes.io/projected/d048a9ec-b8d0-42a8-9384-5fe347a8873b-kube-api-access-6j9lm\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169492 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-modprobe-d\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169527 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5594cf9-4b09-4077-9a0a-c6e1e4145792-tmp-dir\") pod \"node-resolver-tbnxq\" (UID: \"c5594cf9-4b09-4077-9a0a-c6e1e4145792\") " pod="openshift-dns/node-resolver-tbnxq" Apr 21 
15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169554 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-run-openvswitch\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169580 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-sysconfig\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169621 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-sysctl-conf\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169656 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gx6f\" (UniqueName: \"kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f\") pod \"network-check-target-sm5q9\" (UID: \"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88\") " pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169681 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-run-netns\") pod \"multus-rwxvz\" (UID: 
\"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.171137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169707 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-var-lib-cni-multus\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169735 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-hostroot\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169784 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-run-systemd\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169811 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/49309d1e-9f62-4e92-8114-acfac3171dc5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169837 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-cni-binary-copy\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169896 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d048a9ec-b8d0-42a8-9384-5fe347a8873b-env-overrides\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169930 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-run-openvswitch\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169973 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-sysconfig\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.169993 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d048a9ec-b8d0-42a8-9384-5fe347a8873b-run-systemd\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.170086 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-sysctl-conf\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.170174 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d048a9ec-b8d0-42a8-9384-5fe347a8873b-ovnkube-config\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.170301 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a30a0d0-d747-4359-a316-b1d4215f71f3-tmp\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.170533 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d048a9ec-b8d0-42a8-9384-5fe347a8873b-ovn-node-metrics-cert\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.170587 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/67d06525-c590-4bd2-8237-2f7fa8b1d779-agent-certs\") pod \"konnectivity-agent-5s2h5\" (UID: \"67d06525-c590-4bd2-8237-2f7fa8b1d779\") " pod="kube-system/konnectivity-agent-5s2h5" Apr 21 15:10:37.171583 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.170597 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/49309d1e-9f62-4e92-8114-acfac3171dc5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.172017 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.171815 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8a30a0d0-d747-4359-a316-b1d4215f71f3-etc-tuned\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.175045 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.175025 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw7vr\" (UniqueName: \"kubernetes.io/projected/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-kube-api-access-lw7vr\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:10:37.177987 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.177966 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwbln\" (UniqueName: \"kubernetes.io/projected/c23d088b-b54c-4874-aac7-248c3a09117a-kube-api-access-dwbln\") pod \"node-ca-9lsjs\" (UID: \"c23d088b-b54c-4874-aac7-248c3a09117a\") " pod="openshift-image-registry/node-ca-9lsjs" Apr 21 15:10:37.181405 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.180936 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:10:37.181405 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.180964 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:10:37.181405 
ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.180979 2544 projected.go:194] Error preparing data for projected volume kube-api-access-8gx6f for pod openshift-network-diagnostics/network-check-target-sm5q9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:37.181405 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.181063 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f podName:0bc3eca4-5765-4b43-a4f4-51b55c9f8d88 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:37.681039559 +0000 UTC m=+2.087447798 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8gx6f" (UniqueName: "kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f") pod "network-check-target-sm5q9" (UID: "0bc3eca4-5765-4b43-a4f4-51b55c9f8d88") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:37.181978 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.181919 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh8cf\" (UniqueName: \"kubernetes.io/projected/8a30a0d0-d747-4359-a316-b1d4215f71f3-kube-api-access-jh8cf\") pod \"tuned-nv5tm\" (UID: \"8a30a0d0-d747-4359-a316-b1d4215f71f3\") " pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.183584 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.183561 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqmjl\" (UniqueName: \"kubernetes.io/projected/013ca451-549d-478c-941e-0e9994b24c34-kube-api-access-wqmjl\") pod \"iptables-alerter-hmf4l\" (UID: \"013ca451-549d-478c-941e-0e9994b24c34\") " 
pod="openshift-network-operator/iptables-alerter-hmf4l" Apr 21 15:10:37.184093 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.184074 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j9lm\" (UniqueName: \"kubernetes.io/projected/d048a9ec-b8d0-42a8-9384-5fe347a8873b-kube-api-access-6j9lm\") pod \"ovnkube-node-hj57q\" (UID: \"d048a9ec-b8d0-42a8-9384-5fe347a8873b\") " pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.184444 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.184425 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6tfr\" (UniqueName: \"kubernetes.io/projected/b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6-kube-api-access-d6tfr\") pod \"aws-ebs-csi-driver-node-dhk4s\" (UID: \"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.187095 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.187075 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch6qb\" (UniqueName: \"kubernetes.io/projected/49309d1e-9f62-4e92-8114-acfac3171dc5-kube-api-access-ch6qb\") pod \"multus-additional-cni-plugins-r8gk4\" (UID: \"49309d1e-9f62-4e92-8114-acfac3171dc5\") " pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.229783 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:37.229499 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8e31eb06d4e26e4e1d995160527f9b1.slice/crio-00297935a13685be4909fc678d7c9bcb6529d2eb04499492342a728f53d360d1 WatchSource:0}: Error finding container 00297935a13685be4909fc678d7c9bcb6529d2eb04499492342a728f53d360d1: Status 404 returned error can't find the container with id 00297935a13685be4909fc678d7c9bcb6529d2eb04499492342a728f53d360d1 Apr 21 15:10:37.234547 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.234527 2544 provider.go:93] 
Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:10:37.239574 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:37.239547 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fce942d3fd65daddcc4345d05bd271a.slice/crio-80770f2a3ecb5ca1b25ee150e9397b822da739fd826ccc868b6ed530a059a7f8 WatchSource:0}: Error finding container 80770f2a3ecb5ca1b25ee150e9397b822da739fd826ccc868b6ed530a059a7f8: Status 404 returned error can't find the container with id 80770f2a3ecb5ca1b25ee150e9397b822da739fd826ccc868b6ed530a059a7f8 Apr 21 15:10:37.270246 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270211 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-conf-dir\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270246 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270251 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c5594cf9-4b09-4077-9a0a-c6e1e4145792-hosts-file\") pod \"node-resolver-tbnxq\" (UID: \"c5594cf9-4b09-4077-9a0a-c6e1e4145792\") " pod="openshift-dns/node-resolver-tbnxq" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270268 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-system-cni-dir\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270285 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-cnibin\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270308 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-cni-dir\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270314 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-conf-dir\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270332 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-run-k8s-cni-cncf-io\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270357 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-var-lib-cni-bin\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270357 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c5594cf9-4b09-4077-9a0a-c6e1e4145792-hosts-file\") 
pod \"node-resolver-tbnxq\" (UID: \"c5594cf9-4b09-4077-9a0a-c6e1e4145792\") " pod="openshift-dns/node-resolver-tbnxq" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270375 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-cnibin\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270362 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-system-cni-dir\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270389 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-var-lib-kubelet\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270385 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-run-k8s-cni-cncf-io\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270406 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7scx\" (UniqueName: \"kubernetes.io/projected/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-kube-api-access-n7scx\") pod \"multus-rwxvz\" (UID: 
\"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270408 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-cni-dir\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270419 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-var-lib-cni-bin\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270448 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-var-lib-kubelet\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.270500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270482 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-daemon-config\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270522 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpcfn\" (UniqueName: \"kubernetes.io/projected/c5594cf9-4b09-4077-9a0a-c6e1e4145792-kube-api-access-zpcfn\") pod \"node-resolver-tbnxq\" (UID: \"c5594cf9-4b09-4077-9a0a-c6e1e4145792\") " 
pod="openshift-dns/node-resolver-tbnxq" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270548 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-os-release\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270581 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-run-multus-certs\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270610 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-socket-dir-parent\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270636 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-etc-kubernetes\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270644 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-os-release\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 
ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270635 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-run-multus-certs\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270679 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-socket-dir-parent\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270685 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-etc-kubernetes\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270664 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5594cf9-4b09-4077-9a0a-c6e1e4145792-tmp-dir\") pod \"node-resolver-tbnxq\" (UID: \"c5594cf9-4b09-4077-9a0a-c6e1e4145792\") " pod="openshift-dns/node-resolver-tbnxq" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270733 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-run-netns\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270766 2544 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-var-lib-cni-multus\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270792 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-hostroot\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270792 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-run-netns\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270832 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-hostroot\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270833 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-cni-binary-copy\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270874 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-host-var-lib-cni-multus\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271163 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.270986 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5594cf9-4b09-4077-9a0a-c6e1e4145792-tmp-dir\") pod \"node-resolver-tbnxq\" (UID: \"c5594cf9-4b09-4077-9a0a-c6e1e4145792\") " pod="openshift-dns/node-resolver-tbnxq" Apr 21 15:10:37.271667 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.271189 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-cni-binary-copy\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.271667 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.271609 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-multus-daemon-config\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.279346 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.279311 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7scx\" (UniqueName: \"kubernetes.io/projected/dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b-kube-api-access-n7scx\") pod \"multus-rwxvz\" (UID: \"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b\") " pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.279346 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.279314 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpcfn\" (UniqueName: 
\"kubernetes.io/projected/c5594cf9-4b09-4077-9a0a-c6e1e4145792-kube-api-access-zpcfn\") pod \"node-resolver-tbnxq\" (UID: \"c5594cf9-4b09-4077-9a0a-c6e1e4145792\") " pod="openshift-dns/node-resolver-tbnxq" Apr 21 15:10:37.374509 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.374478 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hmf4l" Apr 21 15:10:37.380638 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:37.380611 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod013ca451_549d_478c_941e_0e9994b24c34.slice/crio-f9b94698ae8f31ef5d99dd86780ec12f7392f7ee35abb0f3875badd2fcc1edfc WatchSource:0}: Error finding container f9b94698ae8f31ef5d99dd86780ec12f7392f7ee35abb0f3875badd2fcc1edfc: Status 404 returned error can't find the container with id f9b94698ae8f31ef5d99dd86780ec12f7392f7ee35abb0f3875badd2fcc1edfc Apr 21 15:10:37.387632 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.387610 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9lsjs" Apr 21 15:10:37.393281 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:37.393252 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc23d088b_b54c_4874_aac7_248c3a09117a.slice/crio-9d255f2737032559dfa76e634de86b142fae5a5eda62b7a7261453ee14ea7352 WatchSource:0}: Error finding container 9d255f2737032559dfa76e634de86b142fae5a5eda62b7a7261453ee14ea7352: Status 404 returned error can't find the container with id 9d255f2737032559dfa76e634de86b142fae5a5eda62b7a7261453ee14ea7352 Apr 21 15:10:37.406546 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.406519 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r8gk4" Apr 21 15:10:37.410290 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.410219 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:10:37.412732 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:37.412709 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49309d1e_9f62_4e92_8114_acfac3171dc5.slice/crio-aff41211913fe50cc0acf71569277723ad2a07ab80122d67cf4f1597d24227e4 WatchSource:0}: Error finding container aff41211913fe50cc0acf71569277723ad2a07ab80122d67cf4f1597d24227e4: Status 404 returned error can't find the container with id aff41211913fe50cc0acf71569277723ad2a07ab80122d67cf4f1597d24227e4 Apr 21 15:10:37.416468 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:37.416441 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd048a9ec_b8d0_42a8_9384_5fe347a8873b.slice/crio-9fc20da0dc3fdb6058ea14da27fdfbd2d6842fd55ff3a01d3c7ed30eadefa01a WatchSource:0}: Error finding container 9fc20da0dc3fdb6058ea14da27fdfbd2d6842fd55ff3a01d3c7ed30eadefa01a: Status 404 returned error can't find the container with id 9fc20da0dc3fdb6058ea14da27fdfbd2d6842fd55ff3a01d3c7ed30eadefa01a Apr 21 15:10:37.422345 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.422328 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-5s2h5" Apr 21 15:10:37.428192 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:37.428169 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67d06525_c590_4bd2_8237_2f7fa8b1d779.slice/crio-fc4336c6a9190611da68adc38c555efa9f0e3b640adcaf684d6be5cb9f90cae4 WatchSource:0}: Error finding container fc4336c6a9190611da68adc38c555efa9f0e3b640adcaf684d6be5cb9f90cae4: Status 404 returned error can't find the container with id fc4336c6a9190611da68adc38c555efa9f0e3b640adcaf684d6be5cb9f90cae4 Apr 21 15:10:37.431327 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.431300 2544 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:37.435690 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.435666 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" Apr 21 15:10:37.442497 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:37.442470 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb84b4fe1_338a_43ee_a0aa_7e3cd16d56d6.slice/crio-9de20d5bcfa6f49b39a357e57c59cf65610d5afa4ae33ea64e4d8b1e9a2d2ab0 WatchSource:0}: Error finding container 9de20d5bcfa6f49b39a357e57c59cf65610d5afa4ae33ea64e4d8b1e9a2d2ab0: Status 404 returned error can't find the container with id 9de20d5bcfa6f49b39a357e57c59cf65610d5afa4ae33ea64e4d8b1e9a2d2ab0 Apr 21 15:10:37.452797 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.452776 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" Apr 21 15:10:37.459830 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:37.459797 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a30a0d0_d747_4359_a316_b1d4215f71f3.slice/crio-867504a941baa7d8eb926d2bec412108fe3264732ce2b982b4e2c879bcb57674 WatchSource:0}: Error finding container 867504a941baa7d8eb926d2bec412108fe3264732ce2b982b4e2c879bcb57674: Status 404 returned error can't find the container with id 867504a941baa7d8eb926d2bec412108fe3264732ce2b982b4e2c879bcb57674 Apr 21 15:10:37.466800 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.466781 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tbnxq" Apr 21 15:10:37.472427 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:37.472403 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5594cf9_4b09_4077_9a0a_c6e1e4145792.slice/crio-60bc7c019d45661379552da65b308a5149ee306eca419ac1f4a99e33cae61c0f WatchSource:0}: Error finding container 60bc7c019d45661379552da65b308a5149ee306eca419ac1f4a99e33cae61c0f: Status 404 returned error can't find the container with id 60bc7c019d45661379552da65b308a5149ee306eca419ac1f4a99e33cae61c0f Apr 21 15:10:37.483706 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.483686 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-rwxvz" Apr 21 15:10:37.489718 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:10:37.489687 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf1eb34_c452_44b9_b6b3_dd0fe6ea5c9b.slice/crio-9187b1fde590d2cb07b73717baa4793aea7206d845197760e8b3c6ab4703b4c3 WatchSource:0}: Error finding container 9187b1fde590d2cb07b73717baa4793aea7206d845197760e8b3c6ab4703b4c3: Status 404 returned error can't find the container with id 9187b1fde590d2cb07b73717baa4793aea7206d845197760e8b3c6ab4703b4c3 Apr 21 15:10:37.674104 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.673954 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:10:37.674104 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.674083 2544 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:37.674345 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.674154 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs podName:3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:38.674135092 +0000 UTC m=+3.080543339 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs") pod "network-metrics-daemon-84hkv" (UID: "3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:37.775357 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.775319 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gx6f\" (UniqueName: \"kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f\") pod \"network-check-target-sm5q9\" (UID: \"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88\") " pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:10:37.775588 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.775530 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:10:37.775588 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.775550 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:10:37.775588 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.775562 2544 projected.go:194] Error preparing data for projected volume kube-api-access-8gx6f for pod openshift-network-diagnostics/network-check-target-sm5q9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:37.775762 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:37.775619 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f podName:0bc3eca4-5765-4b43-a4f4-51b55c9f8d88 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:10:38.775600652 +0000 UTC m=+3.182008893 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8gx6f" (UniqueName: "kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f") pod "network-check-target-sm5q9" (UID: "0bc3eca4-5765-4b43-a4f4-51b55c9f8d88") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:37.954578 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:37.954307 2544 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:38.100644 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.100546 2544 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:05:37 +0000 UTC" deadline="2027-10-01 20:07:55.660377349 +0000 UTC" Apr 21 15:10:38.100644 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.100586 2544 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12676h57m17.559795146s" Apr 21 15:10:38.143828 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.143587 2544 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:38.205108 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.204492 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:10:38.205108 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:38.204612 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88" Apr 21 15:10:38.216154 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.216093 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" event={"ID":"8a30a0d0-d747-4359-a316-b1d4215f71f3","Type":"ContainerStarted","Data":"867504a941baa7d8eb926d2bec412108fe3264732ce2b982b4e2c879bcb57674"} Apr 21 15:10:38.228955 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.228917 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" event={"ID":"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6","Type":"ContainerStarted","Data":"9de20d5bcfa6f49b39a357e57c59cf65610d5afa4ae33ea64e4d8b1e9a2d2ab0"} Apr 21 15:10:38.233466 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.233291 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal" event={"ID":"6fce942d3fd65daddcc4345d05bd271a","Type":"ContainerStarted","Data":"80770f2a3ecb5ca1b25ee150e9397b822da739fd826ccc868b6ed530a059a7f8"} Apr 21 15:10:38.240797 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.240732 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-168.ec2.internal" event={"ID":"d8e31eb06d4e26e4e1d995160527f9b1","Type":"ContainerStarted","Data":"00297935a13685be4909fc678d7c9bcb6529d2eb04499492342a728f53d360d1"} Apr 21 15:10:38.263399 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.263343 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rwxvz" event={"ID":"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b","Type":"ContainerStarted","Data":"9187b1fde590d2cb07b73717baa4793aea7206d845197760e8b3c6ab4703b4c3"} Apr 21 15:10:38.266500 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.266322 2544 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tbnxq" event={"ID":"c5594cf9-4b09-4077-9a0a-c6e1e4145792","Type":"ContainerStarted","Data":"60bc7c019d45661379552da65b308a5149ee306eca419ac1f4a99e33cae61c0f"}
Apr 21 15:10:38.279188 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.279151 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5s2h5" event={"ID":"67d06525-c590-4bd2-8237-2f7fa8b1d779","Type":"ContainerStarted","Data":"fc4336c6a9190611da68adc38c555efa9f0e3b640adcaf684d6be5cb9f90cae4"}
Apr 21 15:10:38.289261 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.288590 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" event={"ID":"d048a9ec-b8d0-42a8-9384-5fe347a8873b","Type":"ContainerStarted","Data":"9fc20da0dc3fdb6058ea14da27fdfbd2d6842fd55ff3a01d3c7ed30eadefa01a"}
Apr 21 15:10:38.298981 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.298898 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8gk4" event={"ID":"49309d1e-9f62-4e92-8114-acfac3171dc5","Type":"ContainerStarted","Data":"aff41211913fe50cc0acf71569277723ad2a07ab80122d67cf4f1597d24227e4"}
Apr 21 15:10:38.321472 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.321358 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9lsjs" event={"ID":"c23d088b-b54c-4874-aac7-248c3a09117a","Type":"ContainerStarted","Data":"9d255f2737032559dfa76e634de86b142fae5a5eda62b7a7261453ee14ea7352"}
Apr 21 15:10:38.333976 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.333930 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hmf4l" event={"ID":"013ca451-549d-478c-941e-0e9994b24c34","Type":"ContainerStarted","Data":"f9b94698ae8f31ef5d99dd86780ec12f7392f7ee35abb0f3875badd2fcc1edfc"}
Apr 21 15:10:38.685745 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.685704 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:38.685946 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:38.685913 2544 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:38.686002 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:38.685981 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs podName:3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:40.685961204 +0000 UTC m=+5.092369465 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs") pod "network-metrics-daemon-84hkv" (UID: "3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:38.786289 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:38.786245 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gx6f\" (UniqueName: \"kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f\") pod \"network-check-target-sm5q9\" (UID: \"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88\") " pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:38.786486 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:38.786421 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:10:38.786486 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:38.786453 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:10:38.786486 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:38.786467 2544 projected.go:194] Error preparing data for projected volume kube-api-access-8gx6f for pod openshift-network-diagnostics/network-check-target-sm5q9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:38.786662 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:38.786530 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f podName:0bc3eca4-5765-4b43-a4f4-51b55c9f8d88 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:40.786510527 +0000 UTC m=+5.192918790 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8gx6f" (UniqueName: "kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f") pod "network-check-target-sm5q9" (UID: "0bc3eca4-5765-4b43-a4f4-51b55c9f8d88") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:39.101145 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:39.101098 2544 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:05:37 +0000 UTC" deadline="2027-09-26 22:36:58.515811729 +0000 UTC"
Apr 21 15:10:39.101145 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:39.101141 2544 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12559h26m19.414675046s"
Apr 21 15:10:39.201576 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:39.201541 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:39.201776 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:39.201674 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428"
Apr 21 15:10:40.205143 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:40.204642 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:40.205143 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:40.204782 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88"
Apr 21 15:10:40.703282 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:40.703199 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:40.703487 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:40.703362 2544 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:40.703487 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:40.703438 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs podName:3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:44.703417532 +0000 UTC m=+9.109825774 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs") pod "network-metrics-daemon-84hkv" (UID: "3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:40.804449 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:40.804409 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gx6f\" (UniqueName: \"kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f\") pod \"network-check-target-sm5q9\" (UID: \"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88\") " pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:40.804649 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:40.804624 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:10:40.804742 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:40.804656 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:10:40.804742 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:40.804670 2544 projected.go:194] Error preparing data for projected volume kube-api-access-8gx6f for pod openshift-network-diagnostics/network-check-target-sm5q9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:40.804742 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:40.804733 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f podName:0bc3eca4-5765-4b43-a4f4-51b55c9f8d88 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:44.804714504 +0000 UTC m=+9.211122745 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8gx6f" (UniqueName: "kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f") pod "network-check-target-sm5q9" (UID: "0bc3eca4-5765-4b43-a4f4-51b55c9f8d88") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:41.202272 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:41.202151 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:41.202482 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:41.202306 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428"
Apr 21 15:10:42.202491 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:42.202424 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:42.202993 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:42.202569 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88"
Apr 21 15:10:43.202364 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:43.202308 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:43.202568 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:43.202478 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428"
Apr 21 15:10:44.202006 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:44.201971 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:44.202190 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:44.202102 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88"
Apr 21 15:10:44.740097 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:44.740047 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:44.740697 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:44.740214 2544 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:44.740697 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:44.740294 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs podName:3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:52.740272038 +0000 UTC m=+17.146680281 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs") pod "network-metrics-daemon-84hkv" (UID: "3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:44.841286 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:44.841246 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gx6f\" (UniqueName: \"kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f\") pod \"network-check-target-sm5q9\" (UID: \"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88\") " pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:44.841468 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:44.841423 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:10:44.841468 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:44.841447 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:10:44.841468 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:44.841457 2544 projected.go:194] Error preparing data for projected volume kube-api-access-8gx6f for pod openshift-network-diagnostics/network-check-target-sm5q9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:44.841595 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:44.841519 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f podName:0bc3eca4-5765-4b43-a4f4-51b55c9f8d88 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:52.841501853 +0000 UTC m=+17.247910094 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8gx6f" (UniqueName: "kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f") pod "network-check-target-sm5q9" (UID: "0bc3eca4-5765-4b43-a4f4-51b55c9f8d88") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:45.201872 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:45.201834 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:45.202071 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:45.201980 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428"
Apr 21 15:10:46.202281 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:46.202238 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:46.202732 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:46.202348 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88"
Apr 21 15:10:47.201786 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:47.201741 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:47.201953 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:47.201877 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428"
Apr 21 15:10:48.202405 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:48.202367 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:48.202849 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:48.202508 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88"
Apr 21 15:10:49.202266 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:49.202156 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:49.202488 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:49.202296 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428"
Apr 21 15:10:50.201843 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:50.201806 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:50.202039 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:50.201941 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88"
Apr 21 15:10:51.202292 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:51.202258 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:51.202735 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:51.202420 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428"
Apr 21 15:10:52.202633 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:52.202583 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:52.203071 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:52.202726 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88"
Apr 21 15:10:52.798367 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:52.798323 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:52.798594 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:52.798519 2544 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:52.798663 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:52.798604 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs podName:3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:08.798583001 +0000 UTC m=+33.204991255 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs") pod "network-metrics-daemon-84hkv" (UID: "3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:10:52.899236 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:52.899193 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gx6f\" (UniqueName: \"kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f\") pod \"network-check-target-sm5q9\" (UID: \"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88\") " pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:52.899429 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:52.899397 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:10:52.899429 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:52.899428 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:10:52.899539 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:52.899444 2544 projected.go:194] Error preparing data for projected volume kube-api-access-8gx6f for pod openshift-network-diagnostics/network-check-target-sm5q9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:52.899539 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:52.899516 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f podName:0bc3eca4-5765-4b43-a4f4-51b55c9f8d88 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:08.899496904 +0000 UTC m=+33.305905145 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8gx6f" (UniqueName: "kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f") pod "network-check-target-sm5q9" (UID: "0bc3eca4-5765-4b43-a4f4-51b55c9f8d88") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:10:53.201586 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:53.201552 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:53.201788 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:53.201678 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428"
Apr 21 15:10:54.205827 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:54.205796 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:54.206267 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:54.205925 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88"
Apr 21 15:10:55.201460 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:55.201433 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:10:55.201581 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:55.201538 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428"
Apr 21 15:10:56.205308 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.205075 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:10:56.206329 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:56.205405 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88"
Apr 21 15:10:56.373159 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.373077 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" event={"ID":"8a30a0d0-d747-4359-a316-b1d4215f71f3","Type":"ContainerStarted","Data":"318cb2df6b930857e2c849ce500485fcd83877b327b7a44533b595db658f4123"}
Apr 21 15:10:56.374519 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.374491 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" event={"ID":"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6","Type":"ContainerStarted","Data":"759170f33f65bc19ee7d54a1e1919ac508a921eda3387b1226f4923e5d3771b1"}
Apr 21 15:10:56.377186 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.377150 2544 generic.go:358] "Generic (PLEG): container finished" podID="6fce942d3fd65daddcc4345d05bd271a" containerID="f786d622f5cd24bd5aa266aeea271ede59687e0e8c148e9dbd2a0697fc1b2ec1" exitCode=0
Apr 21 15:10:56.377281 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.377210 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal" event={"ID":"6fce942d3fd65daddcc4345d05bd271a","Type":"ContainerDied","Data":"f786d622f5cd24bd5aa266aeea271ede59687e0e8c148e9dbd2a0697fc1b2ec1"}
Apr 21 15:10:56.379174 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.379142 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-168.ec2.internal" event={"ID":"d8e31eb06d4e26e4e1d995160527f9b1","Type":"ContainerStarted","Data":"1a93839299cab0f1a36ab5be3f7b32f1b56aa68d410f2064765f35d798d1a436"}
Apr 21 15:10:56.380454 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.380421 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rwxvz" event={"ID":"dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b","Type":"ContainerStarted","Data":"b0190e1ab284b944cfea401ada1efa8369bf8804fc3d57e677bf42eba088d2b5"}
Apr 21 15:10:56.381655 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.381635 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tbnxq" event={"ID":"c5594cf9-4b09-4077-9a0a-c6e1e4145792","Type":"ContainerStarted","Data":"414b433379f6bbe71e45a6545c0311712ec43c6469cc7bab5ad8b684418e40ab"}
Apr 21 15:10:56.382892 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.382869 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5s2h5" event={"ID":"67d06525-c590-4bd2-8237-2f7fa8b1d779","Type":"ContainerStarted","Data":"1bc5d0e84ad9792afcc5517d9bfd6474348a1dda884deb16a2bf7a9ffaa40f35"}
Apr 21 15:10:56.385048 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.384976 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/0.log"
Apr 21 15:10:56.385323 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.385293 2544 generic.go:358] "Generic (PLEG): container finished" podID="d048a9ec-b8d0-42a8-9384-5fe347a8873b" containerID="a4b1adbeca4c070e2abff4605418c325ecf31804dc5333cc3fdcb6e0d126ad27" exitCode=1
Apr 21 15:10:56.385391 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.385354 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" event={"ID":"d048a9ec-b8d0-42a8-9384-5fe347a8873b","Type":"ContainerStarted","Data":"fe41e1ebffb09ecb721e8f9d9c7f28bf0a3feec6c4624390435c617457bc7fa3"}
Apr 21 15:10:56.385430 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.385385 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" event={"ID":"d048a9ec-b8d0-42a8-9384-5fe347a8873b","Type":"ContainerStarted","Data":"3277e1184e2e13eb59aa5aca52b03d186f360f0e823ba1e18db490ba5f335fc2"}
Apr 21 15:10:56.385430 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.385410 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" event={"ID":"d048a9ec-b8d0-42a8-9384-5fe347a8873b","Type":"ContainerStarted","Data":"066a5e01b42d5b373004a1f5f55ad35def65c88630f0310a8d4f46b7963aaa11"}
Apr 21 15:10:56.385430 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.385424 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" event={"ID":"d048a9ec-b8d0-42a8-9384-5fe347a8873b","Type":"ContainerStarted","Data":"a9cdc06ad52e4fcc85704b3929860e51af62b47671468c4ec83f5255318a26b1"}
Apr 21 15:10:56.385559 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.385436 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" event={"ID":"d048a9ec-b8d0-42a8-9384-5fe347a8873b","Type":"ContainerDied","Data":"a4b1adbeca4c070e2abff4605418c325ecf31804dc5333cc3fdcb6e0d126ad27"}
Apr 21 15:10:56.385559 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.385450 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" event={"ID":"d048a9ec-b8d0-42a8-9384-5fe347a8873b","Type":"ContainerStarted","Data":"985bc60d63f99ae6f79bca613b26c8b6ebb096e627f064604f3cb4b475d9afd4"}
Apr 21 15:10:56.386682 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.386664 2544 generic.go:358] "Generic (PLEG): container finished" podID="49309d1e-9f62-4e92-8114-acfac3171dc5" containerID="069c38bb17f5e4bff8992c7b312c49dc1eca39a0562431537ef2e13e184b8bc9" exitCode=0
Apr 21 15:10:56.386813 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.386717 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8gk4" event={"ID":"49309d1e-9f62-4e92-8114-acfac3171dc5","Type":"ContainerDied","Data":"069c38bb17f5e4bff8992c7b312c49dc1eca39a0562431537ef2e13e184b8bc9"}
Apr 21 15:10:56.388058 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.388041 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9lsjs" event={"ID":"c23d088b-b54c-4874-aac7-248c3a09117a","Type":"ContainerStarted","Data":"4a14c3a1a2db7a4d3c09ecc45087c9654fc467585e1780676a81004e892f0819"}
Apr 21 15:10:56.401334 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.401280 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nv5tm" podStartSLOduration=2.500212435 podStartE2EDuration="20.401265428s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.461438957 +0000 UTC m=+1.867847199" lastFinishedPulling="2026-04-21 15:10:55.362491948 +0000 UTC m=+19.768900192" observedRunningTime="2026-04-21 15:10:56.400745712 +0000 UTC m=+20.807153972" watchObservedRunningTime="2026-04-21 15:10:56.401265428 +0000 UTC m=+20.807673688"
Apr 21 15:10:56.429556 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.429505 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rwxvz" podStartSLOduration=2.444343299 podStartE2EDuration="20.429491586s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.491160091 +0000 UTC m=+1.897568329" lastFinishedPulling="2026-04-21 15:10:55.476308373 +0000 UTC m=+19.882716616" observedRunningTime="2026-04-21 15:10:56.429075737 +0000 UTC m=+20.835483994" watchObservedRunningTime="2026-04-21 15:10:56.429491586 +0000 UTC m=+20.835899846"
Apr 21 15:10:56.479068 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.479023 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-168.ec2.internal" podStartSLOduration=19.479006667 podStartE2EDuration="19.479006667s" podCreationTimestamp="2026-04-21 15:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:10:56.44834208 +0000 UTC m=+20.854750351" watchObservedRunningTime="2026-04-21 15:10:56.479006667 +0000 UTC m=+20.885414927"
Apr 21 15:10:56.479209 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.479089 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9lsjs" podStartSLOduration=2.51334453 podStartE2EDuration="20.479084673s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.394894512 +0000 UTC m=+1.801302750" lastFinishedPulling="2026-04-21 15:10:55.360634642 +0000 UTC m=+19.767042893" observedRunningTime="2026-04-21 15:10:56.478615664 +0000 UTC m=+20.885023924" watchObservedRunningTime="2026-04-21 15:10:56.479084673 +0000 UTC m=+20.885492933"
Apr 21 15:10:56.584467 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.584408 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5s2h5" podStartSLOduration=2.652831621 podStartE2EDuration="20.584387488s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.429690176 +0000 UTC m=+1.836098414" lastFinishedPulling="2026-04-21 15:10:55.361246028 +0000 UTC m=+19.767654281" observedRunningTime="2026-04-21 15:10:56.583820795 +0000 UTC m=+20.990229055" watchObservedRunningTime="2026-04-21 15:10:56.584387488 +0000 UTC m=+20.990795752"
Apr 21 15:10:56.643303 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:56.643250 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tbnxq" podStartSLOduration=2.756978018 podStartE2EDuration="20.643232237s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC"
firstStartedPulling="2026-04-21 15:10:37.474408023 +0000 UTC m=+1.880816262" lastFinishedPulling="2026-04-21 15:10:55.360662241 +0000 UTC m=+19.767070481" observedRunningTime="2026-04-21 15:10:56.612820709 +0000 UTC m=+21.019228970" watchObservedRunningTime="2026-04-21 15:10:56.643232237 +0000 UTC m=+21.049640519" Apr 21 15:10:57.167278 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:57.167064 2544 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 15:10:57.202228 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:57.202196 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:10:57.202412 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:57.202327 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428" Apr 21 15:10:57.394816 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:57.394776 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hmf4l" event={"ID":"013ca451-549d-478c-941e-0e9994b24c34","Type":"ContainerStarted","Data":"c64a00c5347f7d5e1ba391f93fa1109a426510c663a43fd0b7f2dc00ebe33ec1"} Apr 21 15:10:57.396855 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:57.396822 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" event={"ID":"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6","Type":"ContainerStarted","Data":"c8bb5d1cf1377eebdd1bd46f384eab6a906df3a1fd64d6c3d9e2394509def11b"} Apr 21 15:10:57.398598 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:57.398569 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal" event={"ID":"6fce942d3fd65daddcc4345d05bd271a","Type":"ContainerStarted","Data":"97adc57b8829c71257113296e66d1d35d66dafb0ccd24beff8ac121b4d2b402d"} Apr 21 15:10:57.413544 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:57.413440 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hmf4l" podStartSLOduration=3.434337599 podStartE2EDuration="21.413427166s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.382214616 +0000 UTC m=+1.788622853" lastFinishedPulling="2026-04-21 15:10:55.361304168 +0000 UTC m=+19.767712420" observedRunningTime="2026-04-21 15:10:57.413336179 +0000 UTC m=+21.819744440" watchObservedRunningTime="2026-04-21 15:10:57.413427166 +0000 UTC m=+21.819835425" Apr 21 15:10:57.429711 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:57.429629 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-168.ec2.internal" podStartSLOduration=20.429608569 podStartE2EDuration="20.429608569s" podCreationTimestamp="2026-04-21 15:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:10:57.429449207 +0000 UTC m=+21.835857507" watchObservedRunningTime="2026-04-21 15:10:57.429608569 +0000 UTC m=+21.836016828" Apr 21 15:10:58.138418 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:58.138285 2544 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T15:10:57.16727411Z","UUID":"53f48ab9-74da-40e8-91cb-392ea04b17a2","Handler":null,"Name":"","Endpoint":""} Apr 21 15:10:58.143320 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:58.143287 2544 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 15:10:58.143320 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:58.143322 2544 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 15:10:58.201698 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:58.201625 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:10:58.201869 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:58.201784 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88" Apr 21 15:10:58.402726 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:58.402684 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" event={"ID":"b84b4fe1-338a-43ee-a0aa-7e3cd16d56d6","Type":"ContainerStarted","Data":"22ba762cdba9bd86ce51b7433029407ad24a97ef12ab8d2046fc78463d81222a"} Apr 21 15:10:58.406253 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:58.406226 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/0.log" Apr 21 15:10:58.406728 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:58.406697 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" event={"ID":"d048a9ec-b8d0-42a8-9384-5fe347a8873b","Type":"ContainerStarted","Data":"72f0b200f18d98c9a8301dd38c96e18fb6c548cf90be7c240e8890e9a81ec596"} Apr 21 15:10:58.422707 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:58.422625 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dhk4s" podStartSLOduration=1.775850438 podStartE2EDuration="22.42260402s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.444333987 +0000 UTC m=+1.850742225" lastFinishedPulling="2026-04-21 15:10:58.091087568 +0000 UTC m=+22.497495807" observedRunningTime="2026-04-21 15:10:58.42202449 +0000 UTC m=+22.828432751" watchObservedRunningTime="2026-04-21 15:10:58.42260402 +0000 UTC m=+22.829012281" Apr 21 15:10:59.202538 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:10:59.202495 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:10:59.202732 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:10:59.202623 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428" Apr 21 15:11:00.204733 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:00.204704 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:11:00.205229 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:00.204835 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88" Apr 21 15:11:00.416536 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:00.416139 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/0.log" Apr 21 15:11:00.416941 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:00.416730 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" event={"ID":"d048a9ec-b8d0-42a8-9384-5fe347a8873b","Type":"ContainerStarted","Data":"8b435d3f74635de9b77024336bc23661c0bebb88bac94714137562a9ce2832d5"} Apr 21 15:11:00.417306 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:00.417196 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:11:00.417306 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:00.417223 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:11:00.417576 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:00.417337 2544 scope.go:117] "RemoveContainer" containerID="a4b1adbeca4c070e2abff4605418c325ecf31804dc5333cc3fdcb6e0d126ad27" Apr 21 15:11:00.434506 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:00.434434 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:11:00.567242 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:00.567202 2544 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5s2h5" Apr 21 15:11:00.567831 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:00.567806 2544 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5s2h5" Apr 21 15:11:01.201921 ip-10-0-137-168 kubenswrapper[2544]: I0421 
15:11:01.201890 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:11:01.202101 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:01.201998 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428" Apr 21 15:11:01.423489 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:01.423458 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/0.log" Apr 21 15:11:01.423995 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:01.423826 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" event={"ID":"d048a9ec-b8d0-42a8-9384-5fe347a8873b","Type":"ContainerStarted","Data":"cce05536332f373859a9fb9746ea40518445bb2a6df35c552d6d1c55eabb844f"} Apr 21 15:11:01.424192 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:01.424169 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:11:01.425685 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:01.425657 2544 generic.go:358] "Generic (PLEG): container finished" podID="49309d1e-9f62-4e92-8114-acfac3171dc5" containerID="c43e370e52385afb8302da13499ff031c67b1de33f6def2252ca7429cd62bc38" exitCode=0 Apr 21 15:11:01.425808 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:01.425765 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8gk4" 
event={"ID":"49309d1e-9f62-4e92-8114-acfac3171dc5","Type":"ContainerDied","Data":"c43e370e52385afb8302da13499ff031c67b1de33f6def2252ca7429cd62bc38"} Apr 21 15:11:01.425996 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:01.425979 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5s2h5" Apr 21 15:11:01.426462 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:01.426443 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5s2h5" Apr 21 15:11:01.441666 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:01.441638 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" Apr 21 15:11:01.461377 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:01.461284 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q" podStartSLOduration=7.477516914 podStartE2EDuration="25.461270276s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC" firstStartedPulling="2026-04-21 15:10:37.417960162 +0000 UTC m=+1.824368399" lastFinishedPulling="2026-04-21 15:10:55.401713518 +0000 UTC m=+19.808121761" observedRunningTime="2026-04-21 15:11:01.459568611 +0000 UTC m=+25.865976870" watchObservedRunningTime="2026-04-21 15:11:01.461270276 +0000 UTC m=+25.867678536" Apr 21 15:11:02.202674 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:02.202519 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:11:02.202818 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:02.202761 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88" Apr 21 15:11:02.402900 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:02.402868 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sm5q9"] Apr 21 15:11:02.404443 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:02.404419 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-84hkv"] Apr 21 15:11:02.404547 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:02.404535 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:11:02.404642 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:02.404622 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428" Apr 21 15:11:02.429587 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:02.429545 2544 generic.go:358] "Generic (PLEG): container finished" podID="49309d1e-9f62-4e92-8114-acfac3171dc5" containerID="32db8e3e1fc069297e069396ad2cd2af3c8d4cfd535e16b3e42ee419614af801" exitCode=0 Apr 21 15:11:02.430164 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:02.429638 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:11:02.430164 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:02.429652 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8gk4" event={"ID":"49309d1e-9f62-4e92-8114-acfac3171dc5","Type":"ContainerDied","Data":"32db8e3e1fc069297e069396ad2cd2af3c8d4cfd535e16b3e42ee419614af801"} Apr 21 15:11:02.430164 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:02.429810 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88" Apr 21 15:11:03.434260 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:03.434151 2544 generic.go:358] "Generic (PLEG): container finished" podID="49309d1e-9f62-4e92-8114-acfac3171dc5" containerID="04b38b463142af0760f724eafa9e34a4b651017d831ced2137df51441b10e080" exitCode=0 Apr 21 15:11:03.434260 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:03.434209 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8gk4" event={"ID":"49309d1e-9f62-4e92-8114-acfac3171dc5","Type":"ContainerDied","Data":"04b38b463142af0760f724eafa9e34a4b651017d831ced2137df51441b10e080"} Apr 21 15:11:04.202386 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:04.202305 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:11:04.202559 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:04.202477 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88" Apr 21 15:11:04.203147 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:04.202335 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:11:04.203283 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:04.203261 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428" Apr 21 15:11:06.203091 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:06.203051 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:11:06.203841 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:06.203168 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88" Apr 21 15:11:06.203841 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:06.203242 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:11:06.203841 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:06.203407 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428" Apr 21 15:11:08.202002 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.201794 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:11:08.202438 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.201826 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:11:08.202438 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.202091 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sm5q9" podUID="0bc3eca4-5765-4b43-a4f4-51b55c9f8d88" Apr 21 15:11:08.202438 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.202190 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428" Apr 21 15:11:08.367449 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.367412 2544 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-168.ec2.internal" event="NodeReady" Apr 21 15:11:08.367598 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.367553 2544 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 15:11:08.419492 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.419456 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dwdcj"] Apr 21 15:11:08.455141 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.455055 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-q5k4h"] Apr 21 15:11:08.455290 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.455238 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.458203 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.458167 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 15:11:08.458203 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.458183 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-76666\"" Apr 21 15:11:08.458436 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.458175 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 15:11:08.469557 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.469526 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dwdcj"] Apr 21 15:11:08.469557 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.469558 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q5k4h"] Apr 21 15:11:08.469714 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.469672 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:11:08.472777 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.472704 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 15:11:08.474453 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.473437 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l7xgv\"" Apr 21 15:11:08.474453 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.473586 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 15:11:08.476534 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.474993 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 15:11:08.602253 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.602221 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k"] Apr 21 15:11:08.617873 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.617841 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k"] Apr 21 15:11:08.618037 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.617967 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" Apr 21 15:11:08.618100 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.618046 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18eaf2c1-6533-4b15-a759-c0e039abbd8f-config-volume\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.618100 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.618074 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7tn9\" (UniqueName: \"kubernetes.io/projected/18eaf2c1-6533-4b15-a759-c0e039abbd8f-kube-api-access-h7tn9\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.618197 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.618122 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.618252 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.618183 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18eaf2c1-6533-4b15-a759-c0e039abbd8f-tmp-dir\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.618252 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.618226 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:11:08.618357 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.618294 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddnd\" (UniqueName: \"kubernetes.io/projected/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-kube-api-access-tddnd\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:11:08.624136 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.623625 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 15:11:08.624136 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.623645 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 15:11:08.624136 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.623849 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 15:11:08.624136 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.623961 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 15:11:08.624442 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.624144 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-27whq\"" Apr 21 15:11:08.719678 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.719576 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/57a556a8-7dd1-4d6a-bba8-4b1896f9915c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5fd7bb9459-79c7k\" (UID: \"57a556a8-7dd1-4d6a-bba8-4b1896f9915c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" Apr 21 15:11:08.719678 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.719664 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.719919 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.719691 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18eaf2c1-6533-4b15-a759-c0e039abbd8f-tmp-dir\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.719919 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.719718 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:11:08.719919 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.719744 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tddnd\" (UniqueName: \"kubernetes.io/projected/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-kube-api-access-tddnd\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:11:08.719919 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.719821 2544 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng2rw\" (UniqueName: \"kubernetes.io/projected/57a556a8-7dd1-4d6a-bba8-4b1896f9915c-kube-api-access-ng2rw\") pod \"managed-serviceaccount-addon-agent-5fd7bb9459-79c7k\" (UID: \"57a556a8-7dd1-4d6a-bba8-4b1896f9915c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" Apr 21 15:11:08.719919 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.719840 2544 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:08.719919 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.719857 2544 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:08.719919 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.719866 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18eaf2c1-6533-4b15-a759-c0e039abbd8f-config-volume\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.719919 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.719894 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7tn9\" (UniqueName: \"kubernetes.io/projected/18eaf2c1-6533-4b15-a759-c0e039abbd8f-kube-api-access-h7tn9\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.720241 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.719944 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls podName:18eaf2c1-6533-4b15-a759-c0e039abbd8f nodeName:}" failed. 
No retries permitted until 2026-04-21 15:11:09.21991734 +0000 UTC m=+33.626325579 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls") pod "dns-default-dwdcj" (UID: "18eaf2c1-6533-4b15-a759-c0e039abbd8f") : secret "dns-default-metrics-tls" not found Apr 21 15:11:08.720241 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.719972 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert podName:a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:09.219962403 +0000 UTC m=+33.626370645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert") pod "ingress-canary-q5k4h" (UID: "a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059") : secret "canary-serving-cert" not found Apr 21 15:11:08.720241 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.720078 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18eaf2c1-6533-4b15-a759-c0e039abbd8f-tmp-dir\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.720515 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.720489 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18eaf2c1-6533-4b15-a759-c0e039abbd8f-config-volume\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.733455 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.733427 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7tn9\" (UniqueName: 
\"kubernetes.io/projected/18eaf2c1-6533-4b15-a759-c0e039abbd8f-kube-api-access-h7tn9\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:08.733713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.733692 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddnd\" (UniqueName: \"kubernetes.io/projected/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-kube-api-access-tddnd\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:11:08.821256 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.820945 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/57a556a8-7dd1-4d6a-bba8-4b1896f9915c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5fd7bb9459-79c7k\" (UID: \"57a556a8-7dd1-4d6a-bba8-4b1896f9915c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" Apr 21 15:11:08.821256 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.821055 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:11:08.821256 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.821086 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng2rw\" (UniqueName: \"kubernetes.io/projected/57a556a8-7dd1-4d6a-bba8-4b1896f9915c-kube-api-access-ng2rw\") pod \"managed-serviceaccount-addon-agent-5fd7bb9459-79c7k\" (UID: \"57a556a8-7dd1-4d6a-bba8-4b1896f9915c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" Apr 
21 15:11:08.821525 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.821502 2544 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:11:08.821593 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.821583 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs podName:3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:40.821561771 +0000 UTC m=+65.227970010 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs") pod "network-metrics-daemon-84hkv" (UID: "3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:11:08.823947 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.823923 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/57a556a8-7dd1-4d6a-bba8-4b1896f9915c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5fd7bb9459-79c7k\" (UID: \"57a556a8-7dd1-4d6a-bba8-4b1896f9915c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" Apr 21 15:11:08.830141 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.830119 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng2rw\" (UniqueName: \"kubernetes.io/projected/57a556a8-7dd1-4d6a-bba8-4b1896f9915c-kube-api-access-ng2rw\") pod \"managed-serviceaccount-addon-agent-5fd7bb9459-79c7k\" (UID: \"57a556a8-7dd1-4d6a-bba8-4b1896f9915c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" Apr 21 15:11:08.921868 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.921830 2544 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8gx6f\" (UniqueName: \"kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f\") pod \"network-check-target-sm5q9\" (UID: \"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88\") " pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:11:08.922016 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.921991 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:11:08.922016 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.922013 2544 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:11:08.922100 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.922026 2544 projected.go:194] Error preparing data for projected volume kube-api-access-8gx6f for pod openshift-network-diagnostics/network-check-target-sm5q9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:11:08.922100 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:08.922087 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f podName:0bc3eca4-5765-4b43-a4f4-51b55c9f8d88 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:40.922070933 +0000 UTC m=+65.328479171 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8gx6f" (UniqueName: "kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f") pod "network-check-target-sm5q9" (UID: "0bc3eca4-5765-4b43-a4f4-51b55c9f8d88") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:11:08.946042 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:08.946006 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" Apr 21 15:11:09.134924 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:09.134897 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k"] Apr 21 15:11:09.195368 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:11:09.195329 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57a556a8_7dd1_4d6a_bba8_4b1896f9915c.slice/crio-8d278fde9cd3353f104a64b2f55884a32baf3e7923abc7d59ccfefb5ae834b5f WatchSource:0}: Error finding container 8d278fde9cd3353f104a64b2f55884a32baf3e7923abc7d59ccfefb5ae834b5f: Status 404 returned error can't find the container with id 8d278fde9cd3353f104a64b2f55884a32baf3e7923abc7d59ccfefb5ae834b5f Apr 21 15:11:09.224419 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:09.224390 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:09.224419 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:09.224423 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:11:09.224870 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:09.224531 2544 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:09.224870 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:09.224539 2544 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:09.224870 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:09.224576 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert podName:a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:10.224563053 +0000 UTC m=+34.630971291 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert") pod "ingress-canary-q5k4h" (UID: "a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059") : secret "canary-serving-cert" not found Apr 21 15:11:09.224870 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:09.224620 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls podName:18eaf2c1-6533-4b15-a759-c0e039abbd8f nodeName:}" failed. No retries permitted until 2026-04-21 15:11:10.224605538 +0000 UTC m=+34.631013779 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls") pod "dns-default-dwdcj" (UID: "18eaf2c1-6533-4b15-a759-c0e039abbd8f") : secret "dns-default-metrics-tls" not found Apr 21 15:11:09.449079 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:09.449038 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8gk4" event={"ID":"49309d1e-9f62-4e92-8114-acfac3171dc5","Type":"ContainerStarted","Data":"5dc7703ee2fb0f53c3f32d46628daa6ae9775626f4a2220b7fde872760d56e15"} Apr 21 15:11:09.450149 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:09.450126 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" event={"ID":"57a556a8-7dd1-4d6a-bba8-4b1896f9915c","Type":"ContainerStarted","Data":"8d278fde9cd3353f104a64b2f55884a32baf3e7923abc7d59ccfefb5ae834b5f"} Apr 21 15:11:10.202598 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:10.202402 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9" Apr 21 15:11:10.202864 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:10.202479 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:11:10.207422 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:10.207399 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 15:11:10.208421 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:10.208398 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 15:11:10.209701 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:10.209679 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zqtzq\"" Apr 21 15:11:10.209788 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:10.209705 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4fs5d\"" Apr 21 15:11:10.210610 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:10.210593 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 15:11:10.231200 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:10.231162 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:10.231200 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:10.231202 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:11:10.231638 ip-10-0-137-168 kubenswrapper[2544]: 
E0421 15:11:10.231310 2544 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:10.231638 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:10.231370 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls podName:18eaf2c1-6533-4b15-a759-c0e039abbd8f nodeName:}" failed. No retries permitted until 2026-04-21 15:11:12.231355018 +0000 UTC m=+36.637763259 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls") pod "dns-default-dwdcj" (UID: "18eaf2c1-6533-4b15-a759-c0e039abbd8f") : secret "dns-default-metrics-tls" not found Apr 21 15:11:10.231638 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:10.231310 2544 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:10.231638 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:10.231439 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert podName:a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:12.231427318 +0000 UTC m=+36.637835559 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert") pod "ingress-canary-q5k4h" (UID: "a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059") : secret "canary-serving-cert" not found Apr 21 15:11:10.454657 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:10.454563 2544 generic.go:358] "Generic (PLEG): container finished" podID="49309d1e-9f62-4e92-8114-acfac3171dc5" containerID="5dc7703ee2fb0f53c3f32d46628daa6ae9775626f4a2220b7fde872760d56e15" exitCode=0 Apr 21 15:11:10.454657 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:10.454622 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8gk4" event={"ID":"49309d1e-9f62-4e92-8114-acfac3171dc5","Type":"ContainerDied","Data":"5dc7703ee2fb0f53c3f32d46628daa6ae9775626f4a2220b7fde872760d56e15"} Apr 21 15:11:11.460779 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:11.460721 2544 generic.go:358] "Generic (PLEG): container finished" podID="49309d1e-9f62-4e92-8114-acfac3171dc5" containerID="195ca880770a4c127bdea10198e08229e30ca04adabf9869c6240c1b5cc0b437" exitCode=0 Apr 21 15:11:11.461148 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:11.460797 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8gk4" event={"ID":"49309d1e-9f62-4e92-8114-acfac3171dc5","Type":"ContainerDied","Data":"195ca880770a4c127bdea10198e08229e30ca04adabf9869c6240c1b5cc0b437"} Apr 21 15:11:12.250323 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:12.250234 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:12.250323 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:12.250287 2544 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:11:12.250585 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:12.250403 2544 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:12.250585 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:12.250429 2544 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:12.250585 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:12.250495 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls podName:18eaf2c1-6533-4b15-a759-c0e039abbd8f nodeName:}" failed. No retries permitted until 2026-04-21 15:11:16.250476267 +0000 UTC m=+40.656884505 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls") pod "dns-default-dwdcj" (UID: "18eaf2c1-6533-4b15-a759-c0e039abbd8f") : secret "dns-default-metrics-tls" not found Apr 21 15:11:12.250585 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:12.250516 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert podName:a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:16.25050706 +0000 UTC m=+40.656915298 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert") pod "ingress-canary-q5k4h" (UID: "a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059") : secret "canary-serving-cert" not found Apr 21 15:11:13.466328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:13.466293 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" event={"ID":"57a556a8-7dd1-4d6a-bba8-4b1896f9915c","Type":"ContainerStarted","Data":"c192e00927820c4b399efb83b447cad578e9b17a91483f4d899b4b31f10afb22"} Apr 21 15:11:13.469173 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:13.469144 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8gk4" event={"ID":"49309d1e-9f62-4e92-8114-acfac3171dc5","Type":"ContainerStarted","Data":"942820cfcfe17f7247465b28b60b4635a147a5a7ac50be726143ee215a29e7a6"} Apr 21 15:11:13.489919 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:13.489863 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" podStartSLOduration=1.9429980119999999 podStartE2EDuration="5.489841843s" podCreationTimestamp="2026-04-21 15:11:08 +0000 UTC" firstStartedPulling="2026-04-21 15:11:09.207680517 +0000 UTC m=+33.614088756" lastFinishedPulling="2026-04-21 15:11:12.754524347 +0000 UTC m=+37.160932587" observedRunningTime="2026-04-21 15:11:13.489801399 +0000 UTC m=+37.896209660" watchObservedRunningTime="2026-04-21 15:11:13.489841843 +0000 UTC m=+37.896250107" Apr 21 15:11:13.515931 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:13.515878 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-r8gk4" podStartSLOduration=5.699084534 podStartE2EDuration="37.515861417s" podCreationTimestamp="2026-04-21 15:10:36 +0000 
UTC" firstStartedPulling="2026-04-21 15:10:37.414545395 +0000 UTC m=+1.820953633" lastFinishedPulling="2026-04-21 15:11:09.231322275 +0000 UTC m=+33.637730516" observedRunningTime="2026-04-21 15:11:13.515400261 +0000 UTC m=+37.921808533" watchObservedRunningTime="2026-04-21 15:11:13.515861417 +0000 UTC m=+37.922269698" Apr 21 15:11:16.279882 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:16.279844 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:16.279882 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:16.279885 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:11:16.280335 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:16.280009 2544 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:16.280335 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:16.280073 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls podName:18eaf2c1-6533-4b15-a759-c0e039abbd8f nodeName:}" failed. No retries permitted until 2026-04-21 15:11:24.280057474 +0000 UTC m=+48.686465712 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls") pod "dns-default-dwdcj" (UID: "18eaf2c1-6533-4b15-a759-c0e039abbd8f") : secret "dns-default-metrics-tls" not found Apr 21 15:11:16.280335 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:16.280016 2544 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:16.280335 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:16.280144 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert podName:a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:24.280130456 +0000 UTC m=+48.686538693 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert") pod "ingress-canary-q5k4h" (UID: "a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059") : secret "canary-serving-cert" not found Apr 21 15:11:24.335259 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:24.335213 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:11:24.335259 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:24.335254 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:11:24.335673 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:24.335364 2544 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:11:24.335673 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:24.335368 2544 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:11:24.335673 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:24.335430 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert podName:a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:40.335413922 +0000 UTC m=+64.741822160 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert") pod "ingress-canary-q5k4h" (UID: "a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059") : secret "canary-serving-cert" not found
Apr 21 15:11:24.335673 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:24.335442 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls podName:18eaf2c1-6533-4b15-a759-c0e039abbd8f nodeName:}" failed. No retries permitted until 2026-04-21 15:11:40.335436602 +0000 UTC m=+64.741844840 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls") pod "dns-default-dwdcj" (UID: "18eaf2c1-6533-4b15-a759-c0e039abbd8f") : secret "dns-default-metrics-tls" not found
Apr 21 15:11:33.444581 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:33.444550 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hj57q"
Apr 21 15:11:40.348866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:40.348818 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj"
Apr 21 15:11:40.348866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:40.348867 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h"
Apr 21 15:11:40.349352 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:40.348995 2544 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:11:40.349352 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:40.349069 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls podName:18eaf2c1-6533-4b15-a759-c0e039abbd8f nodeName:}" failed. No retries permitted until 2026-04-21 15:12:12.349050104 +0000 UTC m=+96.755458342 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls") pod "dns-default-dwdcj" (UID: "18eaf2c1-6533-4b15-a759-c0e039abbd8f") : secret "dns-default-metrics-tls" not found
Apr 21 15:11:40.349352 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:40.348997 2544 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:11:40.349352 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:40.349162 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert podName:a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:12.349141439 +0000 UTC m=+96.755549698 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert") pod "ingress-canary-q5k4h" (UID: "a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059") : secret "canary-serving-cert" not found
Apr 21 15:11:40.852639 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:40.852600 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:11:40.855839 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:40.855813 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 15:11:40.863534 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:40.863511 2544 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 15:11:40.863657 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:11:40.863590 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs podName:3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:44.863568925 +0000 UTC m=+129.269977162 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs") pod "network-metrics-daemon-84hkv" (UID: "3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428") : secret "metrics-daemon-secret" not found
Apr 21 15:11:40.953388 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:40.953346 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gx6f\" (UniqueName: \"kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f\") pod \"network-check-target-sm5q9\" (UID: \"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88\") " pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:11:40.956452 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:40.956431 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 15:11:40.967061 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:40.967037 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 15:11:40.978347 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:40.978319 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gx6f\" (UniqueName: \"kubernetes.io/projected/0bc3eca4-5765-4b43-a4f4-51b55c9f8d88-kube-api-access-8gx6f\") pod \"network-check-target-sm5q9\" (UID: \"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88\") " pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:11:41.116567 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:41.116492 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4fs5d\""
Apr 21 15:11:41.123321 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:41.123292 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:11:41.266651 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:41.266610 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sm5q9"]
Apr 21 15:11:41.269709 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:11:41.269679 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc3eca4_5765_4b43_a4f4_51b55c9f8d88.slice/crio-a2a631236898c1f6d58015f1e96a25bd4406e4f86ff9cd4b70ffc72f00c68d1f WatchSource:0}: Error finding container a2a631236898c1f6d58015f1e96a25bd4406e4f86ff9cd4b70ffc72f00c68d1f: Status 404 returned error can't find the container with id a2a631236898c1f6d58015f1e96a25bd4406e4f86ff9cd4b70ffc72f00c68d1f
Apr 21 15:11:41.525045 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:41.524956 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sm5q9" event={"ID":"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88","Type":"ContainerStarted","Data":"a2a631236898c1f6d58015f1e96a25bd4406e4f86ff9cd4b70ffc72f00c68d1f"}
Apr 21 15:11:44.532566 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:44.532482 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sm5q9" event={"ID":"0bc3eca4-5765-4b43-a4f4-51b55c9f8d88","Type":"ContainerStarted","Data":"099d781a5f1714bcaee857c28c61f5044920554d4dc3c22ca43521f83e2cdcea"}
Apr 21 15:11:44.533027 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:44.532621 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:11:44.549236 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:11:44.549112 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sm5q9" podStartSLOduration=65.697651928 podStartE2EDuration="1m8.549093712s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC" firstStartedPulling="2026-04-21 15:11:41.271534235 +0000 UTC m=+65.677942487" lastFinishedPulling="2026-04-21 15:11:44.122976027 +0000 UTC m=+68.529384271" observedRunningTime="2026-04-21 15:11:44.548424176 +0000 UTC m=+68.954832433" watchObservedRunningTime="2026-04-21 15:11:44.549093712 +0000 UTC m=+68.955501975"
Apr 21 15:12:12.378342 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:12.378305 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj"
Apr 21 15:12:12.378342 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:12.378345 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h"
Apr 21 15:12:12.378776 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:12.378434 2544 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:12:12.378776 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:12.378438 2544 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:12:12.378776 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:12.378502 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert podName:a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059 nodeName:}" failed. No retries permitted until 2026-04-21 15:13:16.378487826 +0000 UTC m=+160.784896068 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert") pod "ingress-canary-q5k4h" (UID: "a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059") : secret "canary-serving-cert" not found
Apr 21 15:12:12.378776 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:12.378516 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls podName:18eaf2c1-6533-4b15-a759-c0e039abbd8f nodeName:}" failed. No retries permitted until 2026-04-21 15:13:16.378510946 +0000 UTC m=+160.784919184 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls") pod "dns-default-dwdcj" (UID: "18eaf2c1-6533-4b15-a759-c0e039abbd8f") : secret "dns-default-metrics-tls" not found
Apr 21 15:12:15.536671 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:15.536637 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sm5q9"
Apr 21 15:12:39.567672 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.567636 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79765d5944-b66kf"]
Apr 21 15:12:39.570462 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.570445 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.573728 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.573704 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 21 15:12:39.575109 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.575081 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 21 15:12:39.575289 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.575096 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 21 15:12:39.575289 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.575118 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 15:12:39.575897 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.575875 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 21 15:12:39.576023 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.575932 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 15:12:39.576127 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.576112 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-cpsln\""
Apr 21 15:12:39.600175 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.600147 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79765d5944-b66kf"]
Apr 21 15:12:39.673344 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.673309 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-n8zqd"]
Apr 21 15:12:39.673804 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.673785 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.673871 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.673827 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.673947 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.673929 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-default-certificate\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.674000 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.673964 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-stats-auth\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.674055 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.673997 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpmmz\" (UniqueName: \"kubernetes.io/projected/75433e2f-ffe1-4177-857f-37e16ba4a802-kube-api-access-bpmmz\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.676011 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.675996 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.680128 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.680098 2544 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"openshift-insights-serving-cert\" is forbidden: User \"system:node:ip-10-0-137-168.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-168.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" type="*v1.Secret"
Apr 21 15:12:39.680128 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.680096 2544 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"operator-dockercfg-x76cb\" is forbidden: User \"system:node:ip-10-0-137-168.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-168.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"operator-dockercfg-x76cb\"" type="*v1.Secret"
Apr 21 15:12:39.680361 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.680342 2544 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"service-ca-bundle\" is forbidden: User \"system:node:ip-10-0-137-168.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-168.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" type="*v1.ConfigMap"
Apr 21 15:12:39.680640 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.680620 2544 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:ip-10-0-137-168.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-168.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" type="*v1.ConfigMap"
Apr 21 15:12:39.680719 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.680636 2544 status_manager.go:895] "Failed to get status for pod" podUID="f2e49b4d-d05c-4693-9ac9-190627c56f55" pod="openshift-insights/insights-operator-585dfdc468-n8zqd" err="pods \"insights-operator-585dfdc468-n8zqd\" is forbidden: User \"system:node:ip-10-0-137-168.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-168.ec2.internal' and this object"
Apr 21 15:12:39.681521 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.681492 2544 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-137-168.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-168.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap"
Apr 21 15:12:39.685647 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.685624 2544 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-137-168.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-137-168.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap"
Apr 21 15:12:39.697308 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.697282 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-n8zqd"]
Apr 21 15:12:39.757157 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.757124 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d985b498b-8lcdx"]
Apr 21 15:12:39.759858 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.759842 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx"
Apr 21 15:12:39.765379 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.765343 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 15:12:39.765557 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.765380 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 15:12:39.765557 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.765463 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gcqst\""
Apr 21 15:12:39.765557 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.765349 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 15:12:39.774282 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.774254 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.774430 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.774297 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.774430 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.774320 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2e49b4d-d05c-4693-9ac9-190627c56f55-tmp\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.774430 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.774372 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e49b4d-d05c-4693-9ac9-190627c56f55-serving-cert\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.774430 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.774399 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-default-certificate\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.774430 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.774417 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle podName:75433e2f-ffe1-4177-857f-37e16ba4a802 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:40.274396744 +0000 UTC m=+124.680805004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle") pod "router-default-79765d5944-b66kf" (UID: "75433e2f-ffe1-4177-857f-37e16ba4a802") : configmap references non-existent config key: service-ca.crt
Apr 21 15:12:39.774729 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.774484 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-stats-auth\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.774729 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.774514 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpmmz\" (UniqueName: \"kubernetes.io/projected/75433e2f-ffe1-4177-857f-37e16ba4a802-kube-api-access-bpmmz\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.774729 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.774544 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f2e49b4d-d05c-4693-9ac9-190627c56f55-snapshots\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.774729 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.774568 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-service-ca-bundle\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.774729 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.774657 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.774729 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.774708 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jfd\" (UniqueName: \"kubernetes.io/projected/f2e49b4d-d05c-4693-9ac9-190627c56f55-kube-api-access-t2jfd\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.774968 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.774781 2544 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 15:12:39.774968 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.774827 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs podName:75433e2f-ffe1-4177-857f-37e16ba4a802 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:40.274814347 +0000 UTC m=+124.681222585 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs") pod "router-default-79765d5944-b66kf" (UID: "75433e2f-ffe1-4177-857f-37e16ba4a802") : secret "router-metrics-certs-default" not found
Apr 21 15:12:39.776785 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.776740 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-default-certificate\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.776881 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.776806 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-stats-auth\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.781887 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.781864 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 15:12:39.793627 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.793596 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d985b498b-8lcdx"]
Apr 21 15:12:39.796403 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.796372 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpmmz\" (UniqueName: \"kubernetes.io/projected/75433e2f-ffe1-4177-857f-37e16ba4a802-kube-api-access-bpmmz\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf"
Apr 21 15:12:39.841031 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.840951 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lnq2b"]
Apr 21 15:12:39.843848 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.843831 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lnq2b"
Apr 21 15:12:39.847676 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.847658 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-bv7vl\""
Apr 21 15:12:39.849970 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.849948 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s"]
Apr 21 15:12:39.852763 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.852733 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s"
Apr 21 15:12:39.857186 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.857165 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-zn6rz\""
Apr 21 15:12:39.857529 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.857513 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 21 15:12:39.857859 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.857844 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:12:39.860566 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.860549 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 21 15:12:39.863068 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.863045 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lnq2b"]
Apr 21 15:12:39.874483 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.874460 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s"]
Apr 21 15:12:39.875251 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875225 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjj5p\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-kube-api-access-fjj5p\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx"
Apr 21 15:12:39.875388 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875288 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jfd\" (UniqueName: \"kubernetes.io/projected/f2e49b4d-d05c-4693-9ac9-190627c56f55-kube-api-access-t2jfd\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.875388 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875349 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-image-registry-private-configuration\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx"
Apr 21 15:12:39.875587 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875385 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-certificates\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx"
Apr 21 15:12:39.875587 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875415 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-bound-sa-token\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx"
Apr 21 15:12:39.875587 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875452 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.875587 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875485 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4e6e922-a37d-437a-ac4e-43b6b4333c85-ca-trust-extracted\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx"
Apr 21 15:12:39.875587 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875508 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-trusted-ca\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx"
Apr 21 15:12:39.875587 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875562 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2e49b4d-d05c-4693-9ac9-190627c56f55-tmp\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.875587 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875588 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx"
Apr 21 15:12:39.875913 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875654 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e49b4d-d05c-4693-9ac9-190627c56f55-serving-cert\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.875913 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875690 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-installation-pull-secrets\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx"
Apr 21 15:12:39.875913 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875824 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f2e49b4d-d05c-4693-9ac9-190627c56f55-snapshots\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.875913 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875864 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-service-ca-bundle\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.876082 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.875986 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2e49b4d-d05c-4693-9ac9-190627c56f55-tmp\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.876312 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.876291 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f2e49b4d-d05c-4693-9ac9-190627c56f55-snapshots\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd"
Apr 21 15:12:39.977035 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.976981 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlcsg\" (UniqueName: \"kubernetes.io/projected/a5a141dd-cdcf-4c40-ba20-236924041f30-kube-api-access-qlcsg\") pod \"network-check-source-8894fc9bd-lnq2b\"
(UID: \"a5a141dd-cdcf-4c40-ba20-236924041f30\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lnq2b" Apr 21 15:12:39.977035 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.977033 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-certificates\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.977289 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.977051 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf7qf\" (UniqueName: \"kubernetes.io/projected/eb57911b-462f-4738-9ef6-dcc01747de8f-kube-api-access-gf7qf\") pod \"cluster-samples-operator-6dc5bdb6b4-v6q7s\" (UID: \"eb57911b-462f-4738-9ef6-dcc01747de8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:39.977289 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.977110 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4e6e922-a37d-437a-ac4e-43b6b4333c85-ca-trust-extracted\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.977289 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.977142 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-trusted-ca\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.977289 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.977185 
2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-installation-pull-secrets\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.977289 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.977225 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjj5p\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-kube-api-access-fjj5p\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.977289 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.977257 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-bound-sa-token\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.977289 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.977288 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-image-registry-private-configuration\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.977671 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.977321 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6q7s\" (UID: \"eb57911b-462f-4738-9ef6-dcc01747de8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:39.977671 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.977358 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.977671 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.977486 2544 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:12:39.977671 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.977500 2544 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d985b498b-8lcdx: secret "image-registry-tls" not found Apr 21 15:12:39.977671 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.977514 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4e6e922-a37d-437a-ac4e-43b6b4333c85-ca-trust-extracted\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.977671 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:39.977550 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls podName:f4e6e922-a37d-437a-ac4e-43b6b4333c85 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:40.4775329 +0000 UTC m=+124.883941145 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls") pod "image-registry-6d985b498b-8lcdx" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85") : secret "image-registry-tls" not found Apr 21 15:12:39.978062 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.978043 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-trusted-ca\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.978190 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.978176 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-certificates\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.979858 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.979841 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-installation-pull-secrets\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:39.979947 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:39.979921 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-image-registry-private-configuration\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " 
pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:40.011615 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.011583 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-bound-sa-token\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:40.011766 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.011736 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjj5p\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-kube-api-access-fjj5p\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:40.078310 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.078269 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6q7s\" (UID: \"eb57911b-462f-4738-9ef6-dcc01747de8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:40.078470 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.078332 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlcsg\" (UniqueName: \"kubernetes.io/projected/a5a141dd-cdcf-4c40-ba20-236924041f30-kube-api-access-qlcsg\") pod \"network-check-source-8894fc9bd-lnq2b\" (UID: \"a5a141dd-cdcf-4c40-ba20-236924041f30\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lnq2b" Apr 21 15:12:40.078470 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.078424 2544 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 15:12:40.078547 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.078484 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls podName:eb57911b-462f-4738-9ef6-dcc01747de8f nodeName:}" failed. No retries permitted until 2026-04-21 15:12:40.578468432 +0000 UTC m=+124.984876669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-v6q7s" (UID: "eb57911b-462f-4738-9ef6-dcc01747de8f") : secret "samples-operator-tls" not found Apr 21 15:12:40.078547 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.078511 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gf7qf\" (UniqueName: \"kubernetes.io/projected/eb57911b-462f-4738-9ef6-dcc01747de8f-kube-api-access-gf7qf\") pod \"cluster-samples-operator-6dc5bdb6b4-v6q7s\" (UID: \"eb57911b-462f-4738-9ef6-dcc01747de8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:40.094614 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.094559 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlcsg\" (UniqueName: \"kubernetes.io/projected/a5a141dd-cdcf-4c40-ba20-236924041f30-kube-api-access-qlcsg\") pod \"network-check-source-8894fc9bd-lnq2b\" (UID: \"a5a141dd-cdcf-4c40-ba20-236924041f30\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lnq2b" Apr 21 15:12:40.100239 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.100217 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf7qf\" (UniqueName: 
\"kubernetes.io/projected/eb57911b-462f-4738-9ef6-dcc01747de8f-kube-api-access-gf7qf\") pod \"cluster-samples-operator-6dc5bdb6b4-v6q7s\" (UID: \"eb57911b-462f-4738-9ef6-dcc01747de8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:40.153371 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.153327 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lnq2b" Apr 21 15:12:40.272771 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.272721 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lnq2b"] Apr 21 15:12:40.275925 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:12:40.275896 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5a141dd_cdcf_4c40_ba20_236924041f30.slice/crio-a6af6d3efa076003d9b486266895158dd8f369f7e5ca46021a07604304b54f3f WatchSource:0}: Error finding container a6af6d3efa076003d9b486266895158dd8f369f7e5ca46021a07604304b54f3f: Status 404 returned error can't find the container with id a6af6d3efa076003d9b486266895158dd8f369f7e5ca46021a07604304b54f3f Apr 21 15:12:40.280211 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.280190 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:12:40.280283 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.280233 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs\") pod 
\"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:12:40.280367 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.280346 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle podName:75433e2f-ffe1-4177-857f-37e16ba4a802 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.280327514 +0000 UTC m=+125.686735757 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle") pod "router-default-79765d5944-b66kf" (UID: "75433e2f-ffe1-4177-857f-37e16ba4a802") : configmap references non-existent config key: service-ca.crt Apr 21 15:12:40.280474 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.280403 2544 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 15:12:40.280474 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.280465 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs podName:75433e2f-ffe1-4177-857f-37e16ba4a802 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.280451956 +0000 UTC m=+125.686860198 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs") pod "router-default-79765d5944-b66kf" (UID: "75433e2f-ffe1-4177-857f-37e16ba4a802") : secret "router-metrics-certs-default" not found Apr 21 15:12:40.482300 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.482208 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:40.482467 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.482356 2544 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:12:40.482467 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.482376 2544 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d985b498b-8lcdx: secret "image-registry-tls" not found Apr 21 15:12:40.482467 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.482427 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls podName:f4e6e922-a37d-437a-ac4e-43b6b4333c85 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.482412459 +0000 UTC m=+125.888820701 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls") pod "image-registry-6d985b498b-8lcdx" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85") : secret "image-registry-tls" not found Apr 21 15:12:40.583360 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.583322 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6q7s\" (UID: \"eb57911b-462f-4738-9ef6-dcc01747de8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:40.583731 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.583475 2544 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 15:12:40.583731 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.583539 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls podName:eb57911b-462f-4738-9ef6-dcc01747de8f nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.58352335 +0000 UTC m=+125.989931607 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-v6q7s" (UID: "eb57911b-462f-4738-9ef6-dcc01747de8f") : secret "samples-operator-tls" not found Apr 21 15:12:40.642902 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.642863 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lnq2b" event={"ID":"a5a141dd-cdcf-4c40-ba20-236924041f30","Type":"ContainerStarted","Data":"78e399bc2249f3d2e363a86e554f596f6357533d4917a18aa6360b6894c94922"} Apr 21 15:12:40.643055 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.642907 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lnq2b" event={"ID":"a5a141dd-cdcf-4c40-ba20-236924041f30","Type":"ContainerStarted","Data":"a6af6d3efa076003d9b486266895158dd8f369f7e5ca46021a07604304b54f3f"} Apr 21 15:12:40.660797 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.660747 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 15:12:40.668263 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.668220 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lnq2b" podStartSLOduration=1.668204377 podStartE2EDuration="1.668204377s" podCreationTimestamp="2026-04-21 15:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:12:40.667775088 +0000 UTC m=+125.074183352" watchObservedRunningTime="2026-04-21 15:12:40.668204377 +0000 UTC m=+125.074612637" Apr 21 15:12:40.860841 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.860810 2544 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 15:12:40.868952 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:40.868922 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2e49b4d-d05c-4693-9ac9-190627c56f55-serving-cert\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd" Apr 21 15:12:40.876473 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.876441 2544 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Apr 21 15:12:40.876595 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.876513 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-trusted-ca-bundle podName:f2e49b4d-d05c-4693-9ac9-190627c56f55 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.376496791 +0000 UTC m=+125.782905028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-trusted-ca-bundle") pod "insights-operator-585dfdc468-n8zqd" (UID: "f2e49b4d-d05c-4693-9ac9-190627c56f55") : failed to sync configmap cache: timed out waiting for the condition Apr 21 15:12:40.876595 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.876453 2544 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Apr 21 15:12:40.876669 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.876599 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-service-ca-bundle podName:f2e49b4d-d05c-4693-9ac9-190627c56f55 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:12:41.376581749 +0000 UTC m=+125.782989991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-service-ca-bundle") pod "insights-operator-585dfdc468-n8zqd" (UID: "f2e49b4d-d05c-4693-9ac9-190627c56f55") : failed to sync configmap cache: timed out waiting for the condition Apr 21 15:12:40.902855 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.902815 2544 projected.go:289] Couldn't get configMap openshift-insights/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Apr 21 15:12:40.902855 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.902860 2544 projected.go:194] Error preparing data for projected volume kube-api-access-t2jfd for pod openshift-insights/insights-operator-585dfdc468-n8zqd: failed to sync configmap cache: timed out waiting for the condition Apr 21 15:12:40.903069 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:40.902933 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2e49b4d-d05c-4693-9ac9-190627c56f55-kube-api-access-t2jfd podName:f2e49b4d-d05c-4693-9ac9-190627c56f55 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.402915263 +0000 UTC m=+125.809323506 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-t2jfd" (UniqueName: "kubernetes.io/projected/f2e49b4d-d05c-4693-9ac9-190627c56f55-kube-api-access-t2jfd") pod "insights-operator-585dfdc468-n8zqd" (UID: "f2e49b4d-d05c-4693-9ac9-190627c56f55") : failed to sync configmap cache: timed out waiting for the condition Apr 21 15:12:41.121272 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.121188 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-x76cb\"" Apr 21 15:12:41.186972 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.186931 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 15:12:41.270118 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.270083 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 15:12:41.283834 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.283805 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 15:12:41.290134 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.290102 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:12:41.290273 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.290178 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " 
pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:12:41.290315 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:41.290267 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle podName:75433e2f-ffe1-4177-857f-37e16ba4a802 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:43.290248871 +0000 UTC m=+127.696657127 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle") pod "router-default-79765d5944-b66kf" (UID: "75433e2f-ffe1-4177-857f-37e16ba4a802") : configmap references non-existent config key: service-ca.crt Apr 21 15:12:41.290363 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:41.290309 2544 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 15:12:41.290363 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:41.290357 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs podName:75433e2f-ffe1-4177-857f-37e16ba4a802 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:43.290345077 +0000 UTC m=+127.696753315 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs") pod "router-default-79765d5944-b66kf" (UID: "75433e2f-ffe1-4177-857f-37e16ba4a802") : secret "router-metrics-certs-default" not found Apr 21 15:12:41.391609 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.391514 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-service-ca-bundle\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd" Apr 21 15:12:41.391785 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.391683 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd" Apr 21 15:12:41.392280 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.392256 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-service-ca-bundle\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd" Apr 21 15:12:41.392593 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.392570 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2e49b4d-d05c-4693-9ac9-190627c56f55-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " 
pod="openshift-insights/insights-operator-585dfdc468-n8zqd" Apr 21 15:12:41.493016 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.492976 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jfd\" (UniqueName: \"kubernetes.io/projected/f2e49b4d-d05c-4693-9ac9-190627c56f55-kube-api-access-t2jfd\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd" Apr 21 15:12:41.493227 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.493031 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:41.493227 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:41.493140 2544 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:12:41.493227 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:41.493150 2544 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d985b498b-8lcdx: secret "image-registry-tls" not found Apr 21 15:12:41.493227 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:41.493201 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls podName:f4e6e922-a37d-437a-ac4e-43b6b4333c85 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:43.493180953 +0000 UTC m=+127.899589194 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls") pod "image-registry-6d985b498b-8lcdx" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85") : secret "image-registry-tls" not found Apr 21 15:12:41.495529 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.495500 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jfd\" (UniqueName: \"kubernetes.io/projected/f2e49b4d-d05c-4693-9ac9-190627c56f55-kube-api-access-t2jfd\") pod \"insights-operator-585dfdc468-n8zqd\" (UID: \"f2e49b4d-d05c-4693-9ac9-190627c56f55\") " pod="openshift-insights/insights-operator-585dfdc468-n8zqd" Apr 21 15:12:41.593898 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.593861 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6q7s\" (UID: \"eb57911b-462f-4738-9ef6-dcc01747de8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:41.594273 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:41.594003 2544 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 15:12:41.594273 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:41.594066 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls podName:eb57911b-462f-4738-9ef6-dcc01747de8f nodeName:}" failed. No retries permitted until 2026-04-21 15:12:43.594052261 +0000 UTC m=+128.000460502 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-v6q7s" (UID: "eb57911b-462f-4738-9ef6-dcc01747de8f") : secret "samples-operator-tls" not found Apr 21 15:12:41.785615 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.785513 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-n8zqd" Apr 21 15:12:41.906226 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:41.906191 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-n8zqd"] Apr 21 15:12:41.909015 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:12:41.908978 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e49b4d_d05c_4693_9ac9_190627c56f55.slice/crio-1846776ecafd53379f91dd9552e7523b3139b30f7731ecbd68aa30bebd08b6ed WatchSource:0}: Error finding container 1846776ecafd53379f91dd9552e7523b3139b30f7731ecbd68aa30bebd08b6ed: Status 404 returned error can't find the container with id 1846776ecafd53379f91dd9552e7523b3139b30f7731ecbd68aa30bebd08b6ed Apr 21 15:12:42.648543 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:42.648239 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-n8zqd" event={"ID":"f2e49b4d-d05c-4693-9ac9-190627c56f55","Type":"ContainerStarted","Data":"1846776ecafd53379f91dd9552e7523b3139b30f7731ecbd68aa30bebd08b6ed"} Apr 21 15:12:43.309875 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:43.309831 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " 
pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:12:43.310059 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:43.309897 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:12:43.310059 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:43.310005 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle podName:75433e2f-ffe1-4177-857f-37e16ba4a802 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:47.309988068 +0000 UTC m=+131.716396313 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle") pod "router-default-79765d5944-b66kf" (UID: "75433e2f-ffe1-4177-857f-37e16ba4a802") : configmap references non-existent config key: service-ca.crt Apr 21 15:12:43.310059 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:43.310011 2544 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 15:12:43.310223 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:43.310101 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs podName:75433e2f-ffe1-4177-857f-37e16ba4a802 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:47.310082049 +0000 UTC m=+131.716490287 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs") pod "router-default-79765d5944-b66kf" (UID: "75433e2f-ffe1-4177-857f-37e16ba4a802") : secret "router-metrics-certs-default" not found Apr 21 15:12:43.511814 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:43.511776 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:43.512001 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:43.511945 2544 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:12:43.512001 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:43.511966 2544 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d985b498b-8lcdx: secret "image-registry-tls" not found Apr 21 15:12:43.512119 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:43.512039 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls podName:f4e6e922-a37d-437a-ac4e-43b6b4333c85 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:47.51202215 +0000 UTC m=+131.918430388 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls") pod "image-registry-6d985b498b-8lcdx" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85") : secret "image-registry-tls" not found Apr 21 15:12:43.612993 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:43.612903 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6q7s\" (UID: \"eb57911b-462f-4738-9ef6-dcc01747de8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:43.613167 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:43.613067 2544 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 15:12:43.613167 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:43.613130 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls podName:eb57911b-462f-4738-9ef6-dcc01747de8f nodeName:}" failed. No retries permitted until 2026-04-21 15:12:47.613114968 +0000 UTC m=+132.019523206 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-v6q7s" (UID: "eb57911b-462f-4738-9ef6-dcc01747de8f") : secret "samples-operator-tls" not found Apr 21 15:12:44.456386 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.456342 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf"] Apr 21 15:12:44.459576 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.459554 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf" Apr 21 15:12:44.466724 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.466696 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 15:12:44.468011 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.467996 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 15:12:44.468070 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.467997 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-tgcpn\"" Apr 21 15:12:44.482377 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.482343 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf"] Apr 21 15:12:44.520247 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.520209 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bdlz\" (UniqueName: \"kubernetes.io/projected/d450b0ff-67e3-4fe9-946f-9320d97a5efd-kube-api-access-8bdlz\") pod \"migrator-74bb7799d9-96kdf\" (UID: \"d450b0ff-67e3-4fe9-946f-9320d97a5efd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf" Apr 21 15:12:44.621641 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.621609 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bdlz\" (UniqueName: \"kubernetes.io/projected/d450b0ff-67e3-4fe9-946f-9320d97a5efd-kube-api-access-8bdlz\") pod \"migrator-74bb7799d9-96kdf\" (UID: \"d450b0ff-67e3-4fe9-946f-9320d97a5efd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf" Apr 21 15:12:44.648345 ip-10-0-137-168 
kubenswrapper[2544]: I0421 15:12:44.648311 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bdlz\" (UniqueName: \"kubernetes.io/projected/d450b0ff-67e3-4fe9-946f-9320d97a5efd-kube-api-access-8bdlz\") pod \"migrator-74bb7799d9-96kdf\" (UID: \"d450b0ff-67e3-4fe9-946f-9320d97a5efd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf" Apr 21 15:12:44.654372 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.654333 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-n8zqd" event={"ID":"f2e49b4d-d05c-4693-9ac9-190627c56f55","Type":"ContainerStarted","Data":"3fc6a4e059e9a531a438f55d4f68ba89e43c66024c8f7e129903f5bc50d9bac6"} Apr 21 15:12:44.694421 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.694363 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-n8zqd" podStartSLOduration=3.635160511 podStartE2EDuration="5.694346001s" podCreationTimestamp="2026-04-21 15:12:39 +0000 UTC" firstStartedPulling="2026-04-21 15:12:41.910796312 +0000 UTC m=+126.317204550" lastFinishedPulling="2026-04-21 15:12:43.969981801 +0000 UTC m=+128.376390040" observedRunningTime="2026-04-21 15:12:44.692738265 +0000 UTC m=+129.099146524" watchObservedRunningTime="2026-04-21 15:12:44.694346001 +0000 UTC m=+129.100754264" Apr 21 15:12:44.768119 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.768016 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf" Apr 21 15:12:44.911078 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.911050 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf"] Apr 21 15:12:44.913814 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:12:44.913786 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd450b0ff_67e3_4fe9_946f_9320d97a5efd.slice/crio-3388fe3ba99705be3a41de08833fd36327999db0d8c1c9eb40e2577f0c74b2cd WatchSource:0}: Error finding container 3388fe3ba99705be3a41de08833fd36327999db0d8c1c9eb40e2577f0c74b2cd: Status 404 returned error can't find the container with id 3388fe3ba99705be3a41de08833fd36327999db0d8c1c9eb40e2577f0c74b2cd Apr 21 15:12:44.923959 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:44.923935 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:12:44.924083 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:44.924065 2544 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 15:12:44.924145 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:44.924135 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs podName:3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428 nodeName:}" failed. No retries permitted until 2026-04-21 15:14:46.924114905 +0000 UTC m=+251.330523162 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs") pod "network-metrics-daemon-84hkv" (UID: "3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428") : secret "metrics-daemon-secret" not found Apr 21 15:12:45.658139 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:45.658090 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf" event={"ID":"d450b0ff-67e3-4fe9-946f-9320d97a5efd","Type":"ContainerStarted","Data":"3388fe3ba99705be3a41de08833fd36327999db0d8c1c9eb40e2577f0c74b2cd"} Apr 21 15:12:46.665251 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:46.665217 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf" event={"ID":"d450b0ff-67e3-4fe9-946f-9320d97a5efd","Type":"ContainerStarted","Data":"227b14247448eae9cdea51a42dc3d1ea3f89c13378f6e2d03e074a2cdb0c15ae"} Apr 21 15:12:46.665251 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:46.665255 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf" event={"ID":"d450b0ff-67e3-4fe9-946f-9320d97a5efd","Type":"ContainerStarted","Data":"6f5a041d89153e346041b3755f1a582436d04f9ff46ffc731bdbb352fc6cf222"} Apr 21 15:12:46.687743 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:46.687695 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-96kdf" podStartSLOduration=1.552718485 podStartE2EDuration="2.687680517s" podCreationTimestamp="2026-04-21 15:12:44 +0000 UTC" firstStartedPulling="2026-04-21 15:12:44.916061463 +0000 UTC m=+129.322469702" lastFinishedPulling="2026-04-21 15:12:46.051023493 +0000 UTC m=+130.457431734" observedRunningTime="2026-04-21 15:12:46.686029835 +0000 UTC m=+131.092438096" watchObservedRunningTime="2026-04-21 15:12:46.687680517 +0000 
UTC m=+131.094088774" Apr 21 15:12:47.342840 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:47.342795 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:12:47.343019 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:47.342851 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:12:47.343019 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:47.342932 2544 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 15:12:47.343019 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:47.342972 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle podName:75433e2f-ffe1-4177-857f-37e16ba4a802 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:55.342952011 +0000 UTC m=+139.749360249 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle") pod "router-default-79765d5944-b66kf" (UID: "75433e2f-ffe1-4177-857f-37e16ba4a802") : configmap references non-existent config key: service-ca.crt Apr 21 15:12:47.343131 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:47.343040 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs podName:75433e2f-ffe1-4177-857f-37e16ba4a802 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:55.343019592 +0000 UTC m=+139.749427833 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs") pod "router-default-79765d5944-b66kf" (UID: "75433e2f-ffe1-4177-857f-37e16ba4a802") : secret "router-metrics-certs-default" not found Apr 21 15:12:47.544812 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:47.544766 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:47.545018 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:47.544940 2544 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:12:47.545018 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:47.544967 2544 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d985b498b-8lcdx: secret "image-registry-tls" not found Apr 21 15:12:47.545139 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:47.545033 2544 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls podName:f4e6e922-a37d-437a-ac4e-43b6b4333c85 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:55.545012374 +0000 UTC m=+139.951420612 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls") pod "image-registry-6d985b498b-8lcdx" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85") : secret "image-registry-tls" not found Apr 21 15:12:47.646130 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:47.646037 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6q7s\" (UID: \"eb57911b-462f-4738-9ef6-dcc01747de8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:47.646305 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:47.646156 2544 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 15:12:47.646305 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:47.646220 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls podName:eb57911b-462f-4738-9ef6-dcc01747de8f nodeName:}" failed. No retries permitted until 2026-04-21 15:12:55.646205665 +0000 UTC m=+140.052613903 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-v6q7s" (UID: "eb57911b-462f-4738-9ef6-dcc01747de8f") : secret "samples-operator-tls" not found Apr 21 15:12:48.273516 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:48.273489 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tbnxq_c5594cf9-4b09-4077-9a0a-c6e1e4145792/dns-node-resolver/0.log" Apr 21 15:12:49.071355 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:49.071329 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9lsjs_c23d088b-b54c-4874-aac7-248c3a09117a/node-ca/0.log" Apr 21 15:12:50.275670 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:50.275638 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-96kdf_d450b0ff-67e3-4fe9-946f-9320d97a5efd/migrator/0.log" Apr 21 15:12:50.470552 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:50.470522 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-96kdf_d450b0ff-67e3-4fe9-946f-9320d97a5efd/graceful-termination/0.log" Apr 21 15:12:55.410352 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:55.410292 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:12:55.410352 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:55.410369 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:12:55.410822 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:12:55.410506 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle podName:75433e2f-ffe1-4177-857f-37e16ba4a802 nodeName:}" failed. No retries permitted until 2026-04-21 15:13:11.410478503 +0000 UTC m=+155.816886742 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle") pod "router-default-79765d5944-b66kf" (UID: "75433e2f-ffe1-4177-857f-37e16ba4a802") : configmap references non-existent config key: service-ca.crt Apr 21 15:12:55.412714 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:55.412691 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75433e2f-ffe1-4177-857f-37e16ba4a802-metrics-certs\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:12:55.612905 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:55.612864 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:55.615255 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:55.615225 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls\") pod \"image-registry-6d985b498b-8lcdx\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:55.671909 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:55.671820 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:55.713483 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:55.713446 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6q7s\" (UID: \"eb57911b-462f-4738-9ef6-dcc01747de8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:55.715927 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:55.715903 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/eb57911b-462f-4738-9ef6-dcc01747de8f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-v6q7s\" (UID: \"eb57911b-462f-4738-9ef6-dcc01747de8f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:55.760003 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:55.759968 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" Apr 21 15:12:55.822155 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:55.822122 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d985b498b-8lcdx"] Apr 21 15:12:55.823729 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:12:55.823698 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4e6e922_a37d_437a_ac4e_43b6b4333c85.slice/crio-8a81678837c51cd7d75cfd640087c76f1d9da3152c1159b6e8b2c45c050fa292 WatchSource:0}: Error finding container 8a81678837c51cd7d75cfd640087c76f1d9da3152c1159b6e8b2c45c050fa292: Status 404 returned error can't find the container with id 8a81678837c51cd7d75cfd640087c76f1d9da3152c1159b6e8b2c45c050fa292 Apr 21 15:12:55.914078 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:55.913960 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s"] Apr 21 15:12:56.693097 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:56.693049 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" event={"ID":"eb57911b-462f-4738-9ef6-dcc01747de8f","Type":"ContainerStarted","Data":"e0b484c7ea6766dde49c28ffa10e25fd170cad20665e6a08ecec7318b08b3059"} Apr 21 15:12:56.694663 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:56.694627 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" event={"ID":"f4e6e922-a37d-437a-ac4e-43b6b4333c85","Type":"ContainerStarted","Data":"28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639"} Apr 21 15:12:56.694663 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:56.694666 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" event={"ID":"f4e6e922-a37d-437a-ac4e-43b6b4333c85","Type":"ContainerStarted","Data":"8a81678837c51cd7d75cfd640087c76f1d9da3152c1159b6e8b2c45c050fa292"} Apr 21 15:12:56.694888 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:56.694801 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:12:56.717054 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:56.716907 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" podStartSLOduration=17.716888207 podStartE2EDuration="17.716888207s" podCreationTimestamp="2026-04-21 15:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:12:56.716233278 +0000 UTC m=+141.122641539" watchObservedRunningTime="2026-04-21 15:12:56.716888207 +0000 UTC m=+141.123296469" Apr 21 15:12:57.698228 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:57.698192 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" event={"ID":"eb57911b-462f-4738-9ef6-dcc01747de8f","Type":"ContainerStarted","Data":"1b9826020923ba69c3f6d46c4ecd31ef2701c9eef623825b594a5ca4742bdfc5"} Apr 21 15:12:58.701829 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:58.701790 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" event={"ID":"eb57911b-462f-4738-9ef6-dcc01747de8f","Type":"ContainerStarted","Data":"75e999074e493ba25d44ea230b0c16353001ea4473e6ad937d6aba9e64827994"} Apr 21 15:12:58.725527 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:12:58.725472 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-v6q7s" podStartSLOduration=18.098863902 podStartE2EDuration="19.725455902s" podCreationTimestamp="2026-04-21 15:12:39 +0000 UTC" firstStartedPulling="2026-04-21 15:12:55.952991244 +0000 UTC m=+140.359399481" lastFinishedPulling="2026-04-21 15:12:57.579583229 +0000 UTC m=+141.985991481" observedRunningTime="2026-04-21 15:12:58.724082373 +0000 UTC m=+143.130490632" watchObservedRunningTime="2026-04-21 15:12:58.725455902 +0000 UTC m=+143.131864160" Apr 21 15:13:11.443495 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:11.443460 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:13:11.444052 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:11.444033 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75433e2f-ffe1-4177-857f-37e16ba4a802-service-ca-bundle\") pod \"router-default-79765d5944-b66kf\" (UID: \"75433e2f-ffe1-4177-857f-37e16ba4a802\") " pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:13:11.467597 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:13:11.467557 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-dwdcj" podUID="18eaf2c1-6533-4b15-a759-c0e039abbd8f" Apr 21 15:13:11.480742 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:13:11.480711 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-ingress-canary/ingress-canary-q5k4h" podUID="a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059" Apr 21 15:13:11.678765 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:11.678724 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:13:11.732088 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:11.731820 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dwdcj" Apr 21 15:13:11.802184 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:11.802151 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79765d5944-b66kf"] Apr 21 15:13:11.805666 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:13:11.805634 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75433e2f_ffe1_4177_857f_37e16ba4a802.slice/crio-5c29f2568a6f876a0c7be9325f075e25dedfaf7b397fadb55c38fafd94bfe8b0 WatchSource:0}: Error finding container 5c29f2568a6f876a0c7be9325f075e25dedfaf7b397fadb55c38fafd94bfe8b0: Status 404 returned error can't find the container with id 5c29f2568a6f876a0c7be9325f075e25dedfaf7b397fadb55c38fafd94bfe8b0 Apr 21 15:13:12.735177 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.735136 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79765d5944-b66kf" event={"ID":"75433e2f-ffe1-4177-857f-37e16ba4a802","Type":"ContainerStarted","Data":"f31277930fc7c83e5e57368555adbb290783acfbd744eb92e5c9432462ce7272"} Apr 21 15:13:12.735177 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.735172 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79765d5944-b66kf" event={"ID":"75433e2f-ffe1-4177-857f-37e16ba4a802","Type":"ContainerStarted","Data":"5c29f2568a6f876a0c7be9325f075e25dedfaf7b397fadb55c38fafd94bfe8b0"} Apr 21 15:13:12.818939 ip-10-0-137-168 
kubenswrapper[2544]: I0421 15:13:12.818889 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79765d5944-b66kf" podStartSLOduration=33.818874448 podStartE2EDuration="33.818874448s" podCreationTimestamp="2026-04-21 15:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:13:12.765515768 +0000 UTC m=+157.171924028" watchObservedRunningTime="2026-04-21 15:13:12.818874448 +0000 UTC m=+157.225282731" Apr 21 15:13:12.819380 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.819363 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-98544"] Apr 21 15:13:12.822675 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.822658 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:12.827322 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.827294 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5pq58\"" Apr 21 15:13:12.827466 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.827298 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 15:13:12.827466 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.827344 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 15:13:12.842183 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.842149 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-98544"] Apr 21 15:13:12.878287 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.878239 2544 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-6d985b498b-8lcdx"] Apr 21 15:13:12.882259 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.882228 2544 patch_prober.go:28] interesting pod/image-registry-6d985b498b-8lcdx container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 15:13:12.882397 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.882277 2544 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" podUID="f4e6e922-a37d-437a-ac4e-43b6b4333c85" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:13:12.954233 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.954199 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5e616a4d-2a3a-42c3-be82-9a795c5fa152-crio-socket\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:12.954233 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.954236 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5e616a4d-2a3a-42c3-be82-9a795c5fa152-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:12.954431 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.954260 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/5e616a4d-2a3a-42c3-be82-9a795c5fa152-data-volume\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:12.954431 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.954318 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8jl8\" (UniqueName: \"kubernetes.io/projected/5e616a4d-2a3a-42c3-be82-9a795c5fa152-kube-api-access-l8jl8\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:12.954431 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:12.954390 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5e616a4d-2a3a-42c3-be82-9a795c5fa152-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:13.055505 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.055473 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5e616a4d-2a3a-42c3-be82-9a795c5fa152-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:13.055666 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.055565 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5e616a4d-2a3a-42c3-be82-9a795c5fa152-crio-socket\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 
15:13:13.055666 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.055584 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5e616a4d-2a3a-42c3-be82-9a795c5fa152-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:13.055666 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.055610 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5e616a4d-2a3a-42c3-be82-9a795c5fa152-data-volume\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:13.055666 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.055627 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8jl8\" (UniqueName: \"kubernetes.io/projected/5e616a4d-2a3a-42c3-be82-9a795c5fa152-kube-api-access-l8jl8\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:13.055849 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.055771 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5e616a4d-2a3a-42c3-be82-9a795c5fa152-crio-socket\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:13.056011 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.055993 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5e616a4d-2a3a-42c3-be82-9a795c5fa152-data-volume\") pod 
\"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:13.056077 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.056059 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5e616a4d-2a3a-42c3-be82-9a795c5fa152-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:13.057904 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.057885 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5e616a4d-2a3a-42c3-be82-9a795c5fa152-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:13.073440 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.073412 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8jl8\" (UniqueName: \"kubernetes.io/projected/5e616a4d-2a3a-42c3-be82-9a795c5fa152-kube-api-access-l8jl8\") pod \"insights-runtime-extractor-98544\" (UID: \"5e616a4d-2a3a-42c3-be82-9a795c5fa152\") " pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:13.132186 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.132150 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-98544" Apr 21 15:13:13.217650 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:13:13.217599 2544 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-84hkv" podUID="3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428" Apr 21 15:13:13.256899 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.256867 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-98544"] Apr 21 15:13:13.259851 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:13:13.259820 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e616a4d_2a3a_42c3_be82_9a795c5fa152.slice/crio-b0a08ab4aa8c8a8a3ffbf888b1961597e14297af142e7a5de168aa278fff0d5e WatchSource:0}: Error finding container b0a08ab4aa8c8a8a3ffbf888b1961597e14297af142e7a5de168aa278fff0d5e: Status 404 returned error can't find the container with id b0a08ab4aa8c8a8a3ffbf888b1961597e14297af142e7a5de168aa278fff0d5e Apr 21 15:13:13.679133 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.679046 2544 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:13:13.681504 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.681477 2544 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:13:13.738453 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.738415 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-98544" event={"ID":"5e616a4d-2a3a-42c3-be82-9a795c5fa152","Type":"ContainerStarted","Data":"c749c516210f4465de5573d9bbb27ad743509b7822c04522fe8b565b19f90f6b"} Apr 21 15:13:13.738453 
ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.738451 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-98544" event={"ID":"5e616a4d-2a3a-42c3-be82-9a795c5fa152","Type":"ContainerStarted","Data":"b0a08ab4aa8c8a8a3ffbf888b1961597e14297af142e7a5de168aa278fff0d5e"} Apr 21 15:13:13.739640 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.739613 2544 generic.go:358] "Generic (PLEG): container finished" podID="57a556a8-7dd1-4d6a-bba8-4b1896f9915c" containerID="c192e00927820c4b399efb83b447cad578e9b17a91483f4d899b4b31f10afb22" exitCode=255 Apr 21 15:13:13.739776 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.739681 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" event={"ID":"57a556a8-7dd1-4d6a-bba8-4b1896f9915c","Type":"ContainerDied","Data":"c192e00927820c4b399efb83b447cad578e9b17a91483f4d899b4b31f10afb22"} Apr 21 15:13:13.739975 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.739954 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:13:13.741181 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.741162 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79765d5944-b66kf" Apr 21 15:13:13.745966 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:13.745947 2544 scope.go:117] "RemoveContainer" containerID="c192e00927820c4b399efb83b447cad578e9b17a91483f4d899b4b31f10afb22" Apr 21 15:13:14.744698 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:14.744647 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5fd7bb9459-79c7k" event={"ID":"57a556a8-7dd1-4d6a-bba8-4b1896f9915c","Type":"ContainerStarted","Data":"387fa76338ea39bfdb965822d8480ffa97074be0b01a28f960bd4d92f871a96f"} Apr 21 
15:13:14.746502 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:14.746475 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-98544" event={"ID":"5e616a4d-2a3a-42c3-be82-9a795c5fa152","Type":"ContainerStarted","Data":"5baf366abd9a5974b210489dd1a9063fe07487d6621d9e9d80d69d9e12ebc8fc"} Apr 21 15:13:15.750954 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:15.750916 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-98544" event={"ID":"5e616a4d-2a3a-42c3-be82-9a795c5fa152","Type":"ContainerStarted","Data":"61e85a6d79fa32025a970070abf13d71a20aa31715108e609a31d67501b8673c"} Apr 21 15:13:15.780797 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:15.780732 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-98544" podStartSLOduration=1.498452188 podStartE2EDuration="3.780716796s" podCreationTimestamp="2026-04-21 15:13:12 +0000 UTC" firstStartedPulling="2026-04-21 15:13:13.313191285 +0000 UTC m=+157.719599523" lastFinishedPulling="2026-04-21 15:13:15.595455884 +0000 UTC m=+160.001864131" observedRunningTime="2026-04-21 15:13:15.77946294 +0000 UTC m=+160.185871204" watchObservedRunningTime="2026-04-21 15:13:15.780716796 +0000 UTC m=+160.187125056" Apr 21 15:13:16.388063 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:16.388015 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:13:16.388246 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:16.388081 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert\") pod 
\"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:13:16.390493 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:16.390458 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059-cert\") pod \"ingress-canary-q5k4h\" (UID: \"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059\") " pod="openshift-ingress-canary/ingress-canary-q5k4h" Apr 21 15:13:16.390493 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:16.390484 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18eaf2c1-6533-4b15-a759-c0e039abbd8f-metrics-tls\") pod \"dns-default-dwdcj\" (UID: \"18eaf2c1-6533-4b15-a759-c0e039abbd8f\") " pod="openshift-dns/dns-default-dwdcj" Apr 21 15:13:16.543536 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:16.543498 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-76666\"" Apr 21 15:13:16.554185 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:16.554133 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dwdcj" Apr 21 15:13:16.697982 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:16.697906 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dwdcj"] Apr 21 15:13:16.701010 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:13:16.700976 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18eaf2c1_6533_4b15_a759_c0e039abbd8f.slice/crio-d3b0baf7534b5f103d793a0d5f2d071ccb4fe975c7823f41d8ead3d1f8d07439 WatchSource:0}: Error finding container d3b0baf7534b5f103d793a0d5f2d071ccb4fe975c7823f41d8ead3d1f8d07439: Status 404 returned error can't find the container with id d3b0baf7534b5f103d793a0d5f2d071ccb4fe975c7823f41d8ead3d1f8d07439 Apr 21 15:13:16.754051 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:16.754008 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dwdcj" event={"ID":"18eaf2c1-6533-4b15-a759-c0e039abbd8f","Type":"ContainerStarted","Data":"d3b0baf7534b5f103d793a0d5f2d071ccb4fe975c7823f41d8ead3d1f8d07439"} Apr 21 15:13:18.761087 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:18.761043 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dwdcj" event={"ID":"18eaf2c1-6533-4b15-a759-c0e039abbd8f","Type":"ContainerStarted","Data":"4056757149f76e12364f6769d2e42483cf626fa28e35d4a023e0a790a1fe00c7"} Apr 21 15:13:18.761087 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:18.761088 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dwdcj" event={"ID":"18eaf2c1-6533-4b15-a759-c0e039abbd8f","Type":"ContainerStarted","Data":"7063def8ce85cef30321097424cafb1c1ba98a151559a4aa6afbf30a92c586f5"} Apr 21 15:13:18.761567 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:18.761196 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dwdcj" Apr 21 15:13:18.781531 
ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:18.781424 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dwdcj" podStartSLOduration=129.596602292 podStartE2EDuration="2m10.781406066s" podCreationTimestamp="2026-04-21 15:11:08 +0000 UTC" firstStartedPulling="2026-04-21 15:13:16.702918401 +0000 UTC m=+161.109326639" lastFinishedPulling="2026-04-21 15:13:17.887722175 +0000 UTC m=+162.294130413" observedRunningTime="2026-04-21 15:13:18.781128013 +0000 UTC m=+163.187536272" watchObservedRunningTime="2026-04-21 15:13:18.781406066 +0000 UTC m=+163.187814323" Apr 21 15:13:20.341646 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.341610 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77b56f58d-jkshr"] Apr 21 15:13:20.344909 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.344886 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b56f58d-jkshr" Apr 21 15:13:20.354073 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.354052 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 15:13:20.354157 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.354089 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 15:13:20.354198 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.354089 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 15:13:20.355740 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.355717 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 15:13:20.356250 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.356233 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 15:13:20.356443 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.356431 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 15:13:20.357387 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.357363 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-j2gn9\"" Apr 21 15:13:20.357493 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.357475 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 15:13:20.362116 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.362093 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 15:13:20.364050 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.364024 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b56f58d-jkshr"] Apr 21 15:13:20.420804 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.420775 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-serving-cert\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr" Apr 21 15:13:20.420986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.420820 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-config\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr" Apr 21 15:13:20.420986 ip-10-0-137-168 kubenswrapper[2544]: 
I0421 15:13:20.420848 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktzvf\" (UniqueName: \"kubernetes.io/projected/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-kube-api-access-ktzvf\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr" Apr 21 15:13:20.420986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.420868 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-trusted-ca-bundle\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr" Apr 21 15:13:20.420986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.420900 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-service-ca\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr" Apr 21 15:13:20.420986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.420937 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-oauth-serving-cert\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr" Apr 21 15:13:20.420986 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.420957 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-oauth-config\") pod \"console-77b56f58d-jkshr\" 
(UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.521527 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.521481 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-oauth-config\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.521684 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.521549 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-serving-cert\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.521684 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.521586 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-config\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.521684 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.521627 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktzvf\" (UniqueName: \"kubernetes.io/projected/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-kube-api-access-ktzvf\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.521684 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.521657 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-trusted-ca-bundle\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.521684 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.521681 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-service-ca\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.521992 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.521731 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-oauth-serving-cert\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.522557 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.522526 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-service-ca\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.522557 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.522546 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-oauth-serving-cert\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.522698 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.522577 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-config\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.522698 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.522678 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-trusted-ca-bundle\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.524186 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.524167 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-serving-cert\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.524274 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.524223 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-oauth-config\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.531004 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.530975 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktzvf\" (UniqueName: \"kubernetes.io/projected/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-kube-api-access-ktzvf\") pod \"console-77b56f58d-jkshr\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.654028 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.653939 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:20.774541 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:20.774471 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77b56f58d-jkshr"]
Apr 21 15:13:20.778154 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:13:20.778127 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod197819d1_2ff2_4a0d_a8b1_2a74d85a053e.slice/crio-c6ef084a465ff8924ff845b1b429905c4be6342a4d2b3c296d8c009601c4949a WatchSource:0}: Error finding container c6ef084a465ff8924ff845b1b429905c4be6342a4d2b3c296d8c009601c4949a: Status 404 returned error can't find the container with id c6ef084a465ff8924ff845b1b429905c4be6342a4d2b3c296d8c009601c4949a
Apr 21 15:13:21.770928 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:21.770864 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b56f58d-jkshr" event={"ID":"197819d1-2ff2-4a0d-a8b1-2a74d85a053e","Type":"ContainerStarted","Data":"c6ef084a465ff8924ff845b1b429905c4be6342a4d2b3c296d8c009601c4949a"}
Apr 21 15:13:22.883867 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:22.883817 2544 patch_prober.go:28] interesting pod/image-registry-6d985b498b-8lcdx container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 21 15:13:22.884338 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:22.883886 2544 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" podUID="f4e6e922-a37d-437a-ac4e-43b6b4333c85" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:13:23.778197 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:23.778111 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b56f58d-jkshr" event={"ID":"197819d1-2ff2-4a0d-a8b1-2a74d85a053e","Type":"ContainerStarted","Data":"3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9"}
Apr 21 15:13:23.799615 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:23.799557 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77b56f58d-jkshr" podStartSLOduration=1.181089783 podStartE2EDuration="3.799543081s" podCreationTimestamp="2026-04-21 15:13:20 +0000 UTC" firstStartedPulling="2026-04-21 15:13:20.779847548 +0000 UTC m=+165.186255786" lastFinishedPulling="2026-04-21 15:13:23.398300843 +0000 UTC m=+167.804709084" observedRunningTime="2026-04-21 15:13:23.798784878 +0000 UTC m=+168.205193138" watchObservedRunningTime="2026-04-21 15:13:23.799543081 +0000 UTC m=+168.205951341"
Apr 21 15:13:25.202474 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:25.202398 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv"
Apr 21 15:13:25.202474 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:25.202438 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q5k4h"
Apr 21 15:13:25.205959 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:25.205937 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l7xgv\""
Apr 21 15:13:25.213360 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:25.213342 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q5k4h"
Apr 21 15:13:25.336262 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:25.336150 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q5k4h"]
Apr 21 15:13:25.338666 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:13:25.338636 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f8e763_6577_4e5f_bc6c_5d7ac9e2c059.slice/crio-c149d442056819d1bda98265284c716edfa908b5b00be079404dd918b5186b54 WatchSource:0}: Error finding container c149d442056819d1bda98265284c716edfa908b5b00be079404dd918b5186b54: Status 404 returned error can't find the container with id c149d442056819d1bda98265284c716edfa908b5b00be079404dd918b5186b54
Apr 21 15:13:25.785356 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:25.785314 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q5k4h" event={"ID":"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059","Type":"ContainerStarted","Data":"c149d442056819d1bda98265284c716edfa908b5b00be079404dd918b5186b54"}
Apr 21 15:13:27.792702 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:27.792661 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q5k4h" event={"ID":"a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059","Type":"ContainerStarted","Data":"e1e3d271a3211926f69d20db5e84f6f62686f64f291ef496af2447a43ca9cbcb"}
Apr 21 15:13:27.811445 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:27.811392 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-q5k4h" podStartSLOduration=138.283429781 podStartE2EDuration="2m19.811376435s" podCreationTimestamp="2026-04-21 15:11:08 +0000 UTC" firstStartedPulling="2026-04-21 15:13:25.340649923 +0000 UTC m=+169.747058161" lastFinishedPulling="2026-04-21 15:13:26.868596576 +0000 UTC m=+171.275004815" observedRunningTime="2026-04-21 15:13:27.810398704 +0000 UTC m=+172.216806963" watchObservedRunningTime="2026-04-21 15:13:27.811376435 +0000 UTC m=+172.217784695"
Apr 21 15:13:28.766328 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:28.766292 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dwdcj"
Apr 21 15:13:30.654489 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:30.654445 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:30.655179 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:30.654502 2544 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:30.659471 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:30.659444 2544 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:30.803864 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:30.803831 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77b56f58d-jkshr"
Apr 21 15:13:32.882512 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:32.882475 2544 patch_prober.go:28] interesting pod/image-registry-6d985b498b-8lcdx container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 21 15:13:32.882935 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:32.882587 2544 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" podUID="f4e6e922-a37d-437a-ac4e-43b6b4333c85" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 15:13:33.215552 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.215454 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xcldm"]
Apr 21 15:13:33.218765 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.218724 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.221502 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.221482 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 15:13:33.222075 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.222062 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 15:13:33.222389 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.222371 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qhrt9\""
Apr 21 15:13:33.222461 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.222401 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 15:13:33.223261 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.223245 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 15:13:33.223415 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.223401 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 15:13:33.223634 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.223617 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 15:13:33.230208 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.230186 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.230307 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.230219 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-sys\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.230307 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.230287 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-metrics-client-ca\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.230419 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.230314 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzd8\" (UniqueName: \"kubernetes.io/projected/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-kube-api-access-pnzd8\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.230419 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.230341 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-accelerators-collector-config\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.230419 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.230377 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-textfile\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.230561 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.230422 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-root\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.230561 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.230442 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-wtmp\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.230561 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.230480 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-tls\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.330835 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.330798 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.330835 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.330837 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-sys\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331049 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.330867 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-metrics-client-ca\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331049 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.330886 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzd8\" (UniqueName: \"kubernetes.io/projected/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-kube-api-access-pnzd8\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331049 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.330909 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-accelerators-collector-config\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331049 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.330993 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-sys\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331049 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.330995 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-textfile\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331267 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.331077 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-root\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331267 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.331109 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-wtmp\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331267 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.331159 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-tls\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331267 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.331222 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-root\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331421 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.331283 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-wtmp\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331421 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:13:33.331302 2544 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 21 15:13:33.331421 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:13:33.331375 2544 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-tls podName:2b8e98d1-29a2-44eb-9981-0c9d473a51aa nodeName:}" failed. No retries permitted until 2026-04-21 15:13:33.831355738 +0000 UTC m=+178.237763975 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-tls") pod "node-exporter-xcldm" (UID: "2b8e98d1-29a2-44eb-9981-0c9d473a51aa") : secret "node-exporter-tls" not found
Apr 21 15:13:33.331421 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.331386 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-textfile\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331625 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.331609 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-accelerators-collector-config\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.331665 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.331623 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-metrics-client-ca\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.333383 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.333366 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.344939 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.344907 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzd8\" (UniqueName: \"kubernetes.io/projected/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-kube-api-access-pnzd8\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.834732 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.834687 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-tls\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:33.837034 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:33.837006 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b8e98d1-29a2-44eb-9981-0c9d473a51aa-node-exporter-tls\") pod \"node-exporter-xcldm\" (UID: \"2b8e98d1-29a2-44eb-9981-0c9d473a51aa\") " pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:34.127503 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:34.127406 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xcldm"
Apr 21 15:13:34.136268 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:13:34.136225 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b8e98d1_29a2_44eb_9981_0c9d473a51aa.slice/crio-7e42a71e98292f687b5aa25fbb12797128275b70c79d65e826b81dd4d0b8e850 WatchSource:0}: Error finding container 7e42a71e98292f687b5aa25fbb12797128275b70c79d65e826b81dd4d0b8e850: Status 404 returned error can't find the container with id 7e42a71e98292f687b5aa25fbb12797128275b70c79d65e826b81dd4d0b8e850
Apr 21 15:13:34.811884 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:34.811844 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xcldm" event={"ID":"2b8e98d1-29a2-44eb-9981-0c9d473a51aa","Type":"ContainerStarted","Data":"7e42a71e98292f687b5aa25fbb12797128275b70c79d65e826b81dd4d0b8e850"}
Apr 21 15:13:35.816377 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:35.816342 2544 generic.go:358] "Generic (PLEG): container finished" podID="2b8e98d1-29a2-44eb-9981-0c9d473a51aa" containerID="9d2e17b0308f12c403049391ec3348db7b98e4a434adac170d704dd6160056fc" exitCode=0
Apr 21 15:13:35.816736 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:35.816382 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xcldm" event={"ID":"2b8e98d1-29a2-44eb-9981-0c9d473a51aa","Type":"ContainerDied","Data":"9d2e17b0308f12c403049391ec3348db7b98e4a434adac170d704dd6160056fc"}
Apr 21 15:13:36.165258 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.165166 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-745cb96956-h6vct"]
Apr 21 15:13:36.167649 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.167626 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.170607 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.170582 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 21 15:13:36.170911 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.170663 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-j2pg8\""
Apr 21 15:13:36.170911 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.170703 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 21 15:13:36.171060 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.170922 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 21 15:13:36.171125 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.171082 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-boueviq4kbvm5\""
Apr 21 15:13:36.171181 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.171122 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 21 15:13:36.171883 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.171867 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 21 15:13:36.182766 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.182729 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-745cb96956-h6vct"]
Apr 21 15:13:36.251597 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.251562 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-grpc-tls\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.251597 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.251600 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99fb4416-714e-46d0-81ca-059cc9d4c30a-metrics-client-ca\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.251866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.251634 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlc5z\" (UniqueName: \"kubernetes.io/projected/99fb4416-714e-46d0-81ca-059cc9d4c30a-kube-api-access-xlc5z\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.251866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.251671 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.251866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.251702 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.251866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.251741 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.251866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.251841 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.252018 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.251869 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-tls\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.352431 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.352395 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.352600 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.352444 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.352600 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.352500 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.352600 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.352520 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-tls\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.352600 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.352543 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-grpc-tls\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.352600 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.352562 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99fb4416-714e-46d0-81ca-059cc9d4c30a-metrics-client-ca\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.352600 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.352593 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlc5z\" (UniqueName: \"kubernetes.io/projected/99fb4416-714e-46d0-81ca-059cc9d4c30a-kube-api-access-xlc5z\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.352937 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.352619 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.353831 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.353742 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99fb4416-714e-46d0-81ca-059cc9d4c30a-metrics-client-ca\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.355610 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.355581 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.355722 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.355637 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.355982 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.355962 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-tls\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.356045 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.356017 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-grpc-tls\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct"
Apr 21 15:13:36.356124 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.356104 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName:
\"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" Apr 21 15:13:36.356159 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.356115 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/99fb4416-714e-46d0-81ca-059cc9d4c30a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" Apr 21 15:13:36.364854 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.364817 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlc5z\" (UniqueName: \"kubernetes.io/projected/99fb4416-714e-46d0-81ca-059cc9d4c30a-kube-api-access-xlc5z\") pod \"thanos-querier-745cb96956-h6vct\" (UID: \"99fb4416-714e-46d0-81ca-059cc9d4c30a\") " pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" Apr 21 15:13:36.476383 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.476292 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" Apr 21 15:13:36.618117 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.618079 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-745cb96956-h6vct"] Apr 21 15:13:36.621066 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:13:36.621039 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99fb4416_714e_46d0_81ca_059cc9d4c30a.slice/crio-a2175661aac9f3f3656685c145fc6b668c9f68af495413b55f03f2d57ffe4368 WatchSource:0}: Error finding container a2175661aac9f3f3656685c145fc6b668c9f68af495413b55f03f2d57ffe4368: Status 404 returned error can't find the container with id a2175661aac9f3f3656685c145fc6b668c9f68af495413b55f03f2d57ffe4368 Apr 21 15:13:36.821915 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.821874 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xcldm" event={"ID":"2b8e98d1-29a2-44eb-9981-0c9d473a51aa","Type":"ContainerStarted","Data":"77839062be47e2904fd2baa6b6cf0b9fad62a044f197313a497a713a1938724c"} Apr 21 15:13:36.822388 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.821922 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xcldm" event={"ID":"2b8e98d1-29a2-44eb-9981-0c9d473a51aa","Type":"ContainerStarted","Data":"e392cd2dc7917c829d34a2bd0b9b68a5185b41f5c55e6dfcfec12ba962b24a27"} Apr 21 15:13:36.827051 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.823424 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" event={"ID":"99fb4416-714e-46d0-81ca-059cc9d4c30a","Type":"ContainerStarted","Data":"a2175661aac9f3f3656685c145fc6b668c9f68af495413b55f03f2d57ffe4368"} Apr 21 15:13:36.847565 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:36.847510 2544 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/node-exporter-xcldm" podStartSLOduration=3.211673962 podStartE2EDuration="3.847496072s" podCreationTimestamp="2026-04-21 15:13:33 +0000 UTC" firstStartedPulling="2026-04-21 15:13:34.138232489 +0000 UTC m=+178.544640737" lastFinishedPulling="2026-04-21 15:13:34.774054609 +0000 UTC m=+179.180462847" observedRunningTime="2026-04-21 15:13:36.846728469 +0000 UTC m=+181.253136729" watchObservedRunningTime="2026-04-21 15:13:36.847496072 +0000 UTC m=+181.253904332" Apr 21 15:13:37.897369 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:37.897304 2544 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" podUID="f4e6e922-a37d-437a-ac4e-43b6b4333c85" containerName="registry" containerID="cri-o://28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639" gracePeriod=30 Apr 21 15:13:37.945814 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:37.945776 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t"] Apr 21 15:13:37.947880 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:37.947864 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t" Apr 21 15:13:37.950958 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:37.950918 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 15:13:37.951248 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:37.951234 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-z4tsn\"" Apr 21 15:13:37.960809 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:37.960781 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t"] Apr 21 15:13:37.966587 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:37.966561 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/576f8157-2364-4155-8a62-633a5c205ce8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cjh5t\" (UID: \"576f8157-2364-4155-8a62-633a5c205ce8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t" Apr 21 15:13:38.067423 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.067386 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/576f8157-2364-4155-8a62-633a5c205ce8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cjh5t\" (UID: \"576f8157-2364-4155-8a62-633a5c205ce8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t" Apr 21 15:13:38.067568 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:13:38.067531 2544 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 15:13:38.067615 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:13:38.067602 2544 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/576f8157-2364-4155-8a62-633a5c205ce8-monitoring-plugin-cert podName:576f8157-2364-4155-8a62-633a5c205ce8 nodeName:}" failed. No retries permitted until 2026-04-21 15:13:38.567582878 +0000 UTC m=+182.973991117 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/576f8157-2364-4155-8a62-633a5c205ce8-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-cjh5t" (UID: "576f8157-2364-4155-8a62-633a5c205ce8") : secret "monitoring-plugin-cert" not found Apr 21 15:13:38.138836 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.138804 2544 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:13:38.167823 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.167717 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjj5p\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-kube-api-access-fjj5p\") pod \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " Apr 21 15:13:38.167985 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.167805 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls\") pod \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " Apr 21 15:13:38.167985 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.167876 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-image-registry-private-configuration\") pod \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " Apr 21 15:13:38.167985 
ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.167906 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-bound-sa-token\") pod \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " Apr 21 15:13:38.167985 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.167941 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-trusted-ca\") pod \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " Apr 21 15:13:38.167985 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.167966 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4e6e922-a37d-437a-ac4e-43b6b4333c85-ca-trust-extracted\") pod \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " Apr 21 15:13:38.168258 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.167990 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-certificates\") pod \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " Apr 21 15:13:38.168258 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.168013 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-installation-pull-secrets\") pod \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\" (UID: \"f4e6e922-a37d-437a-ac4e-43b6b4333c85\") " Apr 21 15:13:38.168827 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.168549 2544 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f4e6e922-a37d-437a-ac4e-43b6b4333c85" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:13:38.168827 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.168785 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f4e6e922-a37d-437a-ac4e-43b6b4333c85" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:13:38.170618 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.170590 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f4e6e922-a37d-437a-ac4e-43b6b4333c85" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:13:38.170764 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.170715 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f4e6e922-a37d-437a-ac4e-43b6b4333c85" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:13:38.170858 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.170636 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-kube-api-access-fjj5p" (OuterVolumeSpecName: "kube-api-access-fjj5p") pod "f4e6e922-a37d-437a-ac4e-43b6b4333c85" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85"). InnerVolumeSpecName "kube-api-access-fjj5p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:13:38.170982 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.170955 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f4e6e922-a37d-437a-ac4e-43b6b4333c85" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:13:38.172054 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.172026 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f4e6e922-a37d-437a-ac4e-43b6b4333c85" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:13:38.177957 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.177925 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4e6e922-a37d-437a-ac4e-43b6b4333c85-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f4e6e922-a37d-437a-ac4e-43b6b4333c85" (UID: "f4e6e922-a37d-437a-ac4e-43b6b4333c85"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:13:38.269474 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.269435 2544 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fjj5p\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-kube-api-access-fjj5p\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:13:38.269474 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.269468 2544 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-tls\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:13:38.269474 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.269478 2544 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-image-registry-private-configuration\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:13:38.269474 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.269487 2544 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4e6e922-a37d-437a-ac4e-43b6b4333c85-bound-sa-token\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:13:38.269474 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.269498 2544 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-trusted-ca\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:13:38.269474 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.269506 2544 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f4e6e922-a37d-437a-ac4e-43b6b4333c85-ca-trust-extracted\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 
15:13:38.269980 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.269514 2544 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f4e6e922-a37d-437a-ac4e-43b6b4333c85-registry-certificates\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:13:38.269980 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.269525 2544 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f4e6e922-a37d-437a-ac4e-43b6b4333c85-installation-pull-secrets\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:13:38.476070 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.475987 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-fcdccbb55-hzcjs"] Apr 21 15:13:38.476395 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.476380 2544 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4e6e922-a37d-437a-ac4e-43b6b4333c85" containerName="registry" Apr 21 15:13:38.476444 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.476400 2544 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e6e922-a37d-437a-ac4e-43b6b4333c85" containerName="registry" Apr 21 15:13:38.476502 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.476491 2544 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4e6e922-a37d-437a-ac4e-43b6b4333c85" containerName="registry" Apr 21 15:13:38.478939 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.478913 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.481872 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.481836 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 15:13:38.482704 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.482673 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 15:13:38.482847 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.482705 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 15:13:38.483031 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.483005 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 15:13:38.483122 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.483058 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-grf8t\"" Apr 21 15:13:38.483323 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.483302 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 15:13:38.490270 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.490246 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 15:13:38.493586 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.493563 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-fcdccbb55-hzcjs"] Apr 21 15:13:38.572052 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.572014 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-secret-telemeter-client\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.572234 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.572072 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.572234 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.572103 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-telemeter-client-tls\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.572234 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.572133 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/347281ad-1022-42b2-81fb-6af5bbb2235a-serving-certs-ca-bundle\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.572234 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.572193 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-federate-client-tls\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.572453 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.572293 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/347281ad-1022-42b2-81fb-6af5bbb2235a-metrics-client-ca\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.572453 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.572366 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-922k8\" (UniqueName: \"kubernetes.io/projected/347281ad-1022-42b2-81fb-6af5bbb2235a-kube-api-access-922k8\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.572453 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.572404 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/576f8157-2364-4155-8a62-633a5c205ce8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cjh5t\" (UID: \"576f8157-2364-4155-8a62-633a5c205ce8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t" Apr 21 15:13:38.572577 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.572456 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/347281ad-1022-42b2-81fb-6af5bbb2235a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") 
" pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.575234 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.575203 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/576f8157-2364-4155-8a62-633a5c205ce8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cjh5t\" (UID: \"576f8157-2364-4155-8a62-633a5c205ce8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t" Apr 21 15:13:38.673463 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.673414 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.673631 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.673476 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-telemeter-client-tls\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.673631 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.673513 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/347281ad-1022-42b2-81fb-6af5bbb2235a-serving-certs-ca-bundle\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.673631 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.673542 2544 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-federate-client-tls\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.673631 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.673570 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/347281ad-1022-42b2-81fb-6af5bbb2235a-metrics-client-ca\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.673870 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.673631 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-922k8\" (UniqueName: \"kubernetes.io/projected/347281ad-1022-42b2-81fb-6af5bbb2235a-kube-api-access-922k8\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.673870 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.673685 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/347281ad-1022-42b2-81fb-6af5bbb2235a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.673870 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.673732 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-secret-telemeter-client\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: 
\"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.674872 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.674835 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/347281ad-1022-42b2-81fb-6af5bbb2235a-serving-certs-ca-bundle\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.674872 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.674852 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/347281ad-1022-42b2-81fb-6af5bbb2235a-metrics-client-ca\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.675065 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.675049 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/347281ad-1022-42b2-81fb-6af5bbb2235a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.676626 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.676600 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-telemeter-client-tls\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.676775 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.676698 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-federate-client-tls\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.676775 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.676701 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.676851 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.676797 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/347281ad-1022-42b2-81fb-6af5bbb2235a-secret-telemeter-client\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.682629 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.682605 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-922k8\" (UniqueName: \"kubernetes.io/projected/347281ad-1022-42b2-81fb-6af5bbb2235a-kube-api-access-922k8\") pod \"telemeter-client-fcdccbb55-hzcjs\" (UID: \"347281ad-1022-42b2-81fb-6af5bbb2235a\") " pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.791897 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.791809 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" Apr 21 15:13:38.831732 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.831656 2544 generic.go:358] "Generic (PLEG): container finished" podID="f4e6e922-a37d-437a-ac4e-43b6b4333c85" containerID="28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639" exitCode=0 Apr 21 15:13:38.831882 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.831730 2544 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" Apr 21 15:13:38.831882 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.831732 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" event={"ID":"f4e6e922-a37d-437a-ac4e-43b6b4333c85","Type":"ContainerDied","Data":"28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639"} Apr 21 15:13:38.831882 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.831864 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d985b498b-8lcdx" event={"ID":"f4e6e922-a37d-437a-ac4e-43b6b4333c85","Type":"ContainerDied","Data":"8a81678837c51cd7d75cfd640087c76f1d9da3152c1159b6e8b2c45c050fa292"} Apr 21 15:13:38.832013 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.831883 2544 scope.go:117] "RemoveContainer" containerID="28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639" Apr 21 15:13:38.845607 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.845473 2544 scope.go:117] "RemoveContainer" containerID="28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639" Apr 21 15:13:38.847184 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:13:38.846910 2544 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639\": container with ID starting with 
28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639 not found: ID does not exist" containerID="28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639" Apr 21 15:13:38.847184 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.846951 2544 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639"} err="failed to get container status \"28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639\": rpc error: code = NotFound desc = could not find container \"28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639\": container with ID starting with 28ae8fff3e1c88595a44832fda2d4a8eb0058e906a628addd46229b7b0cb5639 not found: ID does not exist" Apr 21 15:13:38.854967 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.854939 2544 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d985b498b-8lcdx"] Apr 21 15:13:38.856674 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.856651 2544 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6d985b498b-8lcdx"] Apr 21 15:13:38.867227 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.867200 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t" Apr 21 15:13:38.938093 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:38.938052 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-fcdccbb55-hzcjs"] Apr 21 15:13:38.940990 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:13:38.940959 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod347281ad_1022_42b2_81fb_6af5bbb2235a.slice/crio-3efc909f116d7e6e4c42c606953c19635645f06cf1fdc1c49d0fbbdb453a88e3 WatchSource:0}: Error finding container 3efc909f116d7e6e4c42c606953c19635645f06cf1fdc1c49d0fbbdb453a88e3: Status 404 returned error can't find the container with id 3efc909f116d7e6e4c42c606953c19635645f06cf1fdc1c49d0fbbdb453a88e3 Apr 21 15:13:39.013232 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.013197 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t"] Apr 21 15:13:39.017613 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:13:39.017581 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod576f8157_2364_4155_8a62_633a5c205ce8.slice/crio-b136344fad353c61d8bf594ed275a2dbad9326bfd6a28e20401600b9230ec7a3 WatchSource:0}: Error finding container b136344fad353c61d8bf594ed275a2dbad9326bfd6a28e20401600b9230ec7a3: Status 404 returned error can't find the container with id b136344fad353c61d8bf594ed275a2dbad9326bfd6a28e20401600b9230ec7a3 Apr 21 15:13:39.652848 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.650894 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:13:39.656589 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.656560 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.659900 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.659578 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 15:13:39.659900 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.659828 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 15:13:39.660091 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.659929 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 15:13:39.660091 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.659828 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 15:13:39.660219 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.660202 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 15:13:39.660291 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.660233 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 15:13:39.660344 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.660312 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-afv3kgpo9o632\"" Apr 21 15:13:39.660462 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.660448 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 15:13:39.661302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.661096 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 15:13:39.661302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.661120 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 15:13:39.661302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.661173 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 15:13:39.661302 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.661198 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-g4brn\"" Apr 21 15:13:39.661540 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.661470 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 15:13:39.663394 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.663374 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 15:13:39.667072 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.667049 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 15:13:39.670799 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.670773 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:13:39.684183 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684135 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-config\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684354 
ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684190 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lftd9\" (UniqueName: \"kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-kube-api-access-lftd9\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684354 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684279 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684354 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684326 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684513 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684421 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684513 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684456 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684513 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684484 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684647 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684511 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-config-out\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684647 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684539 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684647 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684574 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684647 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684604 2544 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684647 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684631 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684909 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684681 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684909 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684726 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-web-config\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684909 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684775 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684909 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684804 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684909 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684843 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.684909 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.684882 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786005 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.785965 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786159 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786022 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786159 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786057 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786159 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786093 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786159 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786135 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-config\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786334 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786162 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lftd9\" (UniqueName: \"kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-kube-api-access-lftd9\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786334 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786193 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786334 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786224 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786334 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786287 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786334 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786318 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786581 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786343 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786581 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786368 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-config-out\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786581 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786390 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786581 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786426 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786581 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786453 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786581 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786479 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786581 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786540 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.786581 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.786566 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-web-config\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.788957 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.788861 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.790443 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.789978 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.790443 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.790322 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-trusted-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.791342 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.791316 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.791949 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.791928 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.792088 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.792060 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.793135 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.792595 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.793135 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.792792 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.793135 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.792804 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.793135 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.793098 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.793373 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.793225 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-config\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.793764 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.793684 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-web-config\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.793764 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.793734 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.794125 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.794106 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-config-out\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.794781 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.794596 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.795123 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.795100 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.795963 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.795922 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.800143 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.800118 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lftd9\" 
(UniqueName: \"kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-kube-api-access-lftd9\") pod \"prometheus-k8s-0\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:39.838905 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.838872 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" event={"ID":"99fb4416-714e-46d0-81ca-059cc9d4c30a","Type":"ContainerStarted","Data":"58f56246f52956ba959db2195e26fa8b048e6d95c127338606fbabf55489ae3b"} Apr 21 15:13:39.839013 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.838908 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" event={"ID":"99fb4416-714e-46d0-81ca-059cc9d4c30a","Type":"ContainerStarted","Data":"5ed74d6d1ad5e13157b40b44bf470549e86713fe87b51b6c9cc573b102e2f140"} Apr 21 15:13:39.839013 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.838918 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" event={"ID":"99fb4416-714e-46d0-81ca-059cc9d4c30a","Type":"ContainerStarted","Data":"688fae2d85482cb34944374aee3d728b6f26dba1234996ce56e2b45c0190b1c4"} Apr 21 15:13:39.839013 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.838927 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" event={"ID":"99fb4416-714e-46d0-81ca-059cc9d4c30a","Type":"ContainerStarted","Data":"2c23c83f1c0124bed2ea85b72d5445e24a7eedbf92c0d1281f250d211f82f83b"} Apr 21 15:13:39.840173 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.840146 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t" event={"ID":"576f8157-2364-4155-8a62-633a5c205ce8","Type":"ContainerStarted","Data":"b136344fad353c61d8bf594ed275a2dbad9326bfd6a28e20401600b9230ec7a3"} Apr 21 15:13:39.841439 
ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.841404 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" event={"ID":"347281ad-1022-42b2-81fb-6af5bbb2235a","Type":"ContainerStarted","Data":"3efc909f116d7e6e4c42c606953c19635645f06cf1fdc1c49d0fbbdb453a88e3"} Apr 21 15:13:39.970660 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:39.970203 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:40.206672 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:40.206595 2544 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e6e922-a37d-437a-ac4e-43b6b4333c85" path="/var/lib/kubelet/pods/f4e6e922-a37d-437a-ac4e-43b6b4333c85/volumes" Apr 21 15:13:40.471938 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:40.471814 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:13:40.475913 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:13:40.475868 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5937c160_34fd_4393_a905_3aaf252c9e2e.slice/crio-bdde25fcbf7712af5d6ab18dabba6fd1233f497b7d619425b4fd80bea189918d WatchSource:0}: Error finding container bdde25fcbf7712af5d6ab18dabba6fd1233f497b7d619425b4fd80bea189918d: Status 404 returned error can't find the container with id bdde25fcbf7712af5d6ab18dabba6fd1233f497b7d619425b4fd80bea189918d Apr 21 15:13:40.846449 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:40.846397 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerStarted","Data":"bdde25fcbf7712af5d6ab18dabba6fd1233f497b7d619425b4fd80bea189918d"} Apr 21 15:13:40.849474 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:40.849420 2544 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" event={"ID":"99fb4416-714e-46d0-81ca-059cc9d4c30a","Type":"ContainerStarted","Data":"771df0780407a5f8679534d1a4381db9524e22a6dd466c0e8ce327586338df41"} Apr 21 15:13:40.849474 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:40.849460 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" event={"ID":"99fb4416-714e-46d0-81ca-059cc9d4c30a","Type":"ContainerStarted","Data":"ee4c2dc75a998a7f1582a07df80f3290728fec804d483f01615c4276316cc512"} Apr 21 15:13:40.849684 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:40.849634 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" Apr 21 15:13:40.851039 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:40.851013 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t" event={"ID":"576f8157-2364-4155-8a62-633a5c205ce8","Type":"ContainerStarted","Data":"766292220bfeb4b4291c71d8c5320fadce88706fbcfab54b195ae645f0b61e5a"} Apr 21 15:13:40.851246 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:40.851225 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t" Apr 21 15:13:40.857777 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:40.857737 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t" Apr 21 15:13:40.885895 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:40.885829 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" podStartSLOduration=1.768920123 podStartE2EDuration="4.885810734s" podCreationTimestamp="2026-04-21 15:13:36 +0000 UTC" firstStartedPulling="2026-04-21 15:13:36.623049774 +0000 UTC m=+181.029458012" 
lastFinishedPulling="2026-04-21 15:13:39.739940367 +0000 UTC m=+184.146348623" observedRunningTime="2026-04-21 15:13:40.884152596 +0000 UTC m=+185.290560858" watchObservedRunningTime="2026-04-21 15:13:40.885810734 +0000 UTC m=+185.292218996" Apr 21 15:13:40.924665 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:40.924604 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cjh5t" podStartSLOduration=2.60936383 podStartE2EDuration="3.924589108s" podCreationTimestamp="2026-04-21 15:13:37 +0000 UTC" firstStartedPulling="2026-04-21 15:13:39.019705586 +0000 UTC m=+183.426113824" lastFinishedPulling="2026-04-21 15:13:40.334930858 +0000 UTC m=+184.741339102" observedRunningTime="2026-04-21 15:13:40.922642956 +0000 UTC m=+185.329051216" watchObservedRunningTime="2026-04-21 15:13:40.924589108 +0000 UTC m=+185.330997367" Apr 21 15:13:41.856538 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:41.856480 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" event={"ID":"347281ad-1022-42b2-81fb-6af5bbb2235a","Type":"ContainerStarted","Data":"22e2e0cf36478ccf2847d2256b091e81c294dbc156487cdb19e4e4e316fad101"} Apr 21 15:13:41.856538 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:41.856543 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" event={"ID":"347281ad-1022-42b2-81fb-6af5bbb2235a","Type":"ContainerStarted","Data":"77178db1d3a6c2c56745014cdbdcba63cb012a2ad936b402d16a508db204e046"} Apr 21 15:13:41.857058 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:41.856559 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" event={"ID":"347281ad-1022-42b2-81fb-6af5bbb2235a","Type":"ContainerStarted","Data":"1fcf4ef9c703362f46259af34a15b67e432fae0e03c3547fca167fa9e4fdd1ba"} Apr 21 15:13:41.857957 ip-10-0-137-168 
kubenswrapper[2544]: I0421 15:13:41.857886 2544 generic.go:358] "Generic (PLEG): container finished" podID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerID="8dff275172d7191792d3893eaa8f5a2548b6d616c100cfd979aac6f5bc453bd3" exitCode=0 Apr 21 15:13:41.858041 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:41.857972 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerDied","Data":"8dff275172d7191792d3893eaa8f5a2548b6d616c100cfd979aac6f5bc453bd3"} Apr 21 15:13:41.884557 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:41.884500 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-fcdccbb55-hzcjs" podStartSLOduration=1.213340308 podStartE2EDuration="3.884484083s" podCreationTimestamp="2026-04-21 15:13:38 +0000 UTC" firstStartedPulling="2026-04-21 15:13:38.943500303 +0000 UTC m=+183.349908541" lastFinishedPulling="2026-04-21 15:13:41.614644063 +0000 UTC m=+186.021052316" observedRunningTime="2026-04-21 15:13:41.881880141 +0000 UTC m=+186.288288400" watchObservedRunningTime="2026-04-21 15:13:41.884484083 +0000 UTC m=+186.290892342" Apr 21 15:13:44.870658 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:44.870618 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerStarted","Data":"babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668"} Apr 21 15:13:44.871071 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:44.870666 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerStarted","Data":"84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf"} Apr 21 15:13:45.876638 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:45.876602 2544 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerStarted","Data":"d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6"} Apr 21 15:13:45.876638 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:45.876640 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerStarted","Data":"3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10"} Apr 21 15:13:45.877058 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:45.876652 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerStarted","Data":"b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd"} Apr 21 15:13:45.877058 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:45.876661 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerStarted","Data":"03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568"} Apr 21 15:13:45.913184 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:45.913126 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.695849622 podStartE2EDuration="6.91310934s" podCreationTimestamp="2026-04-21 15:13:39 +0000 UTC" firstStartedPulling="2026-04-21 15:13:40.478160591 +0000 UTC m=+184.884568828" lastFinishedPulling="2026-04-21 15:13:44.695420305 +0000 UTC m=+189.101828546" observedRunningTime="2026-04-21 15:13:45.911386831 +0000 UTC m=+190.317795092" watchObservedRunningTime="2026-04-21 15:13:45.91310934 +0000 UTC m=+190.319517599" Apr 21 15:13:46.865802 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:46.865772 2544 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-745cb96956-h6vct" Apr 21 15:13:49.971498 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:49.971446 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:13:54.706739 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:54.706693 2544 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77b56f58d-jkshr"] Apr 21 15:13:54.907385 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:54.907289 2544 generic.go:358] "Generic (PLEG): container finished" podID="f2e49b4d-d05c-4693-9ac9-190627c56f55" containerID="3fc6a4e059e9a531a438f55d4f68ba89e43c66024c8f7e129903f5bc50d9bac6" exitCode=0 Apr 21 15:13:54.907385 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:54.907348 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-n8zqd" event={"ID":"f2e49b4d-d05c-4693-9ac9-190627c56f55","Type":"ContainerDied","Data":"3fc6a4e059e9a531a438f55d4f68ba89e43c66024c8f7e129903f5bc50d9bac6"} Apr 21 15:13:54.907747 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:54.907728 2544 scope.go:117] "RemoveContainer" containerID="3fc6a4e059e9a531a438f55d4f68ba89e43c66024c8f7e129903f5bc50d9bac6" Apr 21 15:13:55.913212 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:13:55.913169 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-n8zqd" event={"ID":"f2e49b4d-d05c-4693-9ac9-190627c56f55","Type":"ContainerStarted","Data":"c7a0d76d5f01fc36c3400f83078ee97878a00db119be2810b3db6b8a929763ab"} Apr 21 15:14:19.726145 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:19.726078 2544 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77b56f58d-jkshr" podUID="197819d1-2ff2-4a0d-a8b1-2a74d85a053e" containerName="console" 
containerID="cri-o://3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9" gracePeriod=15 Apr 21 15:14:19.971119 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:19.971090 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77b56f58d-jkshr_197819d1-2ff2-4a0d-a8b1-2a74d85a053e/console/0.log" Apr 21 15:14:19.971272 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:19.971163 2544 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77b56f58d-jkshr" Apr 21 15:14:19.983677 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:19.983598 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77b56f58d-jkshr_197819d1-2ff2-4a0d-a8b1-2a74d85a053e/console/0.log" Apr 21 15:14:19.983677 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:19.983639 2544 generic.go:358] "Generic (PLEG): container finished" podID="197819d1-2ff2-4a0d-a8b1-2a74d85a053e" containerID="3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9" exitCode=2 Apr 21 15:14:19.983904 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:19.983718 2544 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77b56f58d-jkshr" Apr 21 15:14:19.983904 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:19.983721 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b56f58d-jkshr" event={"ID":"197819d1-2ff2-4a0d-a8b1-2a74d85a053e","Type":"ContainerDied","Data":"3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9"} Apr 21 15:14:19.983904 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:19.983784 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77b56f58d-jkshr" event={"ID":"197819d1-2ff2-4a0d-a8b1-2a74d85a053e","Type":"ContainerDied","Data":"c6ef084a465ff8924ff845b1b429905c4be6342a4d2b3c296d8c009601c4949a"} Apr 21 15:14:19.983904 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:19.983803 2544 scope.go:117] "RemoveContainer" containerID="3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9" Apr 21 15:14:19.991578 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:19.991546 2544 scope.go:117] "RemoveContainer" containerID="3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9" Apr 21 15:14:19.992140 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:14:19.992112 2544 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9\": container with ID starting with 3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9 not found: ID does not exist" containerID="3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9" Apr 21 15:14:19.992237 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:19.992150 2544 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9"} err="failed to get container status \"3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9\": rpc error: code = 
NotFound desc = could not find container \"3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9\": container with ID starting with 3ffbfdaa25b2caa011e68f81dd18f78dc8049b1415f6b728fb5985f2b0911ba9 not found: ID does not exist" Apr 21 15:14:20.061546 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.061495 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-oauth-serving-cert\") pod \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " Apr 21 15:14:20.061546 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.061562 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-serving-cert\") pod \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " Apr 21 15:14:20.061866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.061615 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-trusted-ca-bundle\") pod \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " Apr 21 15:14:20.061866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.061657 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-oauth-config\") pod \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " Apr 21 15:14:20.061866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.061696 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktzvf\" (UniqueName: 
\"kubernetes.io/projected/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-kube-api-access-ktzvf\") pod \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " Apr 21 15:14:20.061866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.061726 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-config\") pod \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " Apr 21 15:14:20.061866 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.061789 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-service-ca\") pod \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\" (UID: \"197819d1-2ff2-4a0d-a8b1-2a74d85a053e\") " Apr 21 15:14:20.062205 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.062165 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-config" (OuterVolumeSpecName: "console-config") pod "197819d1-2ff2-4a0d-a8b1-2a74d85a053e" (UID: "197819d1-2ff2-4a0d-a8b1-2a74d85a053e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:14:20.062287 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.062200 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "197819d1-2ff2-4a0d-a8b1-2a74d85a053e" (UID: "197819d1-2ff2-4a0d-a8b1-2a74d85a053e"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:14:20.062346 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.062294 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-service-ca" (OuterVolumeSpecName: "service-ca") pod "197819d1-2ff2-4a0d-a8b1-2a74d85a053e" (UID: "197819d1-2ff2-4a0d-a8b1-2a74d85a053e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:14:20.062346 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.062300 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "197819d1-2ff2-4a0d-a8b1-2a74d85a053e" (UID: "197819d1-2ff2-4a0d-a8b1-2a74d85a053e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:14:20.063947 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.063919 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "197819d1-2ff2-4a0d-a8b1-2a74d85a053e" (UID: "197819d1-2ff2-4a0d-a8b1-2a74d85a053e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:14:20.064066 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.063972 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "197819d1-2ff2-4a0d-a8b1-2a74d85a053e" (UID: "197819d1-2ff2-4a0d-a8b1-2a74d85a053e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:14:20.064125 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.064080 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-kube-api-access-ktzvf" (OuterVolumeSpecName: "kube-api-access-ktzvf") pod "197819d1-2ff2-4a0d-a8b1-2a74d85a053e" (UID: "197819d1-2ff2-4a0d-a8b1-2a74d85a053e"). InnerVolumeSpecName "kube-api-access-ktzvf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:14:20.162883 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.162827 2544 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-service-ca\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:14:20.162883 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.162874 2544 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-oauth-serving-cert\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:14:20.162883 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.162885 2544 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-serving-cert\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:14:20.162883 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.162897 2544 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-trusted-ca-bundle\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:14:20.163151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.162906 2544 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-oauth-config\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:14:20.163151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.162916 2544 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ktzvf\" (UniqueName: \"kubernetes.io/projected/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-kube-api-access-ktzvf\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:14:20.163151 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.162927 2544 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/197819d1-2ff2-4a0d-a8b1-2a74d85a053e-console-config\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:14:20.305680 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.305647 2544 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77b56f58d-jkshr"] Apr 21 15:14:20.314879 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:20.314848 2544 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77b56f58d-jkshr"] Apr 21 15:14:22.205887 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:22.205849 2544 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="197819d1-2ff2-4a0d-a8b1-2a74d85a053e" path="/var/lib/kubelet/pods/197819d1-2ff2-4a0d-a8b1-2a74d85a053e/volumes" Apr 21 15:14:39.970973 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:39.970927 2544 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:14:39.987374 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:39.987347 2544 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:14:40.054299 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:40.054269 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:14:46.997400 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:46.997339 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:14:46.999700 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:46.999675 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428-metrics-certs\") pod \"network-metrics-daemon-84hkv\" (UID: \"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428\") " pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:14:47.106221 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:47.106178 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zqtzq\"" Apr 21 15:14:47.113330 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:47.113302 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-84hkv" Apr 21 15:14:47.248743 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:47.248714 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-84hkv"] Apr 21 15:14:47.250639 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:14:47.250605 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ed47f4b_eaf7_4ffc_8127_1dcfeabe3428.slice/crio-fec7644b969f8a99457603541bb516d68702793aba1b450127e1cc1f923dbfff WatchSource:0}: Error finding container fec7644b969f8a99457603541bb516d68702793aba1b450127e1cc1f923dbfff: Status 404 returned error can't find the container with id fec7644b969f8a99457603541bb516d68702793aba1b450127e1cc1f923dbfff Apr 21 15:14:48.062498 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:48.062464 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-84hkv" event={"ID":"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428","Type":"ContainerStarted","Data":"fec7644b969f8a99457603541bb516d68702793aba1b450127e1cc1f923dbfff"} Apr 21 15:14:49.066868 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:49.066831 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-84hkv" event={"ID":"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428","Type":"ContainerStarted","Data":"b10b7f4464df6f9a9e74705bca8fe3b8644926e5a440fd8959ebb9874259da0e"} Apr 21 15:14:49.066868 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:49.066867 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-84hkv" event={"ID":"3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428","Type":"ContainerStarted","Data":"be998c11a146aad97163476973354307227748331110e52ae75e7f673102e174"} Apr 21 15:14:49.100774 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:49.100703 2544 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-84hkv" podStartSLOduration=252.131450766 podStartE2EDuration="4m13.100686621s" podCreationTimestamp="2026-04-21 15:10:36 +0000 UTC" firstStartedPulling="2026-04-21 15:14:47.252558421 +0000 UTC m=+251.658966659" lastFinishedPulling="2026-04-21 15:14:48.221794273 +0000 UTC m=+252.628202514" observedRunningTime="2026-04-21 15:14:49.099587978 +0000 UTC m=+253.505996248" watchObservedRunningTime="2026-04-21 15:14:49.100686621 +0000 UTC m=+253.507094881" Apr 21 15:14:58.062963 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:58.062870 2544 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:14:58.063523 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:58.063488 2544 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="prometheus" containerID="cri-o://84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf" gracePeriod=600 Apr 21 15:14:58.063685 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:58.063515 2544 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy-web" containerID="cri-o://b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd" gracePeriod=600 Apr 21 15:14:58.063685 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:58.063516 2544 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="thanos-sidecar" containerID="cri-o://03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568" gracePeriod=600 Apr 21 15:14:58.063685 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:58.063553 2544 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="config-reloader" containerID="cri-o://babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668" gracePeriod=600 Apr 21 15:14:58.063685 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:58.063617 2544 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy-thanos" containerID="cri-o://d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6" gracePeriod=600 Apr 21 15:14:58.063685 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:58.063519 2544 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy" containerID="cri-o://3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10" gracePeriod=600 Apr 21 15:14:59.097842 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:59.097806 2544 generic.go:358] "Generic (PLEG): container finished" podID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerID="b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd" exitCode=0 Apr 21 15:14:59.097842 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:59.097838 2544 generic.go:358] "Generic (PLEG): container finished" podID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerID="03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568" exitCode=0 Apr 21 15:14:59.097842 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:59.097845 2544 generic.go:358] "Generic (PLEG): container finished" podID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerID="babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668" exitCode=0 Apr 21 15:14:59.098274 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:59.097846 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerDied","Data":"b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd"} Apr 21 15:14:59.098274 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:59.097879 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerDied","Data":"03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568"} Apr 21 15:14:59.098274 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:14:59.097889 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerDied","Data":"babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668"} Apr 21 15:15:00.041249 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:15:00.041204 2544 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Apr 21 15:15:00.042155 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:15:00.042122 2544 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else 
exit 1; fi"] Apr 21 15:15:00.042985 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:15:00.042953 2544 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Apr 21 15:15:00.043041 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:15:00.042999 2544 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="prometheus" probeResult="unknown" Apr 21 15:15:00.104445 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:00.104410 2544 generic.go:358] "Generic (PLEG): container finished" podID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerID="d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6" exitCode=0 Apr 21 15:15:00.104445 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:00.104438 2544 generic.go:358] "Generic (PLEG): container finished" podID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerID="3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10" exitCode=0 Apr 21 15:15:00.104859 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:00.104482 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerDied","Data":"d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6"} Apr 21 15:15:00.104859 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:00.104514 2544 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerDied","Data":"3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10"} Apr 21 15:15:04.611544 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.611516 2544 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:04.649432 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649390 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lftd9\" (UniqueName: \"kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-kube-api-access-lftd9\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.649432 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649435 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-rulefiles-0\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.649663 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649455 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-thanos-prometheus-http-client-file\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.649663 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649473 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.649663 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649500 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-db\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.649663 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649519 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-metrics-client-certs\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.649663 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649535 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-grpc-tls\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.649663 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649561 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-kube-rbac-proxy\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.649663 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649591 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-config-out\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 
15:15:04.649663 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649618 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-serving-certs-ca-bundle\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.649663 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649642 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-tls\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.650081 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649671 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-trusted-ca-bundle\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.650081 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649693 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-metrics-client-ca\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.650081 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649716 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: 
\"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.650081 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649780 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-kubelet-serving-ca-bundle\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.650081 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649812 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-tls-assets\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.650081 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649843 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-config\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.650081 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.649865 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-web-config\") pod \"5937c160-34fd-4393-a905-3aaf252c9e2e\" (UID: \"5937c160-34fd-4393-a905-3aaf252c9e2e\") " Apr 21 15:15:04.650778 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.650444 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). 
InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:15:04.650902 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.650778 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:15:04.651627 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.651576 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:15:04.652571 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.652538 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:15:04.652886 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.652854 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). 
InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:15:04.653202 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.652994 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:15:04.653450 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.653394 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:15:04.653450 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.653399 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-kube-api-access-lftd9" (OuterVolumeSpecName: "kube-api-access-lftd9") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "kube-api-access-lftd9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:15:04.653450 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.653427 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). 
InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:15:04.653941 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.653902 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:15:04.654057 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.654009 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:15:04.654057 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.654030 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-config-out" (OuterVolumeSpecName: "config-out") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:15:04.655020 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.654989 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:15:04.655117 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.654997 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:15:04.655655 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.655632 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:15:04.655764 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.655662 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-config" (OuterVolumeSpecName: "config") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:15:04.656297 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.656272 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:15:04.663886 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.663852 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-web-config" (OuterVolumeSpecName: "web-config") pod "5937c160-34fd-4393-a905-3aaf252c9e2e" (UID: "5937c160-34fd-4393-a905-3aaf252c9e2e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:15:04.750480 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750382 2544 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750480 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750421 2544 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-thanos-prometheus-http-client-file\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750480 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750432 2544 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-kube-rbac-proxy-web\") on node 
\"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750480 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750442 2544 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-k8s-db\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750480 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750453 2544 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-metrics-client-certs\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750480 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750461 2544 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-grpc-tls\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750480 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750470 2544 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-kube-rbac-proxy\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750480 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750479 2544 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5937c160-34fd-4393-a905-3aaf252c9e2e-config-out\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750480 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750487 2544 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 
15:15:04.750908 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750498 2544 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-tls\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750908 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750506 2544 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-prometheus-trusted-ca-bundle\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750908 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750516 2544 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-metrics-client-ca\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750908 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750524 2544 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750908 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750534 2544 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5937c160-34fd-4393-a905-3aaf252c9e2e-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750908 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750542 2544 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-tls-assets\") on node 
\"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750908 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750551 2544 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-config\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750908 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750558 2544 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5937c160-34fd-4393-a905-3aaf252c9e2e-web-config\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:04.750908 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:04.750567 2544 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lftd9\" (UniqueName: \"kubernetes.io/projected/5937c160-34fd-4393-a905-3aaf252c9e2e-kube-api-access-lftd9\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\"" Apr 21 15:15:05.122468 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.122437 2544 generic.go:358] "Generic (PLEG): container finished" podID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerID="84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf" exitCode=0 Apr 21 15:15:05.122651 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.122530 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerDied","Data":"84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf"} Apr 21 15:15:05.122651 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.122574 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5937c160-34fd-4393-a905-3aaf252c9e2e","Type":"ContainerDied","Data":"bdde25fcbf7712af5d6ab18dabba6fd1233f497b7d619425b4fd80bea189918d"} Apr 21 15:15:05.122651 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.122599 2544 
scope.go:117] "RemoveContainer" containerID="d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6" Apr 21 15:15:05.122651 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.122542 2544 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.135713 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.135681 2544 scope.go:117] "RemoveContainer" containerID="3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10" Apr 21 15:15:05.143299 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.143281 2544 scope.go:117] "RemoveContainer" containerID="b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd" Apr 21 15:15:05.149991 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.149970 2544 scope.go:117] "RemoveContainer" containerID="03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568" Apr 21 15:15:05.155210 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.155185 2544 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:15:05.158364 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.158340 2544 scope.go:117] "RemoveContainer" containerID="babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668" Apr 21 15:15:05.163724 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.163698 2544 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:15:05.165665 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.165649 2544 scope.go:117] "RemoveContainer" containerID="84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf" Apr 21 15:15:05.172995 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.172969 2544 scope.go:117] "RemoveContainer" containerID="8dff275172d7191792d3893eaa8f5a2548b6d616c100cfd979aac6f5bc453bd3" Apr 21 15:15:05.179699 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.179675 2544 scope.go:117] "RemoveContainer" 
containerID="d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6" Apr 21 15:15:05.180015 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:15:05.179995 2544 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6\": container with ID starting with d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6 not found: ID does not exist" containerID="d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6" Apr 21 15:15:05.180080 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.180027 2544 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6"} err="failed to get container status \"d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6\": rpc error: code = NotFound desc = could not find container \"d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6\": container with ID starting with d7b715019f095b4abe03bbb106249af7e916b07acc487b982abefc061026e9a6 not found: ID does not exist" Apr 21 15:15:05.180080 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.180050 2544 scope.go:117] "RemoveContainer" containerID="3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10" Apr 21 15:15:05.180298 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:15:05.180285 2544 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10\": container with ID starting with 3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10 not found: ID does not exist" containerID="3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10" Apr 21 15:15:05.180340 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.180300 2544 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10"} err="failed to get container status \"3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10\": rpc error: code = NotFound desc = could not find container \"3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10\": container with ID starting with 3f3661eec5fdf9b7943ed3111503fc8d2b2c426b8775eb85e532b88408323e10 not found: ID does not exist" Apr 21 15:15:05.180340 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.180312 2544 scope.go:117] "RemoveContainer" containerID="b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd" Apr 21 15:15:05.180493 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:15:05.180480 2544 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd\": container with ID starting with b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd not found: ID does not exist" containerID="b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd" Apr 21 15:15:05.180529 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.180495 2544 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd"} err="failed to get container status \"b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd\": rpc error: code = NotFound desc = could not find container \"b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd\": container with ID starting with b05b39b54db8279f2f78b86e2dbc621c8cd598dbca524b8dba22875d7bb7d3cd not found: ID does not exist" Apr 21 15:15:05.180529 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.180507 2544 scope.go:117] "RemoveContainer" containerID="03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568" Apr 21 15:15:05.180733 
ip-10-0-137-168 kubenswrapper[2544]: E0421 15:15:05.180713 2544 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568\": container with ID starting with 03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568 not found: ID does not exist" containerID="03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568" Apr 21 15:15:05.180870 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.180742 2544 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568"} err="failed to get container status \"03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568\": rpc error: code = NotFound desc = could not find container \"03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568\": container with ID starting with 03954d024714e1e19b6978818ee66216657e3e33dde234c81b2710486cd4f568 not found: ID does not exist" Apr 21 15:15:05.180870 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.180776 2544 scope.go:117] "RemoveContainer" containerID="babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668" Apr 21 15:15:05.180993 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:15:05.180975 2544 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668\": container with ID starting with babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668 not found: ID does not exist" containerID="babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668" Apr 21 15:15:05.181030 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.180998 2544 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668"} err="failed to get container status \"babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668\": rpc error: code = NotFound desc = could not find container \"babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668\": container with ID starting with babd3f7738f5c3ff01ee77e6caacd78041d708513d6e1a33d8cc059366c0d668 not found: ID does not exist" Apr 21 15:15:05.181030 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.181011 2544 scope.go:117] "RemoveContainer" containerID="84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf" Apr 21 15:15:05.181215 ip-10-0-137-168 kubenswrapper[2544]: E0421 15:15:05.181194 2544 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf\": container with ID starting with 84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf not found: ID does not exist" containerID="84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf" Apr 21 15:15:05.181278 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.181225 2544 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf"} err="failed to get container status \"84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf\": rpc error: code = NotFound desc = could not find container \"84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf\": container with ID starting with 84268ed1c689a9b5da25fe12149f07fc8903ce2ba707e957bbd8e70215956fcf not found: ID does not exist" Apr 21 15:15:05.181278 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.181246 2544 scope.go:117] "RemoveContainer" containerID="8dff275172d7191792d3893eaa8f5a2548b6d616c100cfd979aac6f5bc453bd3" Apr 21 15:15:05.181475 ip-10-0-137-168 
kubenswrapper[2544]: E0421 15:15:05.181444 2544 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dff275172d7191792d3893eaa8f5a2548b6d616c100cfd979aac6f5bc453bd3\": container with ID starting with 8dff275172d7191792d3893eaa8f5a2548b6d616c100cfd979aac6f5bc453bd3 not found: ID does not exist" containerID="8dff275172d7191792d3893eaa8f5a2548b6d616c100cfd979aac6f5bc453bd3" Apr 21 15:15:05.181475 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.181466 2544 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dff275172d7191792d3893eaa8f5a2548b6d616c100cfd979aac6f5bc453bd3"} err="failed to get container status \"8dff275172d7191792d3893eaa8f5a2548b6d616c100cfd979aac6f5bc453bd3\": rpc error: code = NotFound desc = could not find container \"8dff275172d7191792d3893eaa8f5a2548b6d616c100cfd979aac6f5bc453bd3\": container with ID starting with 8dff275172d7191792d3893eaa8f5a2548b6d616c100cfd979aac6f5bc453bd3 not found: ID does not exist" Apr 21 15:15:05.203525 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203492 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:15:05.203837 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203824 2544 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="config-reloader" Apr 21 15:15:05.203894 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203840 2544 state_mem.go:107] "Deleted CPUSet assignment" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="config-reloader" Apr 21 15:15:05.203894 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203851 2544 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="init-config-reloader" Apr 21 15:15:05.203894 ip-10-0-137-168 kubenswrapper[2544]: I0421 
15:15:05.203856 2544 state_mem.go:107] "Deleted CPUSet assignment" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="init-config-reloader" Apr 21 15:15:05.203894 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203862 2544 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="thanos-sidecar" Apr 21 15:15:05.203894 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203869 2544 state_mem.go:107] "Deleted CPUSet assignment" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="thanos-sidecar" Apr 21 15:15:05.203894 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203894 2544 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy-thanos" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203900 2544 state_mem.go:107] "Deleted CPUSet assignment" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy-thanos" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203912 2544 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="197819d1-2ff2-4a0d-a8b1-2a74d85a053e" containerName="console" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203917 2544 state_mem.go:107] "Deleted CPUSet assignment" podUID="197819d1-2ff2-4a0d-a8b1-2a74d85a053e" containerName="console" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203924 2544 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy-web" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203929 2544 state_mem.go:107] "Deleted CPUSet assignment" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy-web" Apr 21 15:15:05.204069 ip-10-0-137-168 
kubenswrapper[2544]: I0421 15:15:05.203937 2544 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="prometheus" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203943 2544 state_mem.go:107] "Deleted CPUSet assignment" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="prometheus" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203952 2544 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.203958 2544 state_mem.go:107] "Deleted CPUSet assignment" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.204001 2544 memory_manager.go:356] "RemoveStaleState removing state" podUID="197819d1-2ff2-4a0d-a8b1-2a74d85a053e" containerName="console" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.204009 2544 memory_manager.go:356] "RemoveStaleState removing state" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="config-reloader" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.204016 2544 memory_manager.go:356] "RemoveStaleState removing state" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy-web" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.204024 2544 memory_manager.go:356] "RemoveStaleState removing state" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="prometheus" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.204030 2544 memory_manager.go:356] "RemoveStaleState removing state" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="thanos-sidecar" Apr 21 15:15:05.204069 ip-10-0-137-168 
kubenswrapper[2544]: I0421 15:15:05.204036 2544 memory_manager.go:356] "RemoveStaleState removing state" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy" Apr 21 15:15:05.204069 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.204043 2544 memory_manager.go:356] "RemoveStaleState removing state" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" containerName="kube-rbac-proxy-thanos" Apr 21 15:15:05.209512 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.209488 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.216047 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.216017 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 15:15:05.216241 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.216224 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 15:15:05.216456 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.216435 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 15:15:05.217527 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.217498 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-g4brn\"" Apr 21 15:15:05.217609 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.217585 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 15:15:05.217609 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.217586 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 15:15:05.217721 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.217628 2544 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 15:15:05.217721 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.217698 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 15:15:05.217938 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.217924 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 15:15:05.217983 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.217931 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 15:15:05.217983 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.217964 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-afv3kgpo9o632\"" Apr 21 15:15:05.218218 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.218179 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 15:15:05.219653 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.219633 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 15:15:05.219826 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.219776 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 15:15:05.223829 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.223808 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 15:15:05.236298 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.236267 2544 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:15:05.254081 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254042 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dba9102-9c62-4c6e-92db-d441db822372-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254081 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254083 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254297 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254145 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254297 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254163 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254297 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254187 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-config\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254297 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254203 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254297 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254253 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254297 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254286 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmsnw\" (UniqueName: \"kubernetes.io/projected/3dba9102-9c62-4c6e-92db-d441db822372-kube-api-access-rmsnw\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254553 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254311 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dba9102-9c62-4c6e-92db-d441db822372-config-out\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254553 ip-10-0-137-168 
kubenswrapper[2544]: I0421 15:15:05.254332 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254553 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254351 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254553 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254426 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254553 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254473 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-web-config\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254553 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254496 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/3dba9102-9c62-4c6e-92db-d441db822372-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254553 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254518 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254553 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254538 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254829 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254564 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.254829 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.254592 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.355694 ip-10-0-137-168 kubenswrapper[2544]: 
I0421 15:15:05.355650 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3dba9102-9c62-4c6e-92db-d441db822372-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.355694 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.355691 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.355948 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.355717 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.355948 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.355782 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.355948 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.355912 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 21 15:15:05.356105 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.355966 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dba9102-9c62-4c6e-92db-d441db822372-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.356105 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.355995 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.356196 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356134 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.356196 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356177 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.356295 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356232 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-config\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.356295 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356240 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3dba9102-9c62-4c6e-92db-d441db822372-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.356295 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356254 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.358039 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356557 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.358039 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356609 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmsnw\" (UniqueName: \"kubernetes.io/projected/3dba9102-9c62-4c6e-92db-d441db822372-kube-api-access-rmsnw\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.358039 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356650 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dba9102-9c62-4c6e-92db-d441db822372-config-out\") pod 
\"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.358039 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356686 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.358039 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356722 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.358039 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356775 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.358039 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356807 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-web-config\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.358039 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.356878 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.358039 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.357719 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.358939 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.358908 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.359029 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.358964 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.359204 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.359178 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.359667 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.359318 2544 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dba9102-9c62-4c6e-92db-d441db822372-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.359667 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.359514 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-config\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.359667 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.359659 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.360094 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.360069 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.360202 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.360171 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3dba9102-9c62-4c6e-92db-d441db822372-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.360448 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.360425 2544 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-web-config\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.361186 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.361165 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.361520 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.361503 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.361623 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.361593 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.361701 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.361643 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3dba9102-9c62-4c6e-92db-d441db822372-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.361742 
ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.361722 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dba9102-9c62-4c6e-92db-d441db822372-config-out\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.371343 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.371314 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmsnw\" (UniqueName: \"kubernetes.io/projected/3dba9102-9c62-4c6e-92db-d441db822372-kube-api-access-rmsnw\") pod \"prometheus-k8s-0\" (UID: \"3dba9102-9c62-4c6e-92db-d441db822372\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.519474 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.519431 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:05.688097 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:05.688056 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:15:05.695560 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:15:05.695515 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dba9102_9c62_4c6e_92db_d441db822372.slice/crio-153b5f3f3c44fc49dc52d51977e3d5122b7cfb852d0f8ef678e91988f7aeed89 WatchSource:0}: Error finding container 153b5f3f3c44fc49dc52d51977e3d5122b7cfb852d0f8ef678e91988f7aeed89: Status 404 returned error can't find the container with id 153b5f3f3c44fc49dc52d51977e3d5122b7cfb852d0f8ef678e91988f7aeed89 Apr 21 15:15:06.127445 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:06.127405 2544 generic.go:358] "Generic (PLEG): container finished" podID="3dba9102-9c62-4c6e-92db-d441db822372" containerID="4d4545df7f4b9ce118d322b9b01909be8c8bdc762cd0e94735add5c396cf61b4" exitCode=0 Apr 21 15:15:06.127633 
ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:06.127496 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dba9102-9c62-4c6e-92db-d441db822372","Type":"ContainerDied","Data":"4d4545df7f4b9ce118d322b9b01909be8c8bdc762cd0e94735add5c396cf61b4"} Apr 21 15:15:06.127633 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:06.127538 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dba9102-9c62-4c6e-92db-d441db822372","Type":"ContainerStarted","Data":"153b5f3f3c44fc49dc52d51977e3d5122b7cfb852d0f8ef678e91988f7aeed89"} Apr 21 15:15:06.211044 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:06.208190 2544 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5937c160-34fd-4393-a905-3aaf252c9e2e" path="/var/lib/kubelet/pods/5937c160-34fd-4393-a905-3aaf252c9e2e/volumes" Apr 21 15:15:07.133613 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:07.133574 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dba9102-9c62-4c6e-92db-d441db822372","Type":"ContainerStarted","Data":"5a0d36a327ac340f16ee1c2562a2702a2a4c04a119c9a13cb411d5727d29b10c"} Apr 21 15:15:07.133613 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:07.133616 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dba9102-9c62-4c6e-92db-d441db822372","Type":"ContainerStarted","Data":"15808e511549a5a24e1225aa30efdb795c784ed45dc6451172808475cf484a1b"} Apr 21 15:15:07.134068 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:07.133634 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dba9102-9c62-4c6e-92db-d441db822372","Type":"ContainerStarted","Data":"1397f218d1a82eb9e224dcb68e6f85a4aeaca525cdce16b0f977bc88fc6975e3"} Apr 21 15:15:07.134068 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:07.133667 2544 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dba9102-9c62-4c6e-92db-d441db822372","Type":"ContainerStarted","Data":"1c9c914058475f5962dd22737c302129bf856c6af0880a404a3ef18e57becd9b"} Apr 21 15:15:07.134068 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:07.133680 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dba9102-9c62-4c6e-92db-d441db822372","Type":"ContainerStarted","Data":"f194f4c9fa0cc451a534647fcdcf1f66a532e2fdb4d2ca625c72ce9999c53569"} Apr 21 15:15:07.134068 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:07.133691 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dba9102-9c62-4c6e-92db-d441db822372","Type":"ContainerStarted","Data":"57dd35ef10901e49d6b0dea056fd23707614e1b93eb734efcebf714e79a4d66d"} Apr 21 15:15:07.180785 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:07.180701 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.180681278 podStartE2EDuration="2.180681278s" podCreationTimestamp="2026-04-21 15:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:15:07.178868657 +0000 UTC m=+271.585276931" watchObservedRunningTime="2026-04-21 15:15:07.180681278 +0000 UTC m=+271.587089540" Apr 21 15:15:10.520400 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:10.520362 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:15:15.544143 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.544102 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vvx6b"] Apr 21 15:15:15.548174 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.548154 2544 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vvx6b" Apr 21 15:15:15.551089 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.551066 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 15:15:15.571776 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.558781 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vvx6b"] Apr 21 15:15:15.640051 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.640010 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/800736cd-155f-42b3-abfc-1ca602be5ccf-original-pull-secret\") pod \"global-pull-secret-syncer-vvx6b\" (UID: \"800736cd-155f-42b3-abfc-1ca602be5ccf\") " pod="kube-system/global-pull-secret-syncer-vvx6b" Apr 21 15:15:15.640232 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.640071 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/800736cd-155f-42b3-abfc-1ca602be5ccf-kubelet-config\") pod \"global-pull-secret-syncer-vvx6b\" (UID: \"800736cd-155f-42b3-abfc-1ca602be5ccf\") " pod="kube-system/global-pull-secret-syncer-vvx6b" Apr 21 15:15:15.640232 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.640128 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/800736cd-155f-42b3-abfc-1ca602be5ccf-dbus\") pod \"global-pull-secret-syncer-vvx6b\" (UID: \"800736cd-155f-42b3-abfc-1ca602be5ccf\") " pod="kube-system/global-pull-secret-syncer-vvx6b" Apr 21 15:15:15.741048 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.741010 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/800736cd-155f-42b3-abfc-1ca602be5ccf-original-pull-secret\") pod \"global-pull-secret-syncer-vvx6b\" (UID: \"800736cd-155f-42b3-abfc-1ca602be5ccf\") " pod="kube-system/global-pull-secret-syncer-vvx6b" Apr 21 15:15:15.741156 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.741066 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/800736cd-155f-42b3-abfc-1ca602be5ccf-kubelet-config\") pod \"global-pull-secret-syncer-vvx6b\" (UID: \"800736cd-155f-42b3-abfc-1ca602be5ccf\") " pod="kube-system/global-pull-secret-syncer-vvx6b" Apr 21 15:15:15.741156 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.741138 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/800736cd-155f-42b3-abfc-1ca602be5ccf-kubelet-config\") pod \"global-pull-secret-syncer-vvx6b\" (UID: \"800736cd-155f-42b3-abfc-1ca602be5ccf\") " pod="kube-system/global-pull-secret-syncer-vvx6b" Apr 21 15:15:15.741224 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.741181 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/800736cd-155f-42b3-abfc-1ca602be5ccf-dbus\") pod \"global-pull-secret-syncer-vvx6b\" (UID: \"800736cd-155f-42b3-abfc-1ca602be5ccf\") " pod="kube-system/global-pull-secret-syncer-vvx6b" Apr 21 15:15:15.741363 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.741346 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/800736cd-155f-42b3-abfc-1ca602be5ccf-dbus\") pod \"global-pull-secret-syncer-vvx6b\" (UID: \"800736cd-155f-42b3-abfc-1ca602be5ccf\") " pod="kube-system/global-pull-secret-syncer-vvx6b" Apr 21 15:15:15.743427 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.743409 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/800736cd-155f-42b3-abfc-1ca602be5ccf-original-pull-secret\") pod \"global-pull-secret-syncer-vvx6b\" (UID: \"800736cd-155f-42b3-abfc-1ca602be5ccf\") " pod="kube-system/global-pull-secret-syncer-vvx6b" Apr 21 15:15:15.857636 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.857599 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vvx6b" Apr 21 15:15:15.978469 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:15.978432 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vvx6b"] Apr 21 15:15:15.982762 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:15:15.982715 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800736cd_155f_42b3_abfc_1ca602be5ccf.slice/crio-ccb876490cf36853feca37e71c1cbace7386714df58ae9b4464cec43859d4c9a WatchSource:0}: Error finding container ccb876490cf36853feca37e71c1cbace7386714df58ae9b4464cec43859d4c9a: Status 404 returned error can't find the container with id ccb876490cf36853feca37e71c1cbace7386714df58ae9b4464cec43859d4c9a Apr 21 15:15:16.163156 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:16.163074 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vvx6b" event={"ID":"800736cd-155f-42b3-abfc-1ca602be5ccf","Type":"ContainerStarted","Data":"ccb876490cf36853feca37e71c1cbace7386714df58ae9b4464cec43859d4c9a"} Apr 21 15:15:21.186280 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:21.186232 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vvx6b" event={"ID":"800736cd-155f-42b3-abfc-1ca602be5ccf","Type":"ContainerStarted","Data":"f037ca46ab5ce2031c7a8e0a79025c186574a64b60be59cf0133f1285a95e125"} Apr 21 15:15:21.206347 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:21.206290 2544 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vvx6b" podStartSLOduration=2.0265846 podStartE2EDuration="6.206272007s" podCreationTimestamp="2026-04-21 15:15:15 +0000 UTC" firstStartedPulling="2026-04-21 15:15:15.984686499 +0000 UTC m=+280.391094736" lastFinishedPulling="2026-04-21 15:15:20.164373906 +0000 UTC m=+284.570782143" observedRunningTime="2026-04-21 15:15:21.205623308 +0000 UTC m=+285.612031567" watchObservedRunningTime="2026-04-21 15:15:21.206272007 +0000 UTC m=+285.612680266" Apr 21 15:15:36.087906 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:36.087874 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/0.log" Apr 21 15:15:36.088297 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:36.088185 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/0.log" Apr 21 15:15:36.093307 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:15:36.093286 2544 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 15:16:05.519613 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:16:05.519580 2544 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:16:05.535019 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:16:05.534993 2544 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:16:06.338332 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:16:06.338298 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:17:20.361557 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.361520 2544 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg"] Apr 21 15:17:20.365003 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.364980 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:17:20.370491 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.370464 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 21 15:17:20.370645 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.370593 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 15:17:20.370705 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.370656 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 21 15:17:20.372170 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.372136 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-bbxlt\"" Apr 21 15:17:20.372170 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.372159 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 15:17:20.385826 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.385791 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg"] Apr 21 15:17:20.462460 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.462425 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/6228cad9-195b-4e32-8aa6-b6a4779ea6ee-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-6f49596498-xxpmg\" (UID: \"6228cad9-195b-4e32-8aa6-b6a4779ea6ee\") " 
pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:17:20.462460 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.462466 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6228cad9-195b-4e32-8aa6-b6a4779ea6ee-cert\") pod \"kubeflow-trainer-controller-manager-6f49596498-xxpmg\" (UID: \"6228cad9-195b-4e32-8aa6-b6a4779ea6ee\") " pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:17:20.462692 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.462488 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dfvh\" (UniqueName: \"kubernetes.io/projected/6228cad9-195b-4e32-8aa6-b6a4779ea6ee-kube-api-access-8dfvh\") pod \"kubeflow-trainer-controller-manager-6f49596498-xxpmg\" (UID: \"6228cad9-195b-4e32-8aa6-b6a4779ea6ee\") " pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:17:20.563114 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.563078 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/6228cad9-195b-4e32-8aa6-b6a4779ea6ee-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-6f49596498-xxpmg\" (UID: \"6228cad9-195b-4e32-8aa6-b6a4779ea6ee\") " pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:17:20.563114 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.563120 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6228cad9-195b-4e32-8aa6-b6a4779ea6ee-cert\") pod \"kubeflow-trainer-controller-manager-6f49596498-xxpmg\" (UID: \"6228cad9-195b-4e32-8aa6-b6a4779ea6ee\") " pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:17:20.563319 ip-10-0-137-168 kubenswrapper[2544]: I0421 
15:17:20.563148 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dfvh\" (UniqueName: \"kubernetes.io/projected/6228cad9-195b-4e32-8aa6-b6a4779ea6ee-kube-api-access-8dfvh\") pod \"kubeflow-trainer-controller-manager-6f49596498-xxpmg\" (UID: \"6228cad9-195b-4e32-8aa6-b6a4779ea6ee\") " pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:17:20.563819 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.563798 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/6228cad9-195b-4e32-8aa6-b6a4779ea6ee-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-6f49596498-xxpmg\" (UID: \"6228cad9-195b-4e32-8aa6-b6a4779ea6ee\") " pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:17:20.565466 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.565440 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6228cad9-195b-4e32-8aa6-b6a4779ea6ee-cert\") pod \"kubeflow-trainer-controller-manager-6f49596498-xxpmg\" (UID: \"6228cad9-195b-4e32-8aa6-b6a4779ea6ee\") " pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:17:20.578108 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.578075 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dfvh\" (UniqueName: \"kubernetes.io/projected/6228cad9-195b-4e32-8aa6-b6a4779ea6ee-kube-api-access-8dfvh\") pod \"kubeflow-trainer-controller-manager-6f49596498-xxpmg\" (UID: \"6228cad9-195b-4e32-8aa6-b6a4779ea6ee\") " pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:17:20.674135 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.674042 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:17:20.800257 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.800229 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg"] Apr 21 15:17:20.802498 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:17:20.802465 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6228cad9_195b_4e32_8aa6_b6a4779ea6ee.slice/crio-ac14eb122e8c819e9c039baabdb4a19d3a9da7455439aaa5fc09628c0bc4413f WatchSource:0}: Error finding container ac14eb122e8c819e9c039baabdb4a19d3a9da7455439aaa5fc09628c0bc4413f: Status 404 returned error can't find the container with id ac14eb122e8c819e9c039baabdb4a19d3a9da7455439aaa5fc09628c0bc4413f Apr 21 15:17:20.804341 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:20.804324 2544 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:17:21.537560 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:21.537515 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" event={"ID":"6228cad9-195b-4e32-8aa6-b6a4779ea6ee","Type":"ContainerStarted","Data":"ac14eb122e8c819e9c039baabdb4a19d3a9da7455439aaa5fc09628c0bc4413f"} Apr 21 15:17:23.545551 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:23.545454 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" event={"ID":"6228cad9-195b-4e32-8aa6-b6a4779ea6ee","Type":"ContainerStarted","Data":"86ef1d5439ff38ca71539c16fdf94e8d67946523709664f515ddb272cbd6edbf"} Apr 21 15:17:23.545956 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:23.545592 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" 
Apr 21 15:17:23.566836 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:23.566775 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" podStartSLOduration=1.157665483 podStartE2EDuration="3.566740791s" podCreationTimestamp="2026-04-21 15:17:20 +0000 UTC" firstStartedPulling="2026-04-21 15:17:20.804450349 +0000 UTC m=+405.210858587" lastFinishedPulling="2026-04-21 15:17:23.213525657 +0000 UTC m=+407.619933895" observedRunningTime="2026-04-21 15:17:23.564119625 +0000 UTC m=+407.970527889" watchObservedRunningTime="2026-04-21 15:17:23.566740791 +0000 UTC m=+407.973149054" Apr 21 15:17:39.553150 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:17:39.553109 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-6f49596498-xxpmg" Apr 21 15:20:36.110592 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:20:36.110554 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/0.log" Apr 21 15:20:36.111898 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:20:36.111872 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/0.log" Apr 21 15:22:28.050700 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.050602 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww"] Apr 21 15:22:28.054039 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.054016 2544 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" Apr 21 15:22:28.056675 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.056651 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"openshift-service-ca.crt\"" Apr 21 15:22:28.056811 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.056671 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"kube-root-ca.crt\"" Apr 21 15:22:28.056923 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.056909 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-92tmr\"/\"default-dockercfg-hkrtq\"" Apr 21 15:22:28.063533 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.063499 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww"] Apr 21 15:22:28.104025 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.103980 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svnkc\" (UniqueName: \"kubernetes.io/projected/931ec68b-f510-47d2-937a-72212ea680cb-kube-api-access-svnkc\") pod \"progression-custom-config-node-0-0-6wvww\" (UID: \"931ec68b-f510-47d2-937a-72212ea680cb\") " pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" Apr 21 15:22:28.204456 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.204426 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svnkc\" (UniqueName: \"kubernetes.io/projected/931ec68b-f510-47d2-937a-72212ea680cb-kube-api-access-svnkc\") pod \"progression-custom-config-node-0-0-6wvww\" (UID: \"931ec68b-f510-47d2-937a-72212ea680cb\") " pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" Apr 21 15:22:28.213020 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.212986 
2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svnkc\" (UniqueName: \"kubernetes.io/projected/931ec68b-f510-47d2-937a-72212ea680cb-kube-api-access-svnkc\") pod \"progression-custom-config-node-0-0-6wvww\" (UID: \"931ec68b-f510-47d2-937a-72212ea680cb\") " pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" Apr 21 15:22:28.364673 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.364631 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" Apr 21 15:22:28.492565 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.492539 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww"] Apr 21 15:22:28.495441 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:22:28.495408 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod931ec68b_f510_47d2_937a_72212ea680cb.slice/crio-f169bf99363695cf413ae8cad1cfa714cecc8119d8b91f107ec377f291310bc0 WatchSource:0}: Error finding container f169bf99363695cf413ae8cad1cfa714cecc8119d8b91f107ec377f291310bc0: Status 404 returned error can't find the container with id f169bf99363695cf413ae8cad1cfa714cecc8119d8b91f107ec377f291310bc0 Apr 21 15:22:28.497522 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:28.497505 2544 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:22:29.447267 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:22:29.447227 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" event={"ID":"931ec68b-f510-47d2-937a-72212ea680cb","Type":"ContainerStarted","Data":"f169bf99363695cf413ae8cad1cfa714cecc8119d8b91f107ec377f291310bc0"} Apr 21 15:24:22.816891 ip-10-0-137-168 kubenswrapper[2544]: I0421 
15:24:22.816855 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" event={"ID":"931ec68b-f510-47d2-937a-72212ea680cb","Type":"ContainerStarted","Data":"6041d27fa8b1d82d5c0fb1c999de7cd248f5fb86caa05791d4b16aeba6568b09"}
Apr 21 15:24:22.817316 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:22.816908 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww"
Apr 21 15:24:22.841802 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:22.841728 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" podStartSLOduration=1.5220807710000002 podStartE2EDuration="1m54.841712701s" podCreationTimestamp="2026-04-21 15:22:28 +0000 UTC" firstStartedPulling="2026-04-21 15:22:28.497687517 +0000 UTC m=+712.904095755" lastFinishedPulling="2026-04-21 15:24:21.817319436 +0000 UTC m=+826.223727685" observedRunningTime="2026-04-21 15:24:22.840664303 +0000 UTC m=+827.247072562" watchObservedRunningTime="2026-04-21 15:24:22.841712701 +0000 UTC m=+827.248120961"
Apr 21 15:24:23.819910 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:23.819877 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww"
Apr 21 15:24:45.110734 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:45.110688 2544 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" podUID="931ec68b-f510-47d2-937a-72212ea680cb" containerName="node" probeResult="failure" output="Get \"http://10.134.0.23:28080/metrics\": read tcp 10.134.0.2:37864->10.134.0.23:28080: read: connection reset by peer"
Apr 21 15:24:45.818002 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:45.817956 2544 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" podUID="931ec68b-f510-47d2-937a-72212ea680cb" containerName="node" probeResult="failure" output="Get \"http://10.134.0.23:28080/metrics\": dial tcp 10.134.0.23:28080: connect: connection refused"
Apr 21 15:24:45.818189 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:45.818067 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww"
Apr 21 15:24:45.818651 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:45.818617 2544 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" podUID="931ec68b-f510-47d2-937a-72212ea680cb" containerName="node" probeResult="failure" output="Get \"http://10.134.0.23:28080/metrics\": dial tcp 10.134.0.23:28080: connect: connection refused"
Apr 21 15:24:45.885564 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:45.885525 2544 generic.go:358] "Generic (PLEG): container finished" podID="931ec68b-f510-47d2-937a-72212ea680cb" containerID="6041d27fa8b1d82d5c0fb1c999de7cd248f5fb86caa05791d4b16aeba6568b09" exitCode=0
Apr 21 15:24:45.885735 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:45.885582 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" event={"ID":"931ec68b-f510-47d2-937a-72212ea680cb","Type":"ContainerDied","Data":"6041d27fa8b1d82d5c0fb1c999de7cd248f5fb86caa05791d4b16aeba6568b09"}
Apr 21 15:24:47.023994 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:47.023967 2544 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww"
Apr 21 15:24:47.038260 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:47.038221 2544 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svnkc\" (UniqueName: \"kubernetes.io/projected/931ec68b-f510-47d2-937a-72212ea680cb-kube-api-access-svnkc\") pod \"931ec68b-f510-47d2-937a-72212ea680cb\" (UID: \"931ec68b-f510-47d2-937a-72212ea680cb\") "
Apr 21 15:24:47.040369 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:47.040338 2544 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931ec68b-f510-47d2-937a-72212ea680cb-kube-api-access-svnkc" (OuterVolumeSpecName: "kube-api-access-svnkc") pod "931ec68b-f510-47d2-937a-72212ea680cb" (UID: "931ec68b-f510-47d2-937a-72212ea680cb"). InnerVolumeSpecName "kube-api-access-svnkc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:24:47.138796 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:47.138729 2544 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-svnkc\" (UniqueName: \"kubernetes.io/projected/931ec68b-f510-47d2-937a-72212ea680cb-kube-api-access-svnkc\") on node \"ip-10-0-137-168.ec2.internal\" DevicePath \"\""
Apr 21 15:24:47.893188 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:47.893160 2544 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww"
Apr 21 15:24:47.893369 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:47.893154 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww" event={"ID":"931ec68b-f510-47d2-937a-72212ea680cb","Type":"ContainerDied","Data":"f169bf99363695cf413ae8cad1cfa714cecc8119d8b91f107ec377f291310bc0"}
Apr 21 15:24:47.893369 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:47.893266 2544 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f169bf99363695cf413ae8cad1cfa714cecc8119d8b91f107ec377f291310bc0"
Apr 21 15:24:51.595456 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:51.595421 2544 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww"]
Apr 21 15:24:51.599245 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:51.599220 2544 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-92tmr/progression-custom-config-node-0-0-6wvww"]
Apr 21 15:24:52.221443 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:24:52.221403 2544 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931ec68b-f510-47d2-937a-72212ea680cb" path="/var/lib/kubelet/pods/931ec68b-f510-47d2-937a-72212ea680cb/volumes"
Apr 21 15:25:01.514939 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:01.514898 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-6f49596498-xxpmg_6228cad9-195b-4e32-8aa6-b6a4779ea6ee/manager/0.log"
Apr 21 15:25:02.084064 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:02.084027 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-6f49596498-xxpmg_6228cad9-195b-4e32-8aa6-b6a4779ea6ee/manager/0.log"
Apr 21 15:25:02.563974 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:02.563938 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-6f49596498-xxpmg_6228cad9-195b-4e32-8aa6-b6a4779ea6ee/manager/0.log"
Apr 21 15:25:36.132826 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:36.132722 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/0.log"
Apr 21 15:25:36.135954 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:36.135927 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/0.log"
Apr 21 15:25:43.795244 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:43.795210 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vvx6b_800736cd-155f-42b3-abfc-1ca602be5ccf/global-pull-secret-syncer/0.log"
Apr 21 15:25:43.866967 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:43.866928 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-5s2h5_67d06525-c590-4bd2-8237-2f7fa8b1d779/konnectivity-agent/0.log"
Apr 21 15:25:43.945347 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:43.945312 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-168.ec2.internal_d8e31eb06d4e26e4e1d995160527f9b1/haproxy/0.log"
Apr 21 15:25:47.331466 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:47.331429 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-cjh5t_576f8157-2364-4155-8a62-633a5c205ce8/monitoring-plugin/0.log"
Apr 21 15:25:47.561466 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:47.561422 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xcldm_2b8e98d1-29a2-44eb-9981-0c9d473a51aa/node-exporter/0.log"
Apr 21 15:25:47.582403 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:47.582328 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xcldm_2b8e98d1-29a2-44eb-9981-0c9d473a51aa/kube-rbac-proxy/0.log"
Apr 21 15:25:47.607431 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:47.607405 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xcldm_2b8e98d1-29a2-44eb-9981-0c9d473a51aa/init-textfile/0.log"
Apr 21 15:25:47.734700 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:47.734661 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dba9102-9c62-4c6e-92db-d441db822372/prometheus/0.log"
Apr 21 15:25:47.765409 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:47.765381 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dba9102-9c62-4c6e-92db-d441db822372/config-reloader/0.log"
Apr 21 15:25:47.792418 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:47.792391 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dba9102-9c62-4c6e-92db-d441db822372/thanos-sidecar/0.log"
Apr 21 15:25:47.817150 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:47.817118 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dba9102-9c62-4c6e-92db-d441db822372/kube-rbac-proxy-web/0.log"
Apr 21 15:25:47.844192 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:47.844106 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dba9102-9c62-4c6e-92db-d441db822372/kube-rbac-proxy/0.log"
Apr 21 15:25:47.872137 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:47.872106 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dba9102-9c62-4c6e-92db-d441db822372/kube-rbac-proxy-thanos/0.log"
Apr 21 15:25:47.902211 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:47.902179 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dba9102-9c62-4c6e-92db-d441db822372/init-config-reloader/0.log"
Apr 21 15:25:48.074923 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:48.074897 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-fcdccbb55-hzcjs_347281ad-1022-42b2-81fb-6af5bbb2235a/telemeter-client/0.log"
Apr 21 15:25:48.103656 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:48.103572 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-fcdccbb55-hzcjs_347281ad-1022-42b2-81fb-6af5bbb2235a/reload/0.log"
Apr 21 15:25:48.136600 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:48.136570 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-fcdccbb55-hzcjs_347281ad-1022-42b2-81fb-6af5bbb2235a/kube-rbac-proxy/0.log"
Apr 21 15:25:48.203460 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:48.203432 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-745cb96956-h6vct_99fb4416-714e-46d0-81ca-059cc9d4c30a/thanos-query/0.log"
Apr 21 15:25:48.228793 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:48.228764 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-745cb96956-h6vct_99fb4416-714e-46d0-81ca-059cc9d4c30a/kube-rbac-proxy-web/0.log"
Apr 21 15:25:48.261496 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:48.261458 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-745cb96956-h6vct_99fb4416-714e-46d0-81ca-059cc9d4c30a/kube-rbac-proxy/0.log"
Apr 21 15:25:48.291642 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:48.291610 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-745cb96956-h6vct_99fb4416-714e-46d0-81ca-059cc9d4c30a/prom-label-proxy/0.log"
Apr 21 15:25:48.325564 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:48.325513 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-745cb96956-h6vct_99fb4416-714e-46d0-81ca-059cc9d4c30a/kube-rbac-proxy-rules/0.log"
Apr 21 15:25:48.358726 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:48.358645 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-745cb96956-h6vct_99fb4416-714e-46d0-81ca-059cc9d4c30a/kube-rbac-proxy-metrics/0.log"
Apr 21 15:25:50.115657 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.115622 2544 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"]
Apr 21 15:25:50.116079 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.115952 2544 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="931ec68b-f510-47d2-937a-72212ea680cb" containerName="node"
Apr 21 15:25:50.116079 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.115963 2544 state_mem.go:107] "Deleted CPUSet assignment" podUID="931ec68b-f510-47d2-937a-72212ea680cb" containerName="node"
Apr 21 15:25:50.116079 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.116015 2544 memory_manager.go:356] "RemoveStaleState removing state" podUID="931ec68b-f510-47d2-937a-72212ea680cb" containerName="node"
Apr 21 15:25:50.119085 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.119067 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.123433 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.123405 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p2xz6\"/\"kube-root-ca.crt\""
Apr 21 15:25:50.123579 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.123501 2544 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p2xz6\"/\"openshift-service-ca.crt\""
Apr 21 15:25:50.123579 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.123562 2544 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-p2xz6\"/\"default-dockercfg-flhhr\""
Apr 21 15:25:50.127448 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.127425 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"]
Apr 21 15:25:50.164000 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.163963 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-lib-modules\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.164172 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.164012 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-proc\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.164172 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.164078 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-podres\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.164172 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.164105 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzhvk\" (UniqueName: \"kubernetes.io/projected/10075dbf-3bcc-4c30-9f25-f86577a6fce3-kube-api-access-pzhvk\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.164172 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.164134 2544 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-sys\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.265368 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.265320 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzhvk\" (UniqueName: \"kubernetes.io/projected/10075dbf-3bcc-4c30-9f25-f86577a6fce3-kube-api-access-pzhvk\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.265368 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.265374 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-sys\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.265627 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.265435 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-lib-modules\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.265627 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.265472 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-proc\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.265627 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.265513 2544 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-podres\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.265627 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.265518 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-sys\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.265627 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.265599 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-proc\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.265627 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.265599 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-lib-modules\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.265892 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.265636 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/10075dbf-3bcc-4c30-9f25-f86577a6fce3-podres\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.274501 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.274462 2544 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzhvk\" (UniqueName: \"kubernetes.io/projected/10075dbf-3bcc-4c30-9f25-f86577a6fce3-kube-api-access-pzhvk\") pod \"perf-node-gather-daemonset-qrfwl\" (UID: \"10075dbf-3bcc-4c30-9f25-f86577a6fce3\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.431015 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.430920 2544 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:50.584345 ip-10-0-137-168 kubenswrapper[2544]: W0421 15:25:50.584310 2544 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod10075dbf_3bcc_4c30_9f25_f86577a6fce3.slice/crio-c887c85d516a944bf9375e0d849e695bb5607cbfb3948e71441bef499b8a2a25 WatchSource:0}: Error finding container c887c85d516a944bf9375e0d849e695bb5607cbfb3948e71441bef499b8a2a25: Status 404 returned error can't find the container with id c887c85d516a944bf9375e0d849e695bb5607cbfb3948e71441bef499b8a2a25
Apr 21 15:25:50.588356 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:50.588328 2544 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"]
Apr 21 15:25:51.084250 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:51.084211 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl" event={"ID":"10075dbf-3bcc-4c30-9f25-f86577a6fce3","Type":"ContainerStarted","Data":"df5312bc501b63bb593eee80e33378bc363b3b793defbc998c5ab809115b8b64"}
Apr 21 15:25:51.084404 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:51.084257 2544 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl" event={"ID":"10075dbf-3bcc-4c30-9f25-f86577a6fce3","Type":"ContainerStarted","Data":"c887c85d516a944bf9375e0d849e695bb5607cbfb3948e71441bef499b8a2a25"}
Apr 21 15:25:51.084404 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:51.084296 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:51.102901 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:51.102836 2544 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl" podStartSLOduration=1.102819348 podStartE2EDuration="1.102819348s" podCreationTimestamp="2026-04-21 15:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:25:51.101198084 +0000 UTC m=+915.507606345" watchObservedRunningTime="2026-04-21 15:25:51.102819348 +0000 UTC m=+915.509227607"
Apr 21 15:25:51.332353 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:51.332323 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dwdcj_18eaf2c1-6533-4b15-a759-c0e039abbd8f/dns/0.log"
Apr 21 15:25:51.354287 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:51.354209 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dwdcj_18eaf2c1-6533-4b15-a759-c0e039abbd8f/kube-rbac-proxy/0.log"
Apr 21 15:25:51.520542 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:51.520499 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tbnxq_c5594cf9-4b09-4077-9a0a-c6e1e4145792/dns-node-resolver/0.log"
Apr 21 15:25:51.935292 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:51.935260 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9lsjs_c23d088b-b54c-4874-aac7-248c3a09117a/node-ca/0.log"
Apr 21 15:25:52.614764 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:52.614718 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79765d5944-b66kf_75433e2f-ffe1-4177-857f-37e16ba4a802/router/0.log"
Apr 21 15:25:52.983594 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:52.983523 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-q5k4h_a7f8e763-6577-4e5f-bc6c-5d7ac9e2c059/serve-healthcheck-canary/0.log"
Apr 21 15:25:53.361787 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:53.361742 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-n8zqd_f2e49b4d-d05c-4693-9ac9-190627c56f55/insights-operator/1.log"
Apr 21 15:25:53.372061 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:53.372025 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-n8zqd_f2e49b4d-d05c-4693-9ac9-190627c56f55/insights-operator/0.log"
Apr 21 15:25:53.556548 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:53.556519 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-98544_5e616a4d-2a3a-42c3-be82-9a795c5fa152/kube-rbac-proxy/0.log"
Apr 21 15:25:53.576890 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:53.576861 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-98544_5e616a4d-2a3a-42c3-be82-9a795c5fa152/exporter/0.log"
Apr 21 15:25:53.596795 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:53.596747 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-98544_5e616a4d-2a3a-42c3-be82-9a795c5fa152/extractor/0.log"
Apr 21 15:25:57.097650 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:57.097623 2544 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-qrfwl"
Apr 21 15:25:58.502535 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:58.502503 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-96kdf_d450b0ff-67e3-4fe9-946f-9320d97a5efd/migrator/0.log"
Apr 21 15:25:58.523201 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:25:58.523170 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-96kdf_d450b0ff-67e3-4fe9-946f-9320d97a5efd/graceful-termination/0.log"
Apr 21 15:26:00.347429 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:00.347399 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8gk4_49309d1e-9f62-4e92-8114-acfac3171dc5/kube-multus-additional-cni-plugins/0.log"
Apr 21 15:26:00.379358 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:00.379327 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8gk4_49309d1e-9f62-4e92-8114-acfac3171dc5/egress-router-binary-copy/0.log"
Apr 21 15:26:00.404638 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:00.404556 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8gk4_49309d1e-9f62-4e92-8114-acfac3171dc5/cni-plugins/0.log"
Apr 21 15:26:00.426736 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:00.426708 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8gk4_49309d1e-9f62-4e92-8114-acfac3171dc5/bond-cni-plugin/0.log"
Apr 21 15:26:00.458047 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:00.458020 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8gk4_49309d1e-9f62-4e92-8114-acfac3171dc5/routeoverride-cni/0.log"
Apr 21 15:26:00.483738 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:00.483714 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8gk4_49309d1e-9f62-4e92-8114-acfac3171dc5/whereabouts-cni-bincopy/0.log"
Apr 21 15:26:00.507061 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:00.507023 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8gk4_49309d1e-9f62-4e92-8114-acfac3171dc5/whereabouts-cni/0.log"
Apr 21 15:26:00.613437 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:00.613407 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rwxvz_dbf1eb34-c452-44b9-b6b3-dd0fe6ea5c9b/kube-multus/0.log"
Apr 21 15:26:00.642825 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:00.642795 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-84hkv_3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428/network-metrics-daemon/0.log"
Apr 21 15:26:00.677958 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:00.677873 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-84hkv_3ed47f4b-eaf7-4ffc-8127-1dcfeabe3428/kube-rbac-proxy/0.log"
Apr 21 15:26:01.898046 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:01.898010 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-controller/0.log"
Apr 21 15:26:01.915599 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:01.915569 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/0.log"
Apr 21 15:26:01.924030 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:01.923997 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovn-acl-logging/1.log"
Apr 21 15:26:01.952261 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:01.952228 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/kube-rbac-proxy-node/0.log"
Apr 21 15:26:01.973581 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:01.973542 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 15:26:01.991479 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:01.991431 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/northd/0.log"
Apr 21 15:26:02.015106 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:02.015070 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/nbdb/0.log"
Apr 21 15:26:02.036347 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:02.036307 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/sbdb/0.log"
Apr 21 15:26:02.207892 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:02.207796 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hj57q_d048a9ec-b8d0-42a8-9384-5fe347a8873b/ovnkube-controller/0.log"
Apr 21 15:26:03.595964 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:03.595932 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-lnq2b_a5a141dd-cdcf-4c40-ba20-236924041f30/check-endpoints/0.log"
Apr 21 15:26:03.669366 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:03.669335 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-sm5q9_0bc3eca4-5765-4b43-a4f4-51b55c9f8d88/network-check-target-container/0.log"
Apr 21 15:26:04.564961 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:04.564932 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hmf4l_013ca451-549d-478c-941e-0e9994b24c34/iptables-alerter/0.log"
Apr 21 15:26:05.431703 ip-10-0-137-168 kubenswrapper[2544]: I0421 15:26:05.431671 2544 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nv5tm_8a30a0d0-d747-4359-a316-b1d4215f71f3/tuned/0.log"