Apr 21 16:01:48.119069 ip-10-0-142-158 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 16:01:48.636794 ip-10-0-142-158 kubenswrapper[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 16:01:48.636794 ip-10-0-142-158 kubenswrapper[2562]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 16:01:48.636794 ip-10-0-142-158 kubenswrapper[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 16:01:48.636794 ip-10-0-142-158 kubenswrapper[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 16:01:48.636794 ip-10-0-142-158 kubenswrapper[2562]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 16:01:48.637919 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.637730 2562 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 16:01:48.644052 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644030 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 16:01:48.644052 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644046 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 16:01:48.644052 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644051 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 16:01:48.644052 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644054 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 16:01:48.644052 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644057 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 16:01:48.644052 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644060 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644063 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644066 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644069 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644071 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644074 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644077 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644079 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644082 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644085 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644088 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644090 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644093 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644096 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644099 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644101 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644104 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644107 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644109 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644112 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 16:01:48.644266 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644115 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644117 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644120 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644122 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644125 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644134 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644137 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644140 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644143 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644145 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644148 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644150 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644153 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644155 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644159 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644162 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644165 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644167 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644170 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 16:01:48.644750 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644173 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644175 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644178 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644180 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644183 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644186 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644189 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644191 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644194 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644197 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644199 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644202 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644206 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644208 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644211 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644214 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644216 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644219 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644222 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644225 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 16:01:48.645251 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644229 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644234 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644237 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644240 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644243 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644246 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644249 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644251 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644254 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644257 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644260 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644263 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644266 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644268 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644273 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644276 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644279 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644281 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644284 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644287 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 16:01:48.645865 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644289 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644292 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644692 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644698 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644701 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644704 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644707 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644710 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644713 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644716 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644719 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644723 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644726 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644729 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644732 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644736 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644739 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644742 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644744 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644747 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 16:01:48.646366 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644750 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644753 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644755 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644758 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644761 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644764 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644767 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644769 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644772 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644774 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644777 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644795 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644798 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644801 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644804 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644807 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644809 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644812 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644814 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644817 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 16:01:48.646866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644820 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644822 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644825 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644829 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644831 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644834 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644837 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644841 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644844 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644847 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644850 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644852 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644855 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644858 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644860 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644863 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644866 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644868 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644871 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644874 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 16:01:48.647368 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644877 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644879 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644882 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644885 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644887 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644891 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644895 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644899 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644902 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644906 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644909 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644912 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644914 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644917 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644919 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644922 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644924 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644927 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 16:01:48.647891 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644930 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644933 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644935 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644938 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644941 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644943 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644946 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644948 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644951 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.644953 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645026 2562 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645034 2562 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645043 2562 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645049 2562 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645057 2562 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645073 2562 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645082 2562 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645086 2562 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645090 2562 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645093 2562 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645097 2562 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645100 2562 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 16:01:48.648340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645103 2562 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645106 2562 flags.go:64] FLAG: --cgroup-root=""
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645108 2562 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645111 2562 flags.go:64] FLAG: --client-ca-file=""
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645114 2562 flags.go:64] FLAG: --cloud-config=""
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645117 2562 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645120 2562 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645124 2562 flags.go:64] FLAG: --cluster-domain=""
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645127 2562 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645130 2562 flags.go:64] FLAG: --config-dir=""
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645133 2562 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645136 2562 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645140 2562 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645143 2562 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645147 2562 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645150 2562 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645153 2562 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645156 2562 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645159 2562 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645162 2562 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645165 2562 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645170 2562 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645173 2562 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645176 2562 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645179 2562 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 16:01:48.648922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645183 2562 flags.go:64] FLAG: --enable-server="true"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645186 2562 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645190 2562 flags.go:64] FLAG: --event-burst="100"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645193 2562 flags.go:64] FLAG: --event-qps="50"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645197 2562 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645200 2562 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645203 2562 flags.go:64] FLAG: --eviction-hard=""
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645206 2562 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645209 2562 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645212 2562 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645215 2562 flags.go:64] FLAG: --eviction-soft=""
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645218 2562 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645222 2562 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645224 2562 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645227 2562 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645230 2562 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645233 2562 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645236 2562 flags.go:64] FLAG: --feature-gates=""
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645240 2562 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645243 2562 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645247 2562 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645250 2562 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645253 2562 flags.go:64] FLAG: --healthz-port="10248"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645256 2562 flags.go:64] FLAG: --help="false"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645260 2562 flags.go:64] FLAG: --hostname-override="ip-10-0-142-158.ec2.internal"
Apr 21 16:01:48.649538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645263 2562 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645267 2562 flags.go:64]
FLAG: --http-check-frequency="20s" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645269 2562 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645273 2562 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645276 2562 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645279 2562 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645282 2562 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645285 2562 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645289 2562 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645292 2562 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645295 2562 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645298 2562 flags.go:64] FLAG: --kube-reserved="" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645301 2562 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645303 2562 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645306 2562 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 16:01:48.650184 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:01:48.645309 2562 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645312 2562 flags.go:64] FLAG: --lock-file="" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645316 2562 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645318 2562 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645322 2562 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645327 2562 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645330 2562 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645333 2562 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 16:01:48.650184 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645335 2562 flags.go:64] FLAG: --logging-format="text" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645338 2562 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645342 2562 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645344 2562 flags.go:64] FLAG: --manifest-url="" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645347 2562 flags.go:64] FLAG: --manifest-url-header="" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645352 2562 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645355 2562 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645359 2562 flags.go:64] FLAG: --max-pods="110" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645362 2562 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645365 2562 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645368 2562 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645371 2562 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645374 2562 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645377 2562 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645382 2562 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645389 2562 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645393 2562 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645396 2562 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645399 2562 flags.go:64] FLAG: --pod-cidr="" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645402 2562 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 16:01:48.650750 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:01:48.645408 2562 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645411 2562 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645414 2562 flags.go:64] FLAG: --pods-per-core="0" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645417 2562 flags.go:64] FLAG: --port="10250" Apr 21 16:01:48.650750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645420 2562 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645423 2562 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0165706fc70b45b2b" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645429 2562 flags.go:64] FLAG: --qos-reserved="" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645432 2562 flags.go:64] FLAG: --read-only-port="10255" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645435 2562 flags.go:64] FLAG: --register-node="true" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645438 2562 flags.go:64] FLAG: --register-schedulable="true" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645441 2562 flags.go:64] FLAG: --register-with-taints="" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645445 2562 flags.go:64] FLAG: --registry-burst="10" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645448 2562 flags.go:64] FLAG: --registry-qps="5" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645451 2562 flags.go:64] FLAG: --reserved-cpus="" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645454 2562 flags.go:64] FLAG: --reserved-memory="" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 
16:01:48.645457 2562 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645461 2562 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645464 2562 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645466 2562 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645470 2562 flags.go:64] FLAG: --runonce="false" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645473 2562 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645475 2562 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645478 2562 flags.go:64] FLAG: --seccomp-default="false" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645482 2562 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645484 2562 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645488 2562 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645492 2562 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645495 2562 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645498 2562 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645501 2562 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 21 16:01:48.651386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645504 2562 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645508 2562 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645512 2562 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645515 2562 flags.go:64] FLAG: --system-cgroups="" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645518 2562 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645523 2562 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645526 2562 flags.go:64] FLAG: --tls-cert-file="" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645529 2562 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645534 2562 flags.go:64] FLAG: --tls-min-version="" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645537 2562 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645540 2562 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645543 2562 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645546 2562 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645549 2562 flags.go:64] FLAG: --v="2" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 
16:01:48.645553 2562 flags.go:64] FLAG: --version="false" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645557 2562 flags.go:64] FLAG: --vmodule="" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645562 2562 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.645565 2562 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645660 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645663 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645668 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645671 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645674 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 16:01:48.652041 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645677 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645679 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645682 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645685 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 16:01:48.652968 ip-10-0-142-158 
kubenswrapper[2562]: W0421 16:01:48.645688 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645692 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645694 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645697 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645700 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645702 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645705 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645708 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645711 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645713 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645716 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645719 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645721 2562 feature_gate.go:328] unrecognized feature gate: 
NewOLMOwnSingleNamespace Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645725 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645728 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645730 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 16:01:48.652968 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645733 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645736 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645738 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645741 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645743 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645746 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645748 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645751 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645754 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 
16:01:48.645756 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645759 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645761 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645764 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645766 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645769 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645771 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645774 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645778 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645797 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645799 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 16:01:48.653861 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645802 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645805 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 
16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645808 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645810 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645813 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645816 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645819 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645821 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645824 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645827 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645830 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645833 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645835 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645838 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645840 2562 
feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645843 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645845 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645848 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645851 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645853 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 16:01:48.654727 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645856 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645858 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645861 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645864 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645866 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645869 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645872 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 
16:01:48.645874 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645877 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645881 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645883 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645886 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645890 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645894 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645897 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645899 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645902 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645905 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645908 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 16:01:48.655524 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645912 2562 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.645916 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.646840 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.654155 2562 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.654176 2562 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654247 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654255 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654260 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654265 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654270 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: 
W0421 16:01:48.654275 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654280 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654284 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654288 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654293 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 16:01:48.656150 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654297 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654304 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654310 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654315 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654321 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654326 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654331 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654336 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654340 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654344 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654348 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654352 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654357 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654361 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654365 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654371 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654376 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654381 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654385 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654389 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 16:01:48.656672 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654394 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654399 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654403 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654408 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654412 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654417 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654421 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654425 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654430 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654436 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654442 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654447 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654451 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654455 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654459 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654463 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654468 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654472 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654475 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 16:01:48.657538 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654480 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654484 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654488 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654492 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654496 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654500 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654505 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654509 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654513 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654518 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654522 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654526 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654530 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654534 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654539 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654543 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654547 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654551 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654555 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654560 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 16:01:48.658224 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654564 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654568 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654572 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654577 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654581 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654585 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654589 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654593 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654597 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654601 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654605 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654609 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654613 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654617 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654621 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654625 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 16:01:48.658895 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654630 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.654638 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654808 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654816 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654821 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654825 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654831 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654836 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654841 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654845 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654850 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654855 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654859 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654864 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654868 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654873 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 16:01:48.659306 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654878 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654884 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654888 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654893 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654897 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654902 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654905 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654910 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654914 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654918 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654923 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654927 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654931 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654936 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654940 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654943 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654947 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654952 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654957 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654961 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 16:01:48.659720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654966 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654970 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654974 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654979 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654983 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654987 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654991 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.654996 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655000 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655004 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655008 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655013 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655017 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655021 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655025 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655029 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655033 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655037 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655042 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655046 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 16:01:48.660330 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655050 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655054 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655058 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655063 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655067 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655071 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655077 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655083 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655088 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655092 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655097 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655102 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655107 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655111 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655116 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655120 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655125 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655130 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 16:01:48.660859 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655135 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655139 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655143 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655149 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655154 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655159 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655163 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655167 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655171 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655176 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655180 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655184 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:48.655188 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 16:01:48.661350 ip-10-0-142-158
kubenswrapper[2562]: W0421 16:01:48.655193 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.655201 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 16:01:48.661350 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.655942 2562 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 16:01:48.661725 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.658495 2562 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 16:01:48.661725 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.659777 2562 server.go:1019] "Starting client certificate rotation"
Apr 21 16:01:48.661725 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.659902 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 16:01:48.661725 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.659947 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 16:01:48.692130 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.692111 2562 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 16:01:48.694881 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.694860 2562 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 16:01:48.710971 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.710947 2562 log.go:25] "Validated CRI v1 runtime API"
Apr 21 16:01:48.717298 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.717282 2562 log.go:25] "Validated CRI v1 image API"
Apr 21 16:01:48.720432 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.720416 2562 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 16:01:48.725618 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.725598 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 16:01:48.725697 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.725676 2562 fs.go:135] Filesystem UUIDs: map[36139c76-2601-46a0-81d3-907cee5b675a:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f88d9f12-4daa-48f0-8a84-9ad0b8cd5ee3:/dev/nvme0n1p3]
Apr 21 16:01:48.725735 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.725696 2562 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 16:01:48.732491 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.732376 2562 manager.go:217] Machine: {Timestamp:2026-04-21 16:01:48.729428487 +0000 UTC m=+0.474366283 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096278 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e210b4adc3ac5ab8fe27a007a0349 SystemUUID:ec2e210b-4adc-3ac5-ab8f-e27a007a0349 BootID:edc0d29a-d43a-4061-8e72-dbaab3fd973a Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3a:30:e4:1c:e7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3a:30:e4:1c:e7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:66:62:53:d5:d5:9f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 16:01:48.732491 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.732479 2562 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 16:01:48.732614 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.732587 2562 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 16:01:48.733807 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.733757 2562 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 16:01:48.733974 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.733836 2562 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-158.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 16:01:48.734021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.733983 2562 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 16:01:48.734021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.733992 2562 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 16:01:48.734021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.734004 2562 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 16:01:48.734914 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.734904 2562 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 16:01:48.735739 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.735731 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 16:01:48.735988 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.735977 2562 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 16:01:48.739128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.739119 2562 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 16:01:48.739164 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.739139 2562 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 16:01:48.739164 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.739151 2562 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 16:01:48.739164 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.739160 2562 kubelet.go:397] "Adding apiserver pod source"
Apr 21 16:01:48.739252 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.739170 2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 16:01:48.740472 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.740459 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 16:01:48.740528 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.740477 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 16:01:48.743915 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.743899 2562 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 16:01:48.745174 ip-10-0-142-158
kubenswrapper[2562]: I0421 16:01:48.745158 2562 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 16:01:48.748698 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748678 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 16:01:48.748762 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748708 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 16:01:48.748762 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748721 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 16:01:48.748762 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748733 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 16:01:48.748762 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748746 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 16:01:48.748762 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748759 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 16:01:48.748915 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748771 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 16:01:48.748915 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748797 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 16:01:48.748915 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748811 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 16:01:48.748915 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748824 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 16:01:48.748915 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748843 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 
16:01:48.748915 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.748862 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 16:01:48.749873 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.749863 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 16:01:48.749873 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.749873 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 16:01:48.752024 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.751976 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 16:01:48.752082 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.752053 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 16:01:48.753386 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.753373 2562 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 16:01:48.753437 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.753409 2562 server.go:1295] "Started kubelet" Apr 21 16:01:48.754300 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.754118 2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 16:01:48.754372 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.754327 2562 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 16:01:48.754405 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.754138 2562 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 21 16:01:48.754433 ip-10-0-142-158 systemd[1]: Started Kubernetes Kubelet. Apr 21 16:01:48.755923 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.755907 2562 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 16:01:48.758545 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.758514 2562 server.go:317] "Adding debug handlers to kubelet server" Apr 21 16:01:48.762330 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.762311 2562 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-158.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 16:01:48.763455 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.763301 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 16:01:48.763669 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.762278 2562 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-158.ec2.internal.18a86aa4bbaaf7a8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-158.ec2.internal,UID:ip-10-0-142-158.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-158.ec2.internal,},FirstTimestamp:2026-04-21 16:01:48.753385384 +0000 UTC m=+0.498323181,LastTimestamp:2026-04-21 16:01:48.753385384 +0000 UTC m=+0.498323181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-158.ec2.internal,}" Apr 21 16:01:48.763911 
ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.763893 2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 16:01:48.764954 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.764929 2562 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 16:01:48.765048 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.764935 2562 factory.go:55] Registering systemd factory Apr 21 16:01:48.765048 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.764985 2562 factory.go:223] Registration of the systemd container factory successfully Apr 21 16:01:48.765048 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.764932 2562 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 16:01:48.765048 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.765047 2562 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 16:01:48.765243 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.765170 2562 reconstruct.go:97] "Volume reconstruction finished" Apr 21 16:01:48.765243 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.765196 2562 reconciler.go:26] "Reconciler: start to sync state" Apr 21 16:01:48.765340 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.765275 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found" Apr 21 16:01:48.765413 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.765392 2562 factory.go:153] Registering CRI-O factory Apr 21 16:01:48.765452 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.765415 2562 factory.go:223] Registration of the crio container factory successfully Apr 21 16:01:48.765490 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.765468 2562 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 
16:01:48.765490 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.765487 2562 factory.go:103] Registering Raw factory Apr 21 16:01:48.765561 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.765500 2562 manager.go:1196] Started watching for new ooms in manager Apr 21 16:01:48.765903 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.765890 2562 manager.go:319] Starting recovery of all containers Apr 21 16:01:48.766317 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.766290 2562 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 16:01:48.770132 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.770107 2562 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 16:01:48.770232 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.770199 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 16:01:48.772514 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.772485 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9c4vs" Apr 21 16:01:48.775689 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.775674 2562 manager.go:324] Recovery completed Apr 21 16:01:48.780862 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.780850 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 
21 16:01:48.782240 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.782225 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9c4vs" Apr 21 16:01:48.783345 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.783331 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientMemory" Apr 21 16:01:48.783423 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.783360 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 16:01:48.783423 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.783374 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientPID" Apr 21 16:01:48.783976 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.783918 2562 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 16:01:48.783976 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.783928 2562 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 16:01:48.783976 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.783957 2562 state_mem.go:36] "Initialized new in-memory state store" Apr 21 16:01:48.785490 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.785418 2562 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-158.ec2.internal.18a86aa4bd741a88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-158.ec2.internal,UID:ip-10-0-142-158.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-158.ec2.internal status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-158.ec2.internal,},FirstTimestamp:2026-04-21 16:01:48.783344264 +0000 UTC m=+0.528282065,LastTimestamp:2026-04-21 16:01:48.783344264 +0000 UTC m=+0.528282065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-158.ec2.internal,}" Apr 21 16:01:48.785985 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.785972 2562 policy_none.go:49] "None policy: Start" Apr 21 16:01:48.786023 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.786007 2562 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 16:01:48.786023 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.786018 2562 state_mem.go:35] "Initializing new in-memory state store" Apr 21 16:01:48.828368 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.828332 2562 manager.go:341] "Starting Device Plugin manager" Apr 21 16:01:48.837982 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.828407 2562 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 16:01:48.837982 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.828422 2562 server.go:85] "Starting device plugin registration server" Apr 21 16:01:48.837982 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.828692 2562 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 16:01:48.837982 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.828703 2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 16:01:48.837982 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.828821 2562 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 16:01:48.837982 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.828892 2562 plugin_manager.go:116] "The desired_state_of_world populator (plugin 
watcher) starts" Apr 21 16:01:48.837982 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.828901 2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 16:01:48.837982 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.829287 2562 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 16:01:48.837982 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.829317 2562 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-158.ec2.internal\" not found" Apr 21 16:01:48.904598 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.904546 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 16:01:48.905705 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.905690 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 16:01:48.905776 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.905718 2562 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 16:01:48.905776 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.905737 2562 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 16:01:48.905776 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.905744 2562 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 16:01:48.905903 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.905796 2562 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 16:01:48.910277 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.910257 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 16:01:48.929657 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.929641 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 16:01:48.930633 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.930619 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientMemory" Apr 21 16:01:48.930708 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.930644 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 16:01:48.930708 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.930657 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientPID" Apr 21 16:01:48.930708 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.930680 2562 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-158.ec2.internal" Apr 21 16:01:48.939822 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:48.939809 2562 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-158.ec2.internal" Apr 21 16:01:48.939857 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.939829 2562 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-158.ec2.internal\": node \"ip-10-0-142-158.ec2.internal\" not found" Apr 21 
16:01:48.957030 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:48.957013 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found" Apr 21 16:01:49.006257 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.006238 2562 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-158.ec2.internal"] Apr 21 16:01:49.006326 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.006298 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 16:01:49.007042 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.007029 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientMemory" Apr 21 16:01:49.007102 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.007053 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 16:01:49.007102 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.007062 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientPID" Apr 21 16:01:49.008303 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.008291 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 16:01:49.008452 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.008440 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal" Apr 21 16:01:49.008487 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.008471 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 16:01:49.008955 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.008938 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientMemory" Apr 21 16:01:49.009034 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.008971 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 16:01:49.009034 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.008981 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientPID" Apr 21 16:01:49.009034 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.008942 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientMemory" Apr 21 16:01:49.009129 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.009040 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 16:01:49.009129 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.009050 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientPID" Apr 21 16:01:49.010042 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.010025 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-158.ec2.internal" Apr 21 16:01:49.010121 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.010059 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 16:01:49.010659 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.010644 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientMemory" Apr 21 16:01:49.010702 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.010674 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 16:01:49.010702 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.010685 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeHasSufficientPID" Apr 21 16:01:49.037334 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.037315 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-158.ec2.internal\" not found" node="ip-10-0-142-158.ec2.internal" Apr 21 16:01:49.041467 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.041451 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-158.ec2.internal\" not found" node="ip-10-0-142-158.ec2.internal" Apr 21 16:01:49.057184 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.057169 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found" Apr 21 16:01:49.066461 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.066442 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/731e093ceab326bbc076053ea8678ffb-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal\" (UID: \"731e093ceab326bbc076053ea8678ffb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal" Apr 21 16:01:49.066538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.066473 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/731e093ceab326bbc076053ea8678ffb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal\" (UID: \"731e093ceab326bbc076053ea8678ffb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal" Apr 21 16:01:49.066538 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.066509 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ee115be6bbf3231206ae6c74733c2779-config\") pod \"kube-apiserver-proxy-ip-10-0-142-158.ec2.internal\" (UID: \"ee115be6bbf3231206ae6c74733c2779\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-158.ec2.internal" Apr 21 16:01:49.158272 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.158234 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found" Apr 21 16:01:49.167266 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.167247 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/731e093ceab326bbc076053ea8678ffb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal\" (UID: \"731e093ceab326bbc076053ea8678ffb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal" Apr 21 16:01:49.167328 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.167271 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/731e093ceab326bbc076053ea8678ffb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal\" (UID: \"731e093ceab326bbc076053ea8678ffb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal"
Apr 21 16:01:49.167328 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.167289 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ee115be6bbf3231206ae6c74733c2779-config\") pod \"kube-apiserver-proxy-ip-10-0-142-158.ec2.internal\" (UID: \"ee115be6bbf3231206ae6c74733c2779\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-158.ec2.internal"
Apr 21 16:01:49.167393 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.167328 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ee115be6bbf3231206ae6c74733c2779-config\") pod \"kube-apiserver-proxy-ip-10-0-142-158.ec2.internal\" (UID: \"ee115be6bbf3231206ae6c74733c2779\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-158.ec2.internal"
Apr 21 16:01:49.167393 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.167353 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/731e093ceab326bbc076053ea8678ffb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal\" (UID: \"731e093ceab326bbc076053ea8678ffb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal"
Apr 21 16:01:49.167393 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.167337 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/731e093ceab326bbc076053ea8678ffb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal\" (UID: \"731e093ceab326bbc076053ea8678ffb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal"
Apr 21 16:01:49.258641 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.258620 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found"
Apr 21 16:01:49.339153 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.339137 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal"
Apr 21 16:01:49.344602 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.344585 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-158.ec2.internal"
Apr 21 16:01:49.359480 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.359458 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found"
Apr 21 16:01:49.460018 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.459929 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found"
Apr 21 16:01:49.560441 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.560402 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found"
Apr 21 16:01:49.659013 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.658992 2562 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 16:01:49.659597 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.659126 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 16:01:49.661153 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.661132 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found"
Apr 21 16:01:49.761925 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.761903 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found"
Apr 21 16:01:49.763423 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.763404 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 16:01:49.784946 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.784912 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:56:48 +0000 UTC" deadline="2027-10-18 01:21:55.033020047 +0000 UTC"
Apr 21 16:01:49.784946 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.784945 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13065h20m5.248078998s"
Apr 21 16:01:49.785965 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.785946 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 16:01:49.819833 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.819811 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-v97sk"
Apr 21 16:01:49.827825 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.827807 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-v97sk"
Apr 21 16:01:49.852709 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:49.852685 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod731e093ceab326bbc076053ea8678ffb.slice/crio-b333e4ac6d9ec813377182b74244bf87f99c25e6279d15ce1e2661f9070fb0e7 WatchSource:0}: Error finding container b333e4ac6d9ec813377182b74244bf87f99c25e6279d15ce1e2661f9070fb0e7: Status 404 returned error can't find the container with id b333e4ac6d9ec813377182b74244bf87f99c25e6279d15ce1e2661f9070fb0e7
Apr 21 16:01:49.853177 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:49.853155 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee115be6bbf3231206ae6c74733c2779.slice/crio-821aa4011fc9370f61b240bcd8b1828ed9a7b9b5cfcc79624d65964e22e4c9a1 WatchSource:0}: Error finding container 821aa4011fc9370f61b240bcd8b1828ed9a7b9b5cfcc79624d65964e22e4c9a1: Status 404 returned error can't find the container with id 821aa4011fc9370f61b240bcd8b1828ed9a7b9b5cfcc79624d65964e22e4c9a1
Apr 21 16:01:49.860754 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.860737 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 16:01:49.862307 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.862287 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found"
Apr 21 16:01:49.908124 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.908079 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal" event={"ID":"731e093ceab326bbc076053ea8678ffb","Type":"ContainerStarted","Data":"b333e4ac6d9ec813377182b74244bf87f99c25e6279d15ce1e2661f9070fb0e7"}
Apr 21 16:01:49.908904 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.908880 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-158.ec2.internal" event={"ID":"ee115be6bbf3231206ae6c74733c2779","Type":"ContainerStarted","Data":"821aa4011fc9370f61b240bcd8b1828ed9a7b9b5cfcc79624d65964e22e4c9a1"}
Apr 21 16:01:49.947246 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.947233 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 16:01:49.960277 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:49.960262 2562 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 16:01:49.963325 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:49.963309 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found"
Apr 21 16:01:50.063834 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:50.063770 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found"
Apr 21 16:01:50.164240 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:50.164214 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-158.ec2.internal\" not found"
Apr 21 16:01:50.168950 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.168935 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 16:01:50.264728 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.264697 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal"
Apr 21 16:01:50.275680 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.275655 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 16:01:50.276892 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.276867 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-158.ec2.internal"
Apr 21 16:01:50.285679 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.285655 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 16:01:50.605926 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.605895 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 16:01:50.740581 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.740544 2562 apiserver.go:52] "Watching apiserver"
Apr 21 16:01:50.746638 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.746617 2562 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 16:01:50.747077 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.747048 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-chrww","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal","openshift-multus/multus-additional-cni-plugins-bskpq","openshift-network-diagnostics/network-check-target-nnb5j","kube-system/konnectivity-agent-7n47b","openshift-multus/multus-fm4kz","openshift-multus/network-metrics-daemon-rg8v9","openshift-network-operator/iptables-alerter-qsmcw","openshift-ovn-kubernetes/ovnkube-node-np85v","kube-system/kube-apiserver-proxy-ip-10-0-142-158.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8","openshift-cluster-node-tuning-operator/tuned-c5kmx","openshift-dns/node-resolver-4dmnb"]
Apr 21 16:01:50.748588 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.748566 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fm4kz"
Apr 21 16:01:50.751804 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.751769 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4dmnb"
Apr 21 16:01:50.751943 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.751899 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bskpq"
Apr 21 16:01:50.753240 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.753219 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-c5kmx"
Apr 21 16:01:50.753808 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.753698 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 16:01:50.753808 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.753754 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 16:01:50.754034 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.753908 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 16:01:50.754034 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.753955 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-txncd\""
Apr 21 16:01:50.754657 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.754641 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:01:50.754778 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:50.754720 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:01:50.756003 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.755983 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:01:50.756092 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:50.756046 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:01:50.757347 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.757316 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7n47b"
Apr 21 16:01:50.758232 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.757971 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 16:01:50.758423 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.758403 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 16:01:50.758495 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.758438 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 16:01:50.758723 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.758600 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 16:01:50.758723 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.758653 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 16:01:50.758723 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.758659 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 16:01:50.758933 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.758913 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 16:01:50.759279 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.758985 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zpz2n\""
Apr 21 16:01:50.759279 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.759048 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-sknb7\""
Apr 21 16:01:50.759279 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.759191 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-x5zds\""
Apr 21 16:01:50.759279 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.759268 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qsmcw"
Apr 21 16:01:50.759732 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.759712 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 16:01:50.759843 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.759712 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8t4mh\""
Apr 21 16:01:50.760056 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.760038 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 16:01:50.760963 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.760918 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.761440 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.761390 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 16:01:50.761847 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.761829 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lfvlh\""
Apr 21 16:01:50.761947 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.761877 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 16:01:50.762236 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.762217 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8"
Apr 21 16:01:50.762236 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.762229 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 16:01:50.764167 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.764149 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 16:01:50.764653 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.764633 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 16:01:50.764754 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.764643 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2zw7l\""
Apr 21 16:01:50.764831 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.764768 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 16:01:50.764899 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.764878 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 16:01:50.764983 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.764903 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 16:01:50.764983 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.764979 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 16:01:50.765089 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.765030 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 16:01:50.765140 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.765097 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 16:01:50.765140 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.765129 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 16:01:50.765570 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.765334 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fn8md\""
Apr 21 16:01:50.765650 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.765428 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-chrww"
Apr 21 16:01:50.768212 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.768180 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 16:01:50.768212 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.768183 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 16:01:50.768406 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.768199 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 16:01:50.768406 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.768331 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-grrhp\""
Apr 21 16:01:50.775350 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775333 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-log-socket\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.775440 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775356 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-lib-modules\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx"
Apr 21 16:01:50.775440 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775373 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8"
Apr 21 16:01:50.775440 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775388 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4ndd\" (UniqueName: \"kubernetes.io/projected/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-kube-api-access-g4ndd\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz"
Apr 21 16:01:50.775440 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775409 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36e9aa61-0f27-4d2c-abce-685977a97e00-ovnkube-config\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.775440 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775432 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36e9aa61-0f27-4d2c-abce-685977a97e00-env-overrides\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.775724 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775477 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b1640f00-dda4-4761-acce-37205e686361-host-slash\") pod \"iptables-alerter-qsmcw\" (UID: \"b1640f00-dda4-4761-acce-37205e686361\") " pod="openshift-network-operator/iptables-alerter-qsmcw"
Apr 21 16:01:50.775724 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775518 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-kubelet\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.775724 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775566 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-run-ovn-kubernetes\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.775724 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775593 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/265548b5-1968-424e-850b-1b95c8e7798f-cni-binary-copy\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq"
Apr 21 16:01:50.775724 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775617 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq"
Apr 21 16:01:50.775724 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775638 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-etc-selinux\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8"
Apr 21 16:01:50.775724 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775661 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-run-ovn\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.775724 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775684 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-cni-netd\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.775724 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775710 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcdpm\" (UniqueName: \"kubernetes.io/projected/36e9aa61-0f27-4d2c-abce-685977a97e00-kube-api-access-mcdpm\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775743 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-sysconfig\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775773 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb6dd680-d8be-4220-b690-a82c23fa355f-tmp\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775814 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c52kt\" (UniqueName: \"kubernetes.io/projected/a1be778e-85bd-43d3-912c-0356362a7e8a-kube-api-access-c52kt\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775836 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-run-netns\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775861 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-systemd-units\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775881 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/511124f1-f198-4d6c-9713-d6f1375957e5-host\") pod \"node-ca-chrww\" (UID: \"511124f1-f198-4d6c-9713-d6f1375957e5\") " pod="openshift-image-registry/node-ca-chrww"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775907 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-hostroot\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775935 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-conf-dir\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775951 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36e9aa61-0f27-4d2c-abce-685977a97e00-ovn-node-metrics-cert\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775972 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-device-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.775993 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-var-lib-kubelet\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776024 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-etc-kubernetes\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz"
Apr 21 16:01:50.776058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776047 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36e9aa61-0f27-4d2c-abce-685977a97e00-ovnkube-script-lib\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776070 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-slash\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776091 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-kubernetes\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776113 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-host\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776138 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-socket-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776163 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w89pd\" (UniqueName: \"kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd\") pod \"network-check-target-nnb5j\" (UID: \"07b526f2-af47-4107-850b-0185ac8ac28c\") " pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776189 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a844703d-9a8a-4877-a840-e850e06f82b0-agent-certs\") pod \"konnectivity-agent-7n47b\" (UID: \"a844703d-9a8a-4877-a840-e850e06f82b0\") " pod="kube-system/konnectivity-agent-7n47b"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776212 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-system-cni-dir\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776253 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-os-release\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776309 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1a05e8cf-847c-48cc-802b-171bcb5dea76-hosts-file\") pod \"node-resolver-4dmnb\" (UID: \"1a05e8cf-847c-48cc-802b-171bcb5dea76\") " pod="openshift-dns/node-resolver-4dmnb"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776374 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-run-netns\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776408 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-etc-openvswitch\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776439 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-socket-dir-parent\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776462 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4b48\" (UniqueName: \"kubernetes.io/projected/1a05e8cf-847c-48cc-802b-171bcb5dea76-kube-api-access-k4b48\") pod \"node-resolver-4dmnb\" (UID: \"1a05e8cf-847c-48cc-802b-171bcb5dea76\") " pod="openshift-dns/node-resolver-4dmnb"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776482 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-system-cni-dir\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq"
Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776497 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName:
\"kubernetes.io/configmap/265548b5-1968-424e-850b-1b95c8e7798f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.776603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776520 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvzkc\" (UniqueName: \"kubernetes.io/projected/eb6dd680-d8be-4220-b690-a82c23fa355f-kube-api-access-qvzkc\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776541 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-cni-dir\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776571 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-cni-binary-copy\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776586 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-daemon-config\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776607 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-run-multus-certs\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776630 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776649 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2nqr\" (UniqueName: \"kubernetes.io/projected/e022d7cd-e433-4f58-8b33-7c830d23f95c-kube-api-access-g2nqr\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776666 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b1640f00-dda4-4761-acce-37205e686361-iptables-alerter-script\") pod \"iptables-alerter-qsmcw\" (UID: \"b1640f00-dda4-4761-acce-37205e686361\") " pod="openshift-network-operator/iptables-alerter-qsmcw" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776702 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-run-openvswitch\") 
pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776741 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-os-release\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776764 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-run\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776800 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-registration-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776822 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-run-k8s-cni-cncf-io\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776845 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-var-lib-openvswitch\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776863 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/511124f1-f198-4d6c-9713-d6f1375957e5-serviceca\") pod \"node-ca-chrww\" (UID: \"511124f1-f198-4d6c-9713-d6f1375957e5\") " pod="openshift-image-registry/node-ca-chrww" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776905 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-sys-fs\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.777470 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776930 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-var-lib-cni-bin\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776962 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-run-systemd\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 
16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776979 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq2wl\" (UniqueName: \"kubernetes.io/projected/b1640f00-dda4-4761-acce-37205e686361-kube-api-access-kq2wl\") pod \"iptables-alerter-qsmcw\" (UID: \"b1640f00-dda4-4761-acce-37205e686361\") " pod="openshift-network-operator/iptables-alerter-qsmcw" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.776993 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-cni-bin\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777006 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-systemd\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777045 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-sys\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777073 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" 
(UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777098 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xkpr\" (UniqueName: \"kubernetes.io/projected/511124f1-f198-4d6c-9713-d6f1375957e5-kube-api-access-5xkpr\") pod \"node-ca-chrww\" (UID: \"511124f1-f198-4d6c-9713-d6f1375957e5\") " pod="openshift-image-registry/node-ca-chrww" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777125 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mtsm\" (UniqueName: \"kubernetes.io/projected/265548b5-1968-424e-850b-1b95c8e7798f-kube-api-access-2mtsm\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777148 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1a05e8cf-847c-48cc-802b-171bcb5dea76-tmp-dir\") pod \"node-resolver-4dmnb\" (UID: \"1a05e8cf-847c-48cc-802b-171bcb5dea76\") " pod="openshift-dns/node-resolver-4dmnb" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777168 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-node-log\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777191 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/265548b5-1968-424e-850b-1b95c8e7798f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777215 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-sysctl-conf\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777232 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-modprobe-d\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777265 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-sysctl-d\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.778021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777285 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-tuned\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.778021 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:01:50.777311 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a844703d-9a8a-4877-a840-e850e06f82b0-konnectivity-ca\") pod \"konnectivity-agent-7n47b\" (UID: \"a844703d-9a8a-4877-a840-e850e06f82b0\") " pod="kube-system/konnectivity-agent-7n47b" Apr 21 16:01:50.778727 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777337 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-var-lib-cni-multus\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.778727 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777376 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-cnibin\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.778727 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777402 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-var-lib-kubelet\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.778727 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.777422 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-cnibin\") pod \"multus-fm4kz\" (UID: 
\"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.828650 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.828617 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:56:49 +0000 UTC" deadline="2027-11-14 14:55:29.713595372 +0000 UTC" Apr 21 16:01:50.828650 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.828647 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13726h53m38.884951115s" Apr 21 16:01:50.865927 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.865864 2562 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 16:01:50.878441 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878416 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-node-log\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.878557 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878451 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/265548b5-1968-424e-850b-1b95c8e7798f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.878557 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878469 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-sysctl-conf\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.878557 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878495 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-modprobe-d\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.878557 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878509 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-sysctl-d\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.878557 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878553 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-tuned\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878570 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a844703d-9a8a-4877-a840-e850e06f82b0-konnectivity-ca\") pod \"konnectivity-agent-7n47b\" (UID: \"a844703d-9a8a-4877-a840-e850e06f82b0\") " pod="kube-system/konnectivity-agent-7n47b" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878594 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-var-lib-cni-multus\") pod \"multus-fm4kz\" (UID: 
\"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878610 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-sysctl-conf\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878611 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-node-log\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878617 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-cnibin\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878683 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-var-lib-kubelet\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878706 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-cnibin\") pod \"multus-fm4kz\" (UID: 
\"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878704 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-modprobe-d\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878740 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-var-lib-cni-multus\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878753 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-cnibin\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878761 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-sysctl-d\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.878798 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878794 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-log-socket\") pod \"ovnkube-node-np85v\" (UID: 
\"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878815 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-cnibin\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878822 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-lib-modules\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878844 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-var-lib-kubelet\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878852 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-log-socket\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878846 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878893 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878909 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4ndd\" (UniqueName: \"kubernetes.io/projected/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-kube-api-access-g4ndd\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878925 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-lib-modules\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878938 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36e9aa61-0f27-4d2c-abce-685977a97e00-ovnkube-config\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878963 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36e9aa61-0f27-4d2c-abce-685977a97e00-env-overrides\") pod \"ovnkube-node-np85v\" (UID: 
\"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.878987 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b1640f00-dda4-4761-acce-37205e686361-host-slash\") pod \"iptables-alerter-qsmcw\" (UID: \"b1640f00-dda4-4761-acce-37205e686361\") " pod="openshift-network-operator/iptables-alerter-qsmcw" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879010 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-kubelet\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879038 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-run-ovn-kubernetes\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879062 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/265548b5-1968-424e-850b-1b95c8e7798f-cni-binary-copy\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879043 2562 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879090 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.879355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879115 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-etc-selinux\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879123 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/265548b5-1968-424e-850b-1b95c8e7798f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879140 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-run-ovn\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879167 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-cni-netd\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879172 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a844703d-9a8a-4877-a840-e850e06f82b0-konnectivity-ca\") pod \"konnectivity-agent-7n47b\" (UID: \"a844703d-9a8a-4877-a840-e850e06f82b0\") " pod="kube-system/konnectivity-agent-7n47b" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879191 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcdpm\" (UniqueName: \"kubernetes.io/projected/36e9aa61-0f27-4d2c-abce-685977a97e00-kube-api-access-mcdpm\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879210 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-run-ovn-kubernetes\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879216 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-sysconfig\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879227 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/b1640f00-dda4-4761-acce-37205e686361-host-slash\") pod \"iptables-alerter-qsmcw\" (UID: \"b1640f00-dda4-4761-acce-37205e686361\") " pod="openshift-network-operator/iptables-alerter-qsmcw" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879238 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-kubelet\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879241 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb6dd680-d8be-4220-b690-a82c23fa355f-tmp\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879246 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-etc-selinux\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879268 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-cni-netd\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879295 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-run-ovn\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879563 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36e9aa61-0f27-4d2c-abce-685977a97e00-ovnkube-config\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879604 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879629 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c52kt\" (UniqueName: \"kubernetes.io/projected/a1be778e-85bd-43d3-912c-0356362a7e8a-kube-api-access-c52kt\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.880128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879659 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36e9aa61-0f27-4d2c-abce-685977a97e00-env-overrides\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879664 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-run-netns\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879633 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-sysconfig\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879690 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-systemd-units\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879715 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/511124f1-f198-4d6c-9713-d6f1375957e5-host\") pod \"node-ca-chrww\" (UID: \"511124f1-f198-4d6c-9713-d6f1375957e5\") " pod="openshift-image-registry/node-ca-chrww" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879703 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/265548b5-1968-424e-850b-1b95c8e7798f-cni-binary-copy\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879726 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-run-netns\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879729 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-systemd-units\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879761 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/511124f1-f198-4d6c-9713-d6f1375957e5-host\") pod \"node-ca-chrww\" (UID: \"511124f1-f198-4d6c-9713-d6f1375957e5\") " pod="openshift-image-registry/node-ca-chrww" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879741 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-hostroot\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879817 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-hostroot\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879819 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-conf-dir\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879847 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36e9aa61-0f27-4d2c-abce-685977a97e00-ovn-node-metrics-cert\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879874 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-device-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879880 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-conf-dir\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879898 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-var-lib-kubelet\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879935 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-etc-kubernetes\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879938 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-device-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.880905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879959 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36e9aa61-0f27-4d2c-abce-685977a97e00-ovnkube-script-lib\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879981 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-var-lib-kubelet\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.879984 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-slash\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880015 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-etc-kubernetes\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880024 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-kubernetes\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880019 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-slash\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880055 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-host\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880084 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-socket-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880111 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w89pd\" (UniqueName: 
\"kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd\") pod \"network-check-target-nnb5j\" (UID: \"07b526f2-af47-4107-850b-0185ac8ac28c\") " pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880136 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a844703d-9a8a-4877-a840-e850e06f82b0-agent-certs\") pod \"konnectivity-agent-7n47b\" (UID: \"a844703d-9a8a-4877-a840-e850e06f82b0\") " pod="kube-system/konnectivity-agent-7n47b" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880143 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-host\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880158 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-system-cni-dir\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880184 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-os-release\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880214 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/1a05e8cf-847c-48cc-802b-171bcb5dea76-hosts-file\") pod \"node-resolver-4dmnb\" (UID: \"1a05e8cf-847c-48cc-802b-171bcb5dea76\") " pod="openshift-dns/node-resolver-4dmnb" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880262 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-run-netns\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880269 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-socket-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880308 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-etc-openvswitch\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880334 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-socket-dir-parent\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.881687 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880359 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/1a05e8cf-847c-48cc-802b-171bcb5dea76-hosts-file\") pod \"node-resolver-4dmnb\" (UID: \"1a05e8cf-847c-48cc-802b-171bcb5dea76\") " pod="openshift-dns/node-resolver-4dmnb" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880362 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4b48\" (UniqueName: \"kubernetes.io/projected/1a05e8cf-847c-48cc-802b-171bcb5dea76-kube-api-access-k4b48\") pod \"node-resolver-4dmnb\" (UID: \"1a05e8cf-847c-48cc-802b-171bcb5dea76\") " pod="openshift-dns/node-resolver-4dmnb" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880396 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-etc-openvswitch\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880400 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-system-cni-dir\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880429 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/265548b5-1968-424e-850b-1b95c8e7798f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880444 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-run-netns\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880459 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvzkc\" (UniqueName: \"kubernetes.io/projected/eb6dd680-d8be-4220-b690-a82c23fa355f-kube-api-access-qvzkc\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880487 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-cni-dir\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880496 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-system-cni-dir\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880513 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-cni-binary-copy\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880539 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-daemon-config\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880565 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-run-multus-certs\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880592 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880618 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2nqr\" (UniqueName: \"kubernetes.io/projected/e022d7cd-e433-4f58-8b33-7c830d23f95c-kube-api-access-g2nqr\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880631 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-socket-dir-parent\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.882517 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:01:50.880643 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b1640f00-dda4-4761-acce-37205e686361-iptables-alerter-script\") pod \"iptables-alerter-qsmcw\" (UID: \"b1640f00-dda4-4761-acce-37205e686361\") " pod="openshift-network-operator/iptables-alerter-qsmcw" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880333 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-os-release\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.882517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880361 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-kubernetes\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880670 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-run-openvswitch\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880725 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-os-release\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 
16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880750 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-run\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880773 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-registration-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880817 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-run-k8s-cni-cncf-io\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880843 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-var-lib-openvswitch\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880850 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-np85v\" (UID: 
\"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880866 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/511124f1-f198-4d6c-9713-d6f1375957e5-serviceca\") pod \"node-ca-chrww\" (UID: \"511124f1-f198-4d6c-9713-d6f1375957e5\") " pod="openshift-image-registry/node-ca-chrww" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880902 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-sys-fs\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880928 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-var-lib-cni-bin\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880955 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-run-systemd\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.880998 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq2wl\" (UniqueName: \"kubernetes.io/projected/b1640f00-dda4-4761-acce-37205e686361-kube-api-access-kq2wl\") pod 
\"iptables-alerter-qsmcw\" (UID: \"b1640f00-dda4-4761-acce-37205e686361\") " pod="openshift-network-operator/iptables-alerter-qsmcw" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881022 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-cni-bin\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881046 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-systemd\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881071 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-sys\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881096 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:01:50.883400 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881114 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36e9aa61-0f27-4d2c-abce-685977a97e00-ovnkube-script-lib\") pod 
\"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881134 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xkpr\" (UniqueName: \"kubernetes.io/projected/511124f1-f198-4d6c-9713-d6f1375957e5-kube-api-access-5xkpr\") pod \"node-ca-chrww\" (UID: \"511124f1-f198-4d6c-9713-d6f1375957e5\") " pod="openshift-image-registry/node-ca-chrww" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881163 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mtsm\" (UniqueName: \"kubernetes.io/projected/265548b5-1968-424e-850b-1b95c8e7798f-kube-api-access-2mtsm\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881186 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-run\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881200 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1a05e8cf-847c-48cc-802b-171bcb5dea76-tmp-dir\") pod \"node-resolver-4dmnb\" (UID: \"1a05e8cf-847c-48cc-802b-171bcb5dea76\") " pod="openshift-dns/node-resolver-4dmnb" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881226 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-run-openvswitch\") pod 
\"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881276 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/511124f1-f198-4d6c-9713-d6f1375957e5-serviceca\") pod \"node-ca-chrww\" (UID: \"511124f1-f198-4d6c-9713-d6f1375957e5\") " pod="openshift-image-registry/node-ca-chrww" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881348 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-var-lib-cni-bin\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881395 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-run-multus-certs\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881523 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-host-run-k8s-cni-cncf-io\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881556 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-var-lib-openvswitch\") pod \"ovnkube-node-np85v\" (UID: 
\"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881594 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-sys\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881599 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-systemd\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881656 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-cni-dir\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881694 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b1640f00-dda4-4761-acce-37205e686361-iptables-alerter-script\") pod \"iptables-alerter-qsmcw\" (UID: \"b1640f00-dda4-4761-acce-37205e686361\") " pod="openshift-network-operator/iptables-alerter-qsmcw" Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:50.881706 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:50.881762 2562 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs podName:e022d7cd-e433-4f58-8b33-7c830d23f95c nodeName:}" failed. No retries permitted until 2026-04-21 16:01:51.381743378 +0000 UTC m=+3.126681180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs") pod "network-metrics-daemon-rg8v9" (UID: "e022d7cd-e433-4f58-8b33-7c830d23f95c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:50.884228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881885 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-system-cni-dir\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.881893 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/265548b5-1968-424e-850b-1b95c8e7798f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.882043 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-run-systemd\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.882060 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-multus-daemon-config\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.882103 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-sys-fs\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.882345 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a1be778e-85bd-43d3-912c-0356362a7e8a-registration-dir\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.882350 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1a05e8cf-847c-48cc-802b-171bcb5dea76-tmp-dir\") pod \"node-resolver-4dmnb\" (UID: \"1a05e8cf-847c-48cc-802b-171bcb5dea76\") " pod="openshift-dns/node-resolver-4dmnb" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.882395 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb6dd680-d8be-4220-b690-a82c23fa355f-tmp\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.882411 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/265548b5-1968-424e-850b-1b95c8e7798f-os-release\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.882395 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-cni-binary-copy\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.882457 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36e9aa61-0f27-4d2c-abce-685977a97e00-host-cni-bin\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.883738 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/eb6dd680-d8be-4220-b690-a82c23fa355f-etc-tuned\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.883817 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a844703d-9a8a-4877-a840-e850e06f82b0-agent-certs\") pod \"konnectivity-agent-7n47b\" (UID: \"a844703d-9a8a-4877-a840-e850e06f82b0\") " pod="kube-system/konnectivity-agent-7n47b" Apr 21 16:01:50.885031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.884355 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/36e9aa61-0f27-4d2c-abce-685977a97e00-ovn-node-metrics-cert\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.892991 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:50.892970 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:01:50.893178 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:50.892997 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:01:50.893178 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:50.893012 2562 projected.go:194] Error preparing data for projected volume kube-api-access-w89pd for pod openshift-network-diagnostics/network-check-target-nnb5j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:50.893178 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:50.893075 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd podName:07b526f2-af47-4107-850b-0185ac8ac28c nodeName:}" failed. No retries permitted until 2026-04-21 16:01:51.393055924 +0000 UTC m=+3.137993730 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w89pd" (UniqueName: "kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd") pod "network-check-target-nnb5j" (UID: "07b526f2-af47-4107-850b-0185ac8ac28c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:50.893968 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.893938 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c52kt\" (UniqueName: \"kubernetes.io/projected/a1be778e-85bd-43d3-912c-0356362a7e8a-kube-api-access-c52kt\") pod \"aws-ebs-csi-driver-node-rprc8\" (UID: \"a1be778e-85bd-43d3-912c-0356362a7e8a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:50.895393 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.894861 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvzkc\" (UniqueName: \"kubernetes.io/projected/eb6dd680-d8be-4220-b690-a82c23fa355f-kube-api-access-qvzkc\") pod \"tuned-c5kmx\" (UID: \"eb6dd680-d8be-4220-b690-a82c23fa355f\") " pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:50.895393 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.895328 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq2wl\" (UniqueName: \"kubernetes.io/projected/b1640f00-dda4-4761-acce-37205e686361-kube-api-access-kq2wl\") pod \"iptables-alerter-qsmcw\" (UID: \"b1640f00-dda4-4761-acce-37205e686361\") " pod="openshift-network-operator/iptables-alerter-qsmcw" Apr 21 16:01:50.895622 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.895601 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4b48\" (UniqueName: \"kubernetes.io/projected/1a05e8cf-847c-48cc-802b-171bcb5dea76-kube-api-access-k4b48\") pod \"node-resolver-4dmnb\" (UID: 
\"1a05e8cf-847c-48cc-802b-171bcb5dea76\") " pod="openshift-dns/node-resolver-4dmnb" Apr 21 16:01:50.896657 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.896346 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcdpm\" (UniqueName: \"kubernetes.io/projected/36e9aa61-0f27-4d2c-abce-685977a97e00-kube-api-access-mcdpm\") pod \"ovnkube-node-np85v\" (UID: \"36e9aa61-0f27-4d2c-abce-685977a97e00\") " pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:50.896657 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.896616 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xkpr\" (UniqueName: \"kubernetes.io/projected/511124f1-f198-4d6c-9713-d6f1375957e5-kube-api-access-5xkpr\") pod \"node-ca-chrww\" (UID: \"511124f1-f198-4d6c-9713-d6f1375957e5\") " pod="openshift-image-registry/node-ca-chrww" Apr 21 16:01:50.897241 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.897223 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mtsm\" (UniqueName: \"kubernetes.io/projected/265548b5-1968-424e-850b-1b95c8e7798f-kube-api-access-2mtsm\") pod \"multus-additional-cni-plugins-bskpq\" (UID: \"265548b5-1968-424e-850b-1b95c8e7798f\") " pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:50.897645 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.897614 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2nqr\" (UniqueName: \"kubernetes.io/projected/e022d7cd-e433-4f58-8b33-7c830d23f95c-kube-api-access-g2nqr\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:01:50.898114 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:50.898084 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4ndd\" (UniqueName: 
\"kubernetes.io/projected/c8f9fcd0-5378-4da1-a89a-2ffad35fe389-kube-api-access-g4ndd\") pod \"multus-fm4kz\" (UID: \"c8f9fcd0-5378-4da1-a89a-2ffad35fe389\") " pod="openshift-multus/multus-fm4kz" Apr 21 16:01:51.062649 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.062608 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fm4kz" Apr 21 16:01:51.069583 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.069557 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4dmnb" Apr 21 16:01:51.078166 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.078147 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bskpq" Apr 21 16:01:51.083858 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.083831 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" Apr 21 16:01:51.090419 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.090399 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7n47b" Apr 21 16:01:51.096926 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.096910 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qsmcw" Apr 21 16:01:51.104484 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.104465 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:01:51.109059 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.109038 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" Apr 21 16:01:51.114579 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.114562 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-chrww" Apr 21 16:01:51.384433 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.384404 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:01:51.384618 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:51.384567 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:51.384691 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:51.384634 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs podName:e022d7cd-e433-4f58-8b33-7c830d23f95c nodeName:}" failed. No retries permitted until 2026-04-21 16:01:52.384615089 +0000 UTC m=+4.129552878 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs") pod "network-metrics-daemon-rg8v9" (UID: "e022d7cd-e433-4f58-8b33-7c830d23f95c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:51.485524 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.485499 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w89pd\" (UniqueName: \"kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd\") pod \"network-check-target-nnb5j\" (UID: \"07b526f2-af47-4107-850b-0185ac8ac28c\") " pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:01:51.485632 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:51.485619 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:01:51.485685 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:51.485636 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:01:51.485685 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:51.485644 2562 projected.go:194] Error preparing data for projected volume kube-api-access-w89pd for pod openshift-network-diagnostics/network-check-target-nnb5j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:51.485766 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:51.485687 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd podName:07b526f2-af47-4107-850b-0185ac8ac28c nodeName:}" failed. 
No retries permitted until 2026-04-21 16:01:52.485675181 +0000 UTC m=+4.230612964 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-w89pd" (UniqueName: "kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd") pod "network-check-target-nnb5j" (UID: "07b526f2-af47-4107-850b-0185ac8ac28c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:51.493686 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:51.493660 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8f9fcd0_5378_4da1_a89a_2ffad35fe389.slice/crio-c3d148b3f8a4047b6ab3695885f93945b6fc2d52477edb9b653e41b3907e4858 WatchSource:0}: Error finding container c3d148b3f8a4047b6ab3695885f93945b6fc2d52477edb9b653e41b3907e4858: Status 404 returned error can't find the container with id c3d148b3f8a4047b6ab3695885f93945b6fc2d52477edb9b653e41b3907e4858 Apr 21 16:01:51.495238 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:51.495175 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1640f00_dda4_4761_acce_37205e686361.slice/crio-9fa53e33371875a98fd7cb78423259da3b18f527c046c9ab2f4de8684da734ef WatchSource:0}: Error finding container 9fa53e33371875a98fd7cb78423259da3b18f527c046c9ab2f4de8684da734ef: Status 404 returned error can't find the container with id 9fa53e33371875a98fd7cb78423259da3b18f527c046c9ab2f4de8684da734ef Apr 21 16:01:51.499933 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:51.499913 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb6dd680_d8be_4220_b690_a82c23fa355f.slice/crio-2ef74bfd38b58d9c363bef927c20041b9ef9d7ac5058f7a89ef4b1eff84209a4 WatchSource:0}: Error finding container 
2ef74bfd38b58d9c363bef927c20041b9ef9d7ac5058f7a89ef4b1eff84209a4: Status 404 returned error can't find the container with id 2ef74bfd38b58d9c363bef927c20041b9ef9d7ac5058f7a89ef4b1eff84209a4 Apr 21 16:01:51.500984 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:51.500964 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda844703d_9a8a_4877_a840_e850e06f82b0.slice/crio-02c166b89ea12d97b32a5fec7e4a497397f78ae49890d122e750d9e424537bc5 WatchSource:0}: Error finding container 02c166b89ea12d97b32a5fec7e4a497397f78ae49890d122e750d9e424537bc5: Status 404 returned error can't find the container with id 02c166b89ea12d97b32a5fec7e4a497397f78ae49890d122e750d9e424537bc5 Apr 21 16:01:51.501768 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:51.501681 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod511124f1_f198_4d6c_9713_d6f1375957e5.slice/crio-97120231b0568d7e551cc27900a7cbbaebf2fbde029575dac42d62bd0f35bb76 WatchSource:0}: Error finding container 97120231b0568d7e551cc27900a7cbbaebf2fbde029575dac42d62bd0f35bb76: Status 404 returned error can't find the container with id 97120231b0568d7e551cc27900a7cbbaebf2fbde029575dac42d62bd0f35bb76 Apr 21 16:01:51.502932 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:51.502873 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a05e8cf_847c_48cc_802b_171bcb5dea76.slice/crio-8a223443c4594189771bac30d6f3f829d0ac0e355c116106f921d4445357adf0 WatchSource:0}: Error finding container 8a223443c4594189771bac30d6f3f829d0ac0e355c116106f921d4445357adf0: Status 404 returned error can't find the container with id 8a223443c4594189771bac30d6f3f829d0ac0e355c116106f921d4445357adf0 Apr 21 16:01:51.503408 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:51.503376 2562 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod265548b5_1968_424e_850b_1b95c8e7798f.slice/crio-6d8866e36095c3a7c06672bc33de033fcd272ec8fa395a6364b31c5cc380a70e WatchSource:0}: Error finding container 6d8866e36095c3a7c06672bc33de033fcd272ec8fa395a6364b31c5cc380a70e: Status 404 returned error can't find the container with id 6d8866e36095c3a7c06672bc33de033fcd272ec8fa395a6364b31c5cc380a70e Apr 21 16:01:51.504866 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:51.504375 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1be778e_85bd_43d3_912c_0356362a7e8a.slice/crio-5c7c0a0bd7ee8c460665ba3bc2bb2619d197af7cc8d4a0d8d000f0c876730466 WatchSource:0}: Error finding container 5c7c0a0bd7ee8c460665ba3bc2bb2619d197af7cc8d4a0d8d000f0c876730466: Status 404 returned error can't find the container with id 5c7c0a0bd7ee8c460665ba3bc2bb2619d197af7cc8d4a0d8d000f0c876730466 Apr 21 16:01:51.505457 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:01:51.505312 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e9aa61_0f27_4d2c_abce_685977a97e00.slice/crio-5031f12724742a7db2d0a1908af08865c574298c47be17cc3e66af9b76e3d1a2 WatchSource:0}: Error finding container 5031f12724742a7db2d0a1908af08865c574298c47be17cc3e66af9b76e3d1a2: Status 404 returned error can't find the container with id 5031f12724742a7db2d0a1908af08865c574298c47be17cc3e66af9b76e3d1a2 Apr 21 16:01:51.829637 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.829568 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:56:49 +0000 UTC" deadline="2027-12-22 14:10:46.456248812 +0000 UTC" Apr 21 16:01:51.829637 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.829604 2562 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14638h8m54.626648228s"
Apr 21 16:01:51.906711 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.906310 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:01:51.906711 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:51.906419 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:01:51.906711 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.906705 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:01:51.907010 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:51.906850 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:01:51.915768 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.915703 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bskpq" event={"ID":"265548b5-1968-424e-850b-1b95c8e7798f","Type":"ContainerStarted","Data":"6d8866e36095c3a7c06672bc33de033fcd272ec8fa395a6364b31c5cc380a70e"}
Apr 21 16:01:51.923375 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.923342 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7n47b" event={"ID":"a844703d-9a8a-4877-a840-e850e06f82b0","Type":"ContainerStarted","Data":"02c166b89ea12d97b32a5fec7e4a497397f78ae49890d122e750d9e424537bc5"}
Apr 21 16:01:51.928042 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.927995 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" event={"ID":"eb6dd680-d8be-4220-b690-a82c23fa355f","Type":"ContainerStarted","Data":"2ef74bfd38b58d9c363bef927c20041b9ef9d7ac5058f7a89ef4b1eff84209a4"}
Apr 21 16:01:51.932803 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.932742 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qsmcw" event={"ID":"b1640f00-dda4-4761-acce-37205e686361","Type":"ContainerStarted","Data":"9fa53e33371875a98fd7cb78423259da3b18f527c046c9ab2f4de8684da734ef"}
Apr 21 16:01:51.941855 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.941278 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-158.ec2.internal" event={"ID":"ee115be6bbf3231206ae6c74733c2779","Type":"ContainerStarted","Data":"04e0ed85be365674b7bf33bffdbab5c466317d4187b3e6ab64f55dffa288a8fb"}
Apr 21 16:01:51.944157 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.944109 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" event={"ID":"a1be778e-85bd-43d3-912c-0356362a7e8a","Type":"ContainerStarted","Data":"5c7c0a0bd7ee8c460665ba3bc2bb2619d197af7cc8d4a0d8d000f0c876730466"}
Apr 21 16:01:51.946223 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.946140 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4dmnb" event={"ID":"1a05e8cf-847c-48cc-802b-171bcb5dea76","Type":"ContainerStarted","Data":"8a223443c4594189771bac30d6f3f829d0ac0e355c116106f921d4445357adf0"}
Apr 21 16:01:51.950934 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.950909 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-chrww" event={"ID":"511124f1-f198-4d6c-9713-d6f1375957e5","Type":"ContainerStarted","Data":"97120231b0568d7e551cc27900a7cbbaebf2fbde029575dac42d62bd0f35bb76"}
Apr 21 16:01:51.953315 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.953271 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fm4kz" event={"ID":"c8f9fcd0-5378-4da1-a89a-2ffad35fe389","Type":"ContainerStarted","Data":"c3d148b3f8a4047b6ab3695885f93945b6fc2d52477edb9b653e41b3907e4858"}
Apr 21 16:01:51.956048 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.955993 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-158.ec2.internal" podStartSLOduration=1.955978408 podStartE2EDuration="1.955978408s" podCreationTimestamp="2026-04-21 16:01:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:01:51.95585073 +0000 UTC m=+3.700788534" watchObservedRunningTime="2026-04-21 16:01:51.955978408 +0000 UTC m=+3.700916214"
Apr 21 16:01:51.958697 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:51.958669 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" event={"ID":"36e9aa61-0f27-4d2c-abce-685977a97e00","Type":"ContainerStarted","Data":"5031f12724742a7db2d0a1908af08865c574298c47be17cc3e66af9b76e3d1a2"}
Apr 21 16:01:52.394156 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.394119 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:01:52.394313 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:52.394270 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:52.394368 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:52.394328 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs podName:e022d7cd-e433-4f58-8b33-7c830d23f95c nodeName:}" failed. No retries permitted until 2026-04-21 16:01:54.394311295 +0000 UTC m=+6.139249084 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs") pod "network-metrics-daemon-rg8v9" (UID: "e022d7cd-e433-4f58-8b33-7c830d23f95c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:52.495373 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.494704 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w89pd\" (UniqueName: \"kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd\") pod \"network-check-target-nnb5j\" (UID: \"07b526f2-af47-4107-850b-0185ac8ac28c\") " pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:01:52.495373 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:52.494925 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 16:01:52.495373 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:52.494943 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 16:01:52.495373 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:52.494955 2562 projected.go:194] Error preparing data for projected volume kube-api-access-w89pd for pod openshift-network-diagnostics/network-check-target-nnb5j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:52.495373 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:52.495011 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd podName:07b526f2-af47-4107-850b-0185ac8ac28c nodeName:}" failed. No retries permitted until 2026-04-21 16:01:54.494993624 +0000 UTC m=+6.239931412 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-w89pd" (UniqueName: "kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd") pod "network-check-target-nnb5j" (UID: "07b526f2-af47-4107-850b-0185ac8ac28c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:52.497799 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.497222 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wpnfj"]
Apr 21 16:01:52.501849 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.500733 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:52.501849 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:52.500852 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:01:52.595519 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.595279 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:52.595519 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.595347 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/427cd153-57f3-494e-8f29-f4e3e984756d-dbus\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:52.595519 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.595385 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/427cd153-57f3-494e-8f29-f4e3e984756d-kubelet-config\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:52.696706 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.695961 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/427cd153-57f3-494e-8f29-f4e3e984756d-dbus\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:52.696706 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.696021 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/427cd153-57f3-494e-8f29-f4e3e984756d-kubelet-config\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:52.696706 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.696091 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:52.696706 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:52.696228 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:52.696706 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:52.696286 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret podName:427cd153-57f3-494e-8f29-f4e3e984756d nodeName:}" failed. No retries permitted until 2026-04-21 16:01:53.196268615 +0000 UTC m=+4.941206398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret") pod "global-pull-secret-syncer-wpnfj" (UID: "427cd153-57f3-494e-8f29-f4e3e984756d") : object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:52.696706 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.696612 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/427cd153-57f3-494e-8f29-f4e3e984756d-dbus\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:52.696706 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.696671 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/427cd153-57f3-494e-8f29-f4e3e984756d-kubelet-config\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:52.981682 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.981592 2562 generic.go:358] "Generic (PLEG): container finished" podID="731e093ceab326bbc076053ea8678ffb" containerID="decaeaff088e34781207503b4af01e518b935b9c69859aa1a3971907b0244300" exitCode=0
Apr 21 16:01:52.982188 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:52.981687 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal" event={"ID":"731e093ceab326bbc076053ea8678ffb","Type":"ContainerDied","Data":"decaeaff088e34781207503b4af01e518b935b9c69859aa1a3971907b0244300"}
Apr 21 16:01:53.201145 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:53.200984 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:53.201314 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:53.201161 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:53.201314 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:53.201216 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret podName:427cd153-57f3-494e-8f29-f4e3e984756d nodeName:}" failed. No retries permitted until 2026-04-21 16:01:54.201198777 +0000 UTC m=+5.946136566 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret") pod "global-pull-secret-syncer-wpnfj" (UID: "427cd153-57f3-494e-8f29-f4e3e984756d") : object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:53.906583 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:53.906510 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:53.906731 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:53.906640 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:01:53.907083 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:53.907062 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:01:53.907203 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:53.907183 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:01:53.907260 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:53.907233 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:01:53.907314 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:53.907294 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:01:53.987048 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:53.987014 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal" event={"ID":"731e093ceab326bbc076053ea8678ffb","Type":"ContainerStarted","Data":"ae6712b71297607645410c84270976b5c9d7e05ae902630f8fb5ce40942d50fd"}
Apr 21 16:02:54.209357 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:54.209271 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:54.209520 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:54.209422 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:54.209520 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:54.209496 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret podName:427cd153-57f3-494e-8f29-f4e3e984756d nodeName:}" failed. No retries permitted until 2026-04-21 16:01:56.20947035 +0000 UTC m=+7.954408135 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret") pod "global-pull-secret-syncer-wpnfj" (UID: "427cd153-57f3-494e-8f29-f4e3e984756d") : object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:54.411730 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:54.411694 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:01:54.411957 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:54.411869 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:54.411957 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:54.411944 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs podName:e022d7cd-e433-4f58-8b33-7c830d23f95c nodeName:}" failed. No retries permitted until 2026-04-21 16:01:58.411924797 +0000 UTC m=+10.156862589 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs") pod "network-metrics-daemon-rg8v9" (UID: "e022d7cd-e433-4f58-8b33-7c830d23f95c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:54.513144 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:54.513096 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w89pd\" (UniqueName: \"kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd\") pod \"network-check-target-nnb5j\" (UID: \"07b526f2-af47-4107-850b-0185ac8ac28c\") " pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:01:54.513334 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:54.513255 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 16:01:54.513334 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:54.513278 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 16:01:54.513334 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:54.513290 2562 projected.go:194] Error preparing data for projected volume kube-api-access-w89pd for pod openshift-network-diagnostics/network-check-target-nnb5j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:54.513508 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:54.513350 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd podName:07b526f2-af47-4107-850b-0185ac8ac28c nodeName:}" failed. No retries permitted until 2026-04-21 16:01:58.513333088 +0000 UTC m=+10.258270875 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-w89pd" (UniqueName: "kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd") pod "network-check-target-nnb5j" (UID: "07b526f2-af47-4107-850b-0185ac8ac28c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:55.907302 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:55.906810 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:55.907302 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:55.906950 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:01:55.907302 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:55.907031 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:01:55.907302 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:55.907156 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:01:55.907302 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:55.907031 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:01:55.907302 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:55.907276 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:01:56.228484 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:56.228395 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:56.228652 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:56.228559 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:56.228652 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:56.228639 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret podName:427cd153-57f3-494e-8f29-f4e3e984756d nodeName:}" failed. No retries permitted until 2026-04-21 16:02:00.228617881 +0000 UTC m=+11.973555678 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret") pod "global-pull-secret-syncer-wpnfj" (UID: "427cd153-57f3-494e-8f29-f4e3e984756d") : object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:57.906714 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:57.906585 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:01:57.907189 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:57.906718 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:01:57.907189 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:57.906827 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:57.907189 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:57.906932 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:01:57.907189 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:57.907022 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:01:57.907189 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:57.907101 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:01:58.445840 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:58.445799 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:01:58.446050 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:58.445890 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:58.446050 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:58.445977 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs podName:e022d7cd-e433-4f58-8b33-7c830d23f95c nodeName:}" failed. No retries permitted until 2026-04-21 16:02:06.445958201 +0000 UTC m=+18.190895999 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs") pod "network-metrics-daemon-rg8v9" (UID: "e022d7cd-e433-4f58-8b33-7c830d23f95c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:58.546324 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:58.546284 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w89pd\" (UniqueName: \"kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd\") pod \"network-check-target-nnb5j\" (UID: \"07b526f2-af47-4107-850b-0185ac8ac28c\") " pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:01:58.546583 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:58.546560 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 16:01:58.546650 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:58.546592 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 16:01:58.546650 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:58.546608 2562 projected.go:194] Error preparing data for projected volume kube-api-access-w89pd for pod openshift-network-diagnostics/network-check-target-nnb5j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:58.546750 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:58.546671 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd podName:07b526f2-af47-4107-850b-0185ac8ac28c nodeName:}" failed. No retries permitted until 2026-04-21 16:02:06.546650157 +0000 UTC m=+18.291587964 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-w89pd" (UniqueName: "kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd") pod "network-check-target-nnb5j" (UID: "07b526f2-af47-4107-850b-0185ac8ac28c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:59.906254 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:59.906222 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:01:59.906700 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:59.906335 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:01:59.906700 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:59.906346 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:01:59.906700 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:59.906446 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:01:59.906700 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:01:59.906499 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:01:59.906700 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:01:59.906589 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:02:00.259195 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:00.259160 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:00.259363 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:00.259350 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 16:02:00.259440 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:00.259416 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret podName:427cd153-57f3-494e-8f29-f4e3e984756d nodeName:}" failed. No retries permitted until 2026-04-21 16:02:08.259395652 +0000 UTC m=+20.004333447 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret") pod "global-pull-secret-syncer-wpnfj" (UID: "427cd153-57f3-494e-8f29-f4e3e984756d") : object "kube-system"/"original-pull-secret" not registered
Apr 21 16:02:01.906733 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:01.906702 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:02:01.907196 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:01.906702 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:01.907196 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:01.906847 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:02:01.907196 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:01.906703 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:02:01.907196 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:01.906901 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:02:01.907196 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:01.906956 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:02:03.906471 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:03.906391 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:03.906907 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:03.906399 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:02:03.906907 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:03.906509 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:02:03.906907 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:03.906399 2562 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:02:03.906907 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:03.906582 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c" Apr 21 16:02:03.906907 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:03.906682 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c" Apr 21 16:02:05.906634 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:05.906600 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:02:05.906634 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:05.906620 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj" Apr 21 16:02:05.907220 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:05.906740 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d" Apr 21 16:02:05.907220 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:05.906761 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:02:05.907220 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:05.906899 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c" Apr 21 16:02:05.907220 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:05.907007 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c" Apr 21 16:02:06.504511 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:06.504472 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:02:06.507378 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:06.505452 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:02:06.507378 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:06.505541 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs podName:e022d7cd-e433-4f58-8b33-7c830d23f95c nodeName:}" failed. No retries permitted until 2026-04-21 16:02:22.50552019 +0000 UTC m=+34.250457992 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs") pod "network-metrics-daemon-rg8v9" (UID: "e022d7cd-e433-4f58-8b33-7c830d23f95c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:02:06.605021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:06.604988 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w89pd\" (UniqueName: \"kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd\") pod \"network-check-target-nnb5j\" (UID: \"07b526f2-af47-4107-850b-0185ac8ac28c\") " pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:02:06.605209 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:06.605187 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:02:06.605264 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:06.605216 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:02:06.605264 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:06.605232 2562 projected.go:194] Error preparing data for projected volume kube-api-access-w89pd for pod openshift-network-diagnostics/network-check-target-nnb5j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:02:06.605341 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:06.605309 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd podName:07b526f2-af47-4107-850b-0185ac8ac28c nodeName:}" failed. 
No retries permitted until 2026-04-21 16:02:22.605291046 +0000 UTC m=+34.350228842 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-w89pd" (UniqueName: "kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd") pod "network-check-target-nnb5j" (UID: "07b526f2-af47-4107-850b-0185ac8ac28c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:02:07.906838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:07.906807 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:02:07.907216 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:07.906809 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj" Apr 21 16:02:07.907216 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:07.906813 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:02:07.907216 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:07.906976 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d" Apr 21 16:02:07.907216 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:07.907056 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c" Apr 21 16:02:07.907216 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:07.906900 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c" Apr 21 16:02:08.320475 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:08.319774 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj" Apr 21 16:02:08.320475 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:08.319944 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 16:02:08.320475 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:08.320002 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret podName:427cd153-57f3-494e-8f29-f4e3e984756d nodeName:}" failed. No retries permitted until 2026-04-21 16:02:24.31998405 +0000 UTC m=+36.064921854 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret") pod "global-pull-secret-syncer-wpnfj" (UID: "427cd153-57f3-494e-8f29-f4e3e984756d") : object "kube-system"/"original-pull-secret" not registered Apr 21 16:02:09.013429 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.013218 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" event={"ID":"a1be778e-85bd-43d3-912c-0356362a7e8a","Type":"ContainerStarted","Data":"2ff7264091deef4d8be604e2946b6d36162e662a37f0e4004e56ccde1ab278c3"} Apr 21 16:02:09.014737 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.014708 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4dmnb" event={"ID":"1a05e8cf-847c-48cc-802b-171bcb5dea76","Type":"ContainerStarted","Data":"bc0675bb9518369ca31e46262d23c72c2894a420230f87a819f65611ae8bf845"} Apr 21 16:02:09.016194 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.016163 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-chrww" event={"ID":"511124f1-f198-4d6c-9713-d6f1375957e5","Type":"ContainerStarted","Data":"3eb8e9e0a8f2fd7dfbd27635fbe081260948456e9b15e8713184b4056cdcb643"} Apr 21 16:02:09.017632 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.017607 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fm4kz" event={"ID":"c8f9fcd0-5378-4da1-a89a-2ffad35fe389","Type":"ContainerStarted","Data":"85fba6ec7706b694cf54629623c5acc55363fc2ca2173afef18becb04f3486a0"} Apr 21 16:02:09.020526 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.020500 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" event={"ID":"36e9aa61-0f27-4d2c-abce-685977a97e00","Type":"ContainerStarted","Data":"291c50941b62f979c745fab10435111abfe605946afef0f861e4c9ccb55abae2"} Apr 21 
16:02:09.020626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.020528 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" event={"ID":"36e9aa61-0f27-4d2c-abce-685977a97e00","Type":"ContainerStarted","Data":"d0a788762221bbcfad22ca1103b62b30e9267d4919514f4027daf70f7d604f00"} Apr 21 16:02:09.020626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.020540 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" event={"ID":"36e9aa61-0f27-4d2c-abce-685977a97e00","Type":"ContainerStarted","Data":"a43a6a0fa3f6ace3d9bee0842c16b961c3b66137943c9c429b4305c1fd3a63ad"} Apr 21 16:02:09.020626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.020551 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" event={"ID":"36e9aa61-0f27-4d2c-abce-685977a97e00","Type":"ContainerStarted","Data":"769ac9d7a82ee7149e8797b866c32b6e9b633d1b478e39fad02e4ee3eb42f05f"} Apr 21 16:02:09.020626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.020559 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" event={"ID":"36e9aa61-0f27-4d2c-abce-685977a97e00","Type":"ContainerStarted","Data":"3fbb29cfa74fa8b9f3059d9d9ffbf376bb065f18813654fe865a6d7a53d4a627"} Apr 21 16:02:09.022395 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.022313 2562 generic.go:358] "Generic (PLEG): container finished" podID="265548b5-1968-424e-850b-1b95c8e7798f" containerID="81171298f21df9f01d686a3ae9f60849089ab568c09eb9269e16a5fbedb8e19f" exitCode=0 Apr 21 16:02:09.022395 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.022377 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bskpq" event={"ID":"265548b5-1968-424e-850b-1b95c8e7798f","Type":"ContainerDied","Data":"81171298f21df9f01d686a3ae9f60849089ab568c09eb9269e16a5fbedb8e19f"} Apr 21 16:02:09.024005 
ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.023978 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7n47b" event={"ID":"a844703d-9a8a-4877-a840-e850e06f82b0","Type":"ContainerStarted","Data":"b3eea3666646d71cd978413a21cf8031e1faa9af11c4c3fe4f1171644e8e99d4"} Apr 21 16:02:09.025718 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.025690 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" event={"ID":"eb6dd680-d8be-4220-b690-a82c23fa355f","Type":"ContainerStarted","Data":"a73976df5a3a12638751ddda4f3fa62caa5c7c957b5b22e5cb6124098481c1d0"} Apr 21 16:02:09.032022 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.031978 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4dmnb" podStartSLOduration=4.5559091590000005 podStartE2EDuration="21.031964596s" podCreationTimestamp="2026-04-21 16:01:48 +0000 UTC" firstStartedPulling="2026-04-21 16:01:51.504602514 +0000 UTC m=+3.249540314" lastFinishedPulling="2026-04-21 16:02:07.980657967 +0000 UTC m=+19.725595751" observedRunningTime="2026-04-21 16:02:09.031293344 +0000 UTC m=+20.776231343" watchObservedRunningTime="2026-04-21 16:02:09.031964596 +0000 UTC m=+20.776902401" Apr 21 16:02:09.032136 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.032105 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-158.ec2.internal" podStartSLOduration=19.03209738 podStartE2EDuration="19.03209738s" podCreationTimestamp="2026-04-21 16:01:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:01:54.002585199 +0000 UTC m=+5.747523005" watchObservedRunningTime="2026-04-21 16:02:09.03209738 +0000 UTC m=+20.777035184" Apr 21 16:02:09.052249 ip-10-0-142-158 kubenswrapper[2562]: I0421 
16:02:09.052205 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fm4kz" podStartSLOduration=4.55350635 podStartE2EDuration="21.052195102s" podCreationTimestamp="2026-04-21 16:01:48 +0000 UTC" firstStartedPulling="2026-04-21 16:01:51.496569904 +0000 UTC m=+3.241507693" lastFinishedPulling="2026-04-21 16:02:07.995258659 +0000 UTC m=+19.740196445" observedRunningTime="2026-04-21 16:02:09.052016111 +0000 UTC m=+20.796953914" watchObservedRunningTime="2026-04-21 16:02:09.052195102 +0000 UTC m=+20.797132905" Apr 21 16:02:09.100138 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.100101 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-c5kmx" podStartSLOduration=4.621470335 podStartE2EDuration="21.100091326s" podCreationTimestamp="2026-04-21 16:01:48 +0000 UTC" firstStartedPulling="2026-04-21 16:01:51.50204815 +0000 UTC m=+3.246985947" lastFinishedPulling="2026-04-21 16:02:07.980669138 +0000 UTC m=+19.725606938" observedRunningTime="2026-04-21 16:02:09.099637941 +0000 UTC m=+20.844575746" watchObservedRunningTime="2026-04-21 16:02:09.100091326 +0000 UTC m=+20.845029128" Apr 21 16:02:09.115706 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.115661 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7n47b" podStartSLOduration=3.636380045 podStartE2EDuration="20.115651489s" podCreationTimestamp="2026-04-21 16:01:49 +0000 UTC" firstStartedPulling="2026-04-21 16:01:51.503080148 +0000 UTC m=+3.248017933" lastFinishedPulling="2026-04-21 16:02:07.982351593 +0000 UTC m=+19.727289377" observedRunningTime="2026-04-21 16:02:09.115427208 +0000 UTC m=+20.860365011" watchObservedRunningTime="2026-04-21 16:02:09.115651489 +0000 UTC m=+20.860589292" Apr 21 16:02:09.561350 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.561142 2562 plugin_watcher.go:194] "Adding socket path or updating timestamp to 
desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 16:02:09.652436 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.652404 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7n47b" Apr 21 16:02:09.653598 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.653576 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7n47b" Apr 21 16:02:09.677814 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.677759 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-chrww" podStartSLOduration=4.223157598 podStartE2EDuration="20.677744961s" podCreationTimestamp="2026-04-21 16:01:49 +0000 UTC" firstStartedPulling="2026-04-21 16:01:51.504646027 +0000 UTC m=+3.249583825" lastFinishedPulling="2026-04-21 16:02:07.959233399 +0000 UTC m=+19.704171188" observedRunningTime="2026-04-21 16:02:09.137824436 +0000 UTC m=+20.882762239" watchObservedRunningTime="2026-04-21 16:02:09.677744961 +0000 UTC m=+21.422682784" Apr 21 16:02:09.840272 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.840189 2562 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T16:02:09.561327762Z","UUID":"bc0731e5-e97f-4fb3-a99c-a1f6b60a159c","Handler":null,"Name":"","Endpoint":""} Apr 21 16:02:09.841980 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.841930 2562 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 16:02:09.841980 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.841978 2562 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 
16:02:09.906928 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.906889 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj" Apr 21 16:02:09.907077 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.906995 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:02:09.907077 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:09.906997 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d" Apr 21 16:02:09.907077 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:09.907070 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c" Apr 21 16:02:09.907253 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:09.907089 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:02:09.907253 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:09.907164 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c" Apr 21 16:02:10.030688 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:10.030654 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" event={"ID":"36e9aa61-0f27-4d2c-abce-685977a97e00","Type":"ContainerStarted","Data":"59d747bf407eb6f20592e7f201d0886efac231004f5ca9f9a6a1ed4eae0a2a7f"} Apr 21 16:02:10.032060 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:10.032036 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qsmcw" event={"ID":"b1640f00-dda4-4761-acce-37205e686361","Type":"ContainerStarted","Data":"7f5383c79f06821b56ac6be5976e2d9e0367c8345a9f21e88b5ba5cc5caafd7c"} Apr 21 16:02:10.036208 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:10.036175 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" event={"ID":"a1be778e-85bd-43d3-912c-0356362a7e8a","Type":"ContainerStarted","Data":"96eb595d872108149161a07ec1121c4a80094a8d94321fb815eb1a165e1c1a1e"} Apr 21 16:02:10.037025 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:10.037007 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7n47b" Apr 21 16:02:10.037128 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:10.037056 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7n47b" Apr 21 16:02:10.050070 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:10.050022 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qsmcw" podStartSLOduration=4.559882277 podStartE2EDuration="21.050006103s" podCreationTimestamp="2026-04-21 16:01:49 +0000 UTC" firstStartedPulling="2026-04-21 16:01:51.497159454 +0000 UTC m=+3.242097243" lastFinishedPulling="2026-04-21 16:02:07.987283278 
+0000 UTC m=+19.732221069" observedRunningTime="2026-04-21 16:02:10.04922674 +0000 UTC m=+21.794164547" watchObservedRunningTime="2026-04-21 16:02:10.050006103 +0000 UTC m=+21.794943936" Apr 21 16:02:11.039856 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:11.039822 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" event={"ID":"a1be778e-85bd-43d3-912c-0356362a7e8a","Type":"ContainerStarted","Data":"36544e1c130107a56224fcb1e63227fa1d96b23b285c9a912de4ebf9f71f166e"} Apr 21 16:02:11.067659 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:11.067617 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rprc8" podStartSLOduration=2.741729084 podStartE2EDuration="22.06760433s" podCreationTimestamp="2026-04-21 16:01:49 +0000 UTC" firstStartedPulling="2026-04-21 16:01:51.506228646 +0000 UTC m=+3.251166447" lastFinishedPulling="2026-04-21 16:02:10.832103899 +0000 UTC m=+22.577041693" observedRunningTime="2026-04-21 16:02:11.067349904 +0000 UTC m=+22.812287718" watchObservedRunningTime="2026-04-21 16:02:11.06760433 +0000 UTC m=+22.812542134" Apr 21 16:02:11.906352 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:11.906321 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:02:11.906512 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:11.906440 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj" Apr 21 16:02:11.906512 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:11.906457 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:02:11.906604 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:11.906431 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:02:11.906604 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:11.906551 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:02:11.906696 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:11.906620 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:02:12.044913 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:12.044873 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" event={"ID":"36e9aa61-0f27-4d2c-abce-685977a97e00","Type":"ContainerStarted","Data":"9e05513ce40e5292a0d3cb0604100c611c3a10c6f9f5bee60eb2569123c910ee"}
Apr 21 16:02:13.906861 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:13.906637 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:02:13.907214 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:13.906637 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:13.907214 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:13.906895 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:02:13.907214 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:13.906640 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:02:13.907214 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:13.906957 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:02:13.907214 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:13.907038 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:02:14.051319 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:14.051279 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" event={"ID":"36e9aa61-0f27-4d2c-abce-685977a97e00","Type":"ContainerStarted","Data":"54fb2ff365fb36f8a02dfa960e55a29b2bf009cdf6ba250119126b57161dad3a"}
Apr 21 16:02:14.051757 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:14.051658 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:02:14.051757 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:14.051682 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:02:14.051757 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:14.051694 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:02:14.053050 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:14.053029 2562 generic.go:358] "Generic (PLEG): container finished" podID="265548b5-1968-424e-850b-1b95c8e7798f" containerID="530ee85f67a918ccc21931d81fd0f25c3c68a05430a075a7d61401de4dc27a80" exitCode=0
Apr 21 16:02:14.053162 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:14.053061 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bskpq" event={"ID":"265548b5-1968-424e-850b-1b95c8e7798f","Type":"ContainerDied","Data":"530ee85f67a918ccc21931d81fd0f25c3c68a05430a075a7d61401de4dc27a80"}
Apr 21 16:02:14.066596 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:14.066573 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:02:14.066676 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:14.066666 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-np85v"
Apr 21 16:02:14.096994 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:14.096959 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" podStartSLOduration=8.326603508 podStartE2EDuration="25.09694885s" podCreationTimestamp="2026-04-21 16:01:49 +0000 UTC" firstStartedPulling="2026-04-21 16:01:51.507123509 +0000 UTC m=+3.252061292" lastFinishedPulling="2026-04-21 16:02:08.277468847 +0000 UTC m=+20.022406634" observedRunningTime="2026-04-21 16:02:14.094535664 +0000 UTC m=+25.839473468" watchObservedRunningTime="2026-04-21 16:02:14.09694885 +0000 UTC m=+25.841886652"
Apr 21 16:02:15.484458 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:15.484277 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rg8v9"]
Apr 21 16:02:15.484924 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:15.484558 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:02:15.484924 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:15.484645 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:02:15.492127 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:15.492097 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wpnfj"]
Apr 21 16:02:15.492226 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:15.492212 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:15.492414 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:15.492368 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:02:15.492753 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:15.492732 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nnb5j"]
Apr 21 16:02:15.492855 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:15.492843 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:02:15.492942 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:15.492925 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:02:16.058004 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:16.057970 2562 generic.go:358] "Generic (PLEG): container finished" podID="265548b5-1968-424e-850b-1b95c8e7798f" containerID="3daa740367147bb621fea90aecbc4e26cc54da27608c76ef12f6e0bf4c3ec4b6" exitCode=0
Apr 21 16:02:16.058148 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:16.058056 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bskpq" event={"ID":"265548b5-1968-424e-850b-1b95c8e7798f","Type":"ContainerDied","Data":"3daa740367147bb621fea90aecbc4e26cc54da27608c76ef12f6e0bf4c3ec4b6"}
Apr 21 16:02:16.906907 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:16.906875 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:02:16.906907 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:16.906884 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:02:16.907440 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:16.907001 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:16.907440 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:16.907018 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:02:16.907440 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:16.907097 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:02:16.907440 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:16.907176 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:02:18.063450 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:18.063418 2562 generic.go:358] "Generic (PLEG): container finished" podID="265548b5-1968-424e-850b-1b95c8e7798f" containerID="25e355b4227d57af9ba5e530710c4544bc6df8eea837c4fdc96c365763703d03" exitCode=0
Apr 21 16:02:18.063974 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:18.063469 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bskpq" event={"ID":"265548b5-1968-424e-850b-1b95c8e7798f","Type":"ContainerDied","Data":"25e355b4227d57af9ba5e530710c4544bc6df8eea837c4fdc96c365763703d03"}
Apr 21 16:02:18.907826 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:18.907742 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:18.907977 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:18.907842 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:02:18.907977 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:18.907940 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:02:18.908151 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:18.908044 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:02:18.908151 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:18.908116 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:02:18.908250 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:18.908159 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:02:20.909515 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:20.909481 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:20.909515 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:20.909496 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:02:20.910100 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:20.909481 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:02:20.910100 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:20.909599 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wpnfj" podUID="427cd153-57f3-494e-8f29-f4e3e984756d"
Apr 21 16:02:20.910100 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:20.909689 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c"
Apr 21 16:02:20.910100 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:20.909758 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nnb5j" podUID="07b526f2-af47-4107-850b-0185ac8ac28c"
Apr 21 16:02:22.050598 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.050311 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-158.ec2.internal" event="NodeReady"
Apr 21 16:02:22.051055 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.050656 2562 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 16:02:22.108679 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.108649 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lxgkn"]
Apr 21 16:02:22.150580 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.150542 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p8j5k"]
Apr 21 16:02:22.150740 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.150729 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lxgkn"
Apr 21 16:02:22.154072 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.153869 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76wnm\""
Apr 21 16:02:22.154072 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.154030 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 21 16:02:22.154275 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.154107 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 21 16:02:22.154275 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.154208 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 21 16:02:22.183852 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.183832 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lxgkn"]
Apr 21 16:02:22.183944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.183859 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p8j5k"]
Apr 21 16:02:22.183988 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.183978 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.188511 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.188488 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 16:02:22.188511 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.188491 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 16:02:22.188647 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.188627 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2l66p\""
Apr 21 16:02:22.332316 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.332247 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.332316 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.332282 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn"
Apr 21 16:02:22.332316 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.332303 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh8jw\" (UniqueName: \"kubernetes.io/projected/8648a8db-b8ad-409e-ae80-85c058398baf-kube-api-access-jh8jw\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn"
Apr 21 16:02:22.332565 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.332375 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/50caee65-e2ab-4233-a2b5-e5ea4a951bed-tmp-dir\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.332565 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.332436 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50caee65-e2ab-4233-a2b5-e5ea4a951bed-config-volume\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.332565 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.332474 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp5kt\" (UniqueName: \"kubernetes.io/projected/50caee65-e2ab-4233-a2b5-e5ea4a951bed-kube-api-access-jp5kt\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.433631 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.433583 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.433631 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.433635 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn"
Apr 21 16:02:22.433869 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.433661 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jh8jw\" (UniqueName: \"kubernetes.io/projected/8648a8db-b8ad-409e-ae80-85c058398baf-kube-api-access-jh8jw\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn"
Apr 21 16:02:22.433869 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.433685 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/50caee65-e2ab-4233-a2b5-e5ea4a951bed-tmp-dir\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.433869 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.433725 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 16:02:22.433869 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.433752 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50caee65-e2ab-4233-a2b5-e5ea4a951bed-config-volume\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.433869 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.433805 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp5kt\" (UniqueName: \"kubernetes.io/projected/50caee65-e2ab-4233-a2b5-e5ea4a951bed-kube-api-access-jp5kt\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.433869 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.433819 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 16:02:22.433869 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.433828 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls podName:50caee65-e2ab-4233-a2b5-e5ea4a951bed nodeName:}" failed. No retries permitted until 2026-04-21 16:02:22.933805008 +0000 UTC m=+34.678742796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls") pod "dns-default-p8j5k" (UID: "50caee65-e2ab-4233-a2b5-e5ea4a951bed") : secret "dns-default-metrics-tls" not found
Apr 21 16:02:22.434236 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.433889 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert podName:8648a8db-b8ad-409e-ae80-85c058398baf nodeName:}" failed. No retries permitted until 2026-04-21 16:02:22.933872213 +0000 UTC m=+34.678810000 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert") pod "ingress-canary-lxgkn" (UID: "8648a8db-b8ad-409e-ae80-85c058398baf") : secret "canary-serving-cert" not found
Apr 21 16:02:22.445888 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.445864 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh8jw\" (UniqueName: \"kubernetes.io/projected/8648a8db-b8ad-409e-ae80-85c058398baf-kube-api-access-jh8jw\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn"
Apr 21 16:02:22.450974 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.450948 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/50caee65-e2ab-4233-a2b5-e5ea4a951bed-tmp-dir\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.451224 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.451193 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50caee65-e2ab-4233-a2b5-e5ea4a951bed-config-volume\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.451317 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.451206 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp5kt\" (UniqueName: \"kubernetes.io/projected/50caee65-e2ab-4233-a2b5-e5ea4a951bed-kube-api-access-jp5kt\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.535681 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.535650 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:02:22.535866 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.535812 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:02:22.535920 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.535882 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs podName:e022d7cd-e433-4f58-8b33-7c830d23f95c nodeName:}" failed. No retries permitted until 2026-04-21 16:02:54.535868614 +0000 UTC m=+66.280806399 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs") pod "network-metrics-daemon-rg8v9" (UID: "e022d7cd-e433-4f58-8b33-7c830d23f95c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:02:22.636747 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.636673 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w89pd\" (UniqueName: \"kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd\") pod \"network-check-target-nnb5j\" (UID: \"07b526f2-af47-4107-850b-0185ac8ac28c\") " pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:02:22.636876 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.636851 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 16:02:22.636876 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.636874 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 16:02:22.636949 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.636885 2562 projected.go:194] Error preparing data for projected volume kube-api-access-w89pd for pod openshift-network-diagnostics/network-check-target-nnb5j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:02:22.636949 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.636943 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd podName:07b526f2-af47-4107-850b-0185ac8ac28c nodeName:}" failed. No retries permitted until 2026-04-21 16:02:54.636928749 +0000 UTC m=+66.381866531 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-w89pd" (UniqueName: "kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd") pod "network-check-target-nnb5j" (UID: "07b526f2-af47-4107-850b-0185ac8ac28c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:02:22.906049 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.905974 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j"
Apr 21 16:02:22.906242 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.905974 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:22.906242 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.905974 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:02:22.908819 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.908797 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 16:02:22.909865 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.909838 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 16:02:22.910001 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.909864 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cz78x\""
Apr 21 16:02:22.910001 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.909960 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jk8gf\""
Apr 21 16:02:22.910202 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.910168 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 16:02:22.910292 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.910228 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 16:02:22.939161 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.939138 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:22.939269 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:22.939172 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn"
Apr 21 16:02:22.939329 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.939295 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 16:02:22.939371 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.939355 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert podName:8648a8db-b8ad-409e-ae80-85c058398baf nodeName:}" failed. No retries permitted until 2026-04-21 16:02:23.939336785 +0000 UTC m=+35.684274575 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert") pod "ingress-canary-lxgkn" (UID: "8648a8db-b8ad-409e-ae80-85c058398baf") : secret "canary-serving-cert" not found
Apr 21 16:02:22.939371 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.939297 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 16:02:22.939457 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:22.939396 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls podName:50caee65-e2ab-4233-a2b5-e5ea4a951bed nodeName:}" failed. No retries permitted until 2026-04-21 16:02:23.939387001 +0000 UTC m=+35.684324784 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls") pod "dns-default-p8j5k" (UID: "50caee65-e2ab-4233-a2b5-e5ea4a951bed") : secret "dns-default-metrics-tls" not found
Apr 21 16:02:23.947227 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:23.947193 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k"
Apr 21 16:02:23.947227 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:23.947227 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn"
Apr 21 16:02:23.947615 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:23.947345 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 16:02:23.947615 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:23.947423 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert podName:8648a8db-b8ad-409e-ae80-85c058398baf nodeName:}" failed. No retries permitted until 2026-04-21 16:02:25.947392949 +0000 UTC m=+37.692330732 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert") pod "ingress-canary-lxgkn" (UID: "8648a8db-b8ad-409e-ae80-85c058398baf") : secret "canary-serving-cert" not found
Apr 21 16:02:23.947615 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:23.947345 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 16:02:23.947615 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:23.947466 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls podName:50caee65-e2ab-4233-a2b5-e5ea4a951bed nodeName:}" failed. No retries permitted until 2026-04-21 16:02:25.947455617 +0000 UTC m=+37.692393404 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls") pod "dns-default-p8j5k" (UID: "50caee65-e2ab-4233-a2b5-e5ea4a951bed") : secret "dns-default-metrics-tls" not found
Apr 21 16:02:24.350186 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:24.350150 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:24.353032 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:24.353000 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/427cd153-57f3-494e-8f29-f4e3e984756d-original-pull-secret\") pod \"global-pull-secret-syncer-wpnfj\" (UID: \"427cd153-57f3-494e-8f29-f4e3e984756d\") " pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:24.424817 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:24.424777 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wpnfj"
Apr 21 16:02:24.571893 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:24.571864 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wpnfj"]
Apr 21 16:02:24.574904 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:02:24.574877 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod427cd153_57f3_494e_8f29_f4e3e984756d.slice/crio-f59426ee599068f0d8d3be0d0072527e2bfb3976045f26d16a399325b2c0054c WatchSource:0}: Error finding container f59426ee599068f0d8d3be0d0072527e2bfb3976045f26d16a399325b2c0054c: Status 404 returned error can't find the container with id f59426ee599068f0d8d3be0d0072527e2bfb3976045f26d16a399325b2c0054c
Apr 21 16:02:25.077100 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:25.076921 2562 generic.go:358] "Generic (PLEG): container finished" podID="265548b5-1968-424e-850b-1b95c8e7798f" containerID="8db673f28f83e0b07a71747842bc19612d3ee5395955650341c93904ab4c140b" exitCode=0
Apr 21 16:02:25.077474 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:25.076999 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bskpq" event={"ID":"265548b5-1968-424e-850b-1b95c8e7798f","Type":"ContainerDied","Data":"8db673f28f83e0b07a71747842bc19612d3ee5395955650341c93904ab4c140b"}
Apr 21 16:02:25.078074 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:25.078053 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wpnfj" event={"ID":"427cd153-57f3-494e-8f29-f4e3e984756d","Type":"ContainerStarted","Data":"f59426ee599068f0d8d3be0d0072527e2bfb3976045f26d16a399325b2c0054c"}
Apr 21 16:02:25.960879 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:25.960840 2562 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k" Apr 21 16:02:25.960879 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:25.960880 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn" Apr 21 16:02:25.961084 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:25.960998 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:25.961084 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:25.961002 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:25.961084 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:25.961047 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert podName:8648a8db-b8ad-409e-ae80-85c058398baf nodeName:}" failed. No retries permitted until 2026-04-21 16:02:29.961033819 +0000 UTC m=+41.705971601 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert") pod "ingress-canary-lxgkn" (UID: "8648a8db-b8ad-409e-ae80-85c058398baf") : secret "canary-serving-cert" not found Apr 21 16:02:25.961084 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:25.961069 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls podName:50caee65-e2ab-4233-a2b5-e5ea4a951bed nodeName:}" failed. 
No retries permitted until 2026-04-21 16:02:29.96105486 +0000 UTC m=+41.705992644 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls") pod "dns-default-p8j5k" (UID: "50caee65-e2ab-4233-a2b5-e5ea4a951bed") : secret "dns-default-metrics-tls" not found Apr 21 16:02:26.082925 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:26.082889 2562 generic.go:358] "Generic (PLEG): container finished" podID="265548b5-1968-424e-850b-1b95c8e7798f" containerID="6bc3f0b222e8de4ab84107ce76e8c5beb064602dbd0c1d5509066d748d138ea9" exitCode=0 Apr 21 16:02:26.083436 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:26.082950 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bskpq" event={"ID":"265548b5-1968-424e-850b-1b95c8e7798f","Type":"ContainerDied","Data":"6bc3f0b222e8de4ab84107ce76e8c5beb064602dbd0c1d5509066d748d138ea9"} Apr 21 16:02:27.088258 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:27.088218 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bskpq" event={"ID":"265548b5-1968-424e-850b-1b95c8e7798f","Type":"ContainerStarted","Data":"3a8848c26f5f7ddbf157a4a8a86b668e067b48431679b14425084252fee9527f"} Apr 21 16:02:27.131368 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:27.131324 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bskpq" podStartSLOduration=6.484894767 podStartE2EDuration="39.131310528s" podCreationTimestamp="2026-04-21 16:01:48 +0000 UTC" firstStartedPulling="2026-04-21 16:01:51.506228048 +0000 UTC m=+3.251165830" lastFinishedPulling="2026-04-21 16:02:24.15264374 +0000 UTC m=+35.897581591" observedRunningTime="2026-04-21 16:02:27.130082939 +0000 UTC m=+38.875020757" watchObservedRunningTime="2026-04-21 16:02:27.131310528 +0000 UTC m=+38.876248334" Apr 21 
16:02:29.993210 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:29.993174 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k" Apr 21 16:02:29.993210 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:29.993210 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn" Apr 21 16:02:29.993666 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:29.993314 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:29.993666 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:29.993327 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:29.993666 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:29.993362 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert podName:8648a8db-b8ad-409e-ae80-85c058398baf nodeName:}" failed. No retries permitted until 2026-04-21 16:02:37.99334923 +0000 UTC m=+49.738287013 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert") pod "ingress-canary-lxgkn" (UID: "8648a8db-b8ad-409e-ae80-85c058398baf") : secret "canary-serving-cert" not found Apr 21 16:02:29.993666 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:29.993393 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls podName:50caee65-e2ab-4233-a2b5-e5ea4a951bed nodeName:}" failed. No retries permitted until 2026-04-21 16:02:37.993376254 +0000 UTC m=+49.738314043 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls") pod "dns-default-p8j5k" (UID: "50caee65-e2ab-4233-a2b5-e5ea4a951bed") : secret "dns-default-metrics-tls" not found Apr 21 16:02:30.095508 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:30.095481 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wpnfj" event={"ID":"427cd153-57f3-494e-8f29-f4e3e984756d","Type":"ContainerStarted","Data":"6e2adb0beb256dc498ff9ad69179a3a4d0effece6c6e999a345e3622dc54b60e"} Apr 21 16:02:30.111699 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:30.111645 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wpnfj" podStartSLOduration=33.444030184 podStartE2EDuration="38.111628144s" podCreationTimestamp="2026-04-21 16:01:52 +0000 UTC" firstStartedPulling="2026-04-21 16:02:24.576709049 +0000 UTC m=+36.321646831" lastFinishedPulling="2026-04-21 16:02:29.244307005 +0000 UTC m=+40.989244791" observedRunningTime="2026-04-21 16:02:30.111362096 +0000 UTC m=+41.856299902" watchObservedRunningTime="2026-04-21 16:02:30.111628144 +0000 UTC m=+41.856565949" Apr 21 16:02:38.043675 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:38.043636 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k" Apr 21 16:02:38.043675 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:38.043675 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn" Apr 21 16:02:38.044126 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:38.043795 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:38.044126 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:38.043869 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls podName:50caee65-e2ab-4233-a2b5-e5ea4a951bed nodeName:}" failed. No retries permitted until 2026-04-21 16:02:54.043852621 +0000 UTC m=+65.788790408 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls") pod "dns-default-p8j5k" (UID: "50caee65-e2ab-4233-a2b5-e5ea4a951bed") : secret "dns-default-metrics-tls" not found Apr 21 16:02:38.044126 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:38.043798 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:38.044126 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:38.043931 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert podName:8648a8db-b8ad-409e-ae80-85c058398baf nodeName:}" failed. 
No retries permitted until 2026-04-21 16:02:54.04391958 +0000 UTC m=+65.788857363 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert") pod "ingress-canary-lxgkn" (UID: "8648a8db-b8ad-409e-ae80-85c058398baf") : secret "canary-serving-cert" not found Apr 21 16:02:46.088980 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:46.088946 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-np85v" Apr 21 16:02:54.051239 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:54.051201 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k" Apr 21 16:02:54.051239 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:54.051244 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn" Apr 21 16:02:54.051761 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:54.051349 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:54.051761 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:54.051410 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls podName:50caee65-e2ab-4233-a2b5-e5ea4a951bed nodeName:}" failed. No retries permitted until 2026-04-21 16:03:26.051394088 +0000 UTC m=+97.796331870 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls") pod "dns-default-p8j5k" (UID: "50caee65-e2ab-4233-a2b5-e5ea4a951bed") : secret "dns-default-metrics-tls" not found Apr 21 16:02:54.051761 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:54.051416 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:54.051761 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:54.051477 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert podName:8648a8db-b8ad-409e-ae80-85c058398baf nodeName:}" failed. No retries permitted until 2026-04-21 16:03:26.051462174 +0000 UTC m=+97.796399957 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert") pod "ingress-canary-lxgkn" (UID: "8648a8db-b8ad-409e-ae80-85c058398baf") : secret "canary-serving-cert" not found Apr 21 16:02:54.554727 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:54.554694 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:02:54.557484 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:54.557467 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 16:02:54.565860 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:54.565837 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 16:02:54.565944 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:02:54.565911 2562 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs podName:e022d7cd-e433-4f58-8b33-7c830d23f95c nodeName:}" failed. No retries permitted until 2026-04-21 16:03:58.565891442 +0000 UTC m=+130.310829242 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs") pod "network-metrics-daemon-rg8v9" (UID: "e022d7cd-e433-4f58-8b33-7c830d23f95c") : secret "metrics-daemon-secret" not found Apr 21 16:02:54.655840 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:54.655814 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w89pd\" (UniqueName: \"kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd\") pod \"network-check-target-nnb5j\" (UID: \"07b526f2-af47-4107-850b-0185ac8ac28c\") " pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:02:54.658528 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:54.658514 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 16:02:54.685653 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:54.685635 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 16:02:54.690792 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:54.690767 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w89pd\" (UniqueName: \"kubernetes.io/projected/07b526f2-af47-4107-850b-0185ac8ac28c-kube-api-access-w89pd\") pod \"network-check-target-nnb5j\" (UID: \"07b526f2-af47-4107-850b-0185ac8ac28c\") " pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:02:54.720556 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:54.720539 2562 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jk8gf\"" Apr 21 16:02:54.727845 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:54.727826 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:02:54.860938 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:54.860912 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nnb5j"] Apr 21 16:02:54.863766 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:02:54.863726 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b526f2_af47_4107_850b_0185ac8ac28c.slice/crio-0eb73d62f66fdbf1268ac2081c12d9443e65c50041ce8003e2635b7e20f14bc4 WatchSource:0}: Error finding container 0eb73d62f66fdbf1268ac2081c12d9443e65c50041ce8003e2635b7e20f14bc4: Status 404 returned error can't find the container with id 0eb73d62f66fdbf1268ac2081c12d9443e65c50041ce8003e2635b7e20f14bc4 Apr 21 16:02:55.140304 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:55.140230 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nnb5j" event={"ID":"07b526f2-af47-4107-850b-0185ac8ac28c","Type":"ContainerStarted","Data":"0eb73d62f66fdbf1268ac2081c12d9443e65c50041ce8003e2635b7e20f14bc4"} Apr 21 16:02:59.149529 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:59.149491 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nnb5j" event={"ID":"07b526f2-af47-4107-850b-0185ac8ac28c","Type":"ContainerStarted","Data":"9d6b24b381688cb23fa2de0f551209610d3ddc6c382da5eea53cae4a5010625b"} Apr 21 16:02:59.149969 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:59.149619 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:02:59.172283 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:02:59.172244 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-nnb5j" podStartSLOduration=66.672352352 podStartE2EDuration="1m10.172230372s" podCreationTimestamp="2026-04-21 16:01:49 +0000 UTC" firstStartedPulling="2026-04-21 16:02:54.865620572 +0000 UTC m=+66.610558356" lastFinishedPulling="2026-04-21 16:02:58.365498592 +0000 UTC m=+70.110436376" observedRunningTime="2026-04-21 16:02:59.171867184 +0000 UTC m=+70.916804988" watchObservedRunningTime="2026-04-21 16:02:59.172230372 +0000 UTC m=+70.917168177" Apr 21 16:03:26.076701 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:26.076652 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k" Apr 21 16:03:26.076701 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:26.076705 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn" Apr 21 16:03:26.077173 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:03:26.076823 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:03:26.077173 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:03:26.076882 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls podName:50caee65-e2ab-4233-a2b5-e5ea4a951bed nodeName:}" failed. 
No retries permitted until 2026-04-21 16:04:30.076867373 +0000 UTC m=+161.821805161 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls") pod "dns-default-p8j5k" (UID: "50caee65-e2ab-4233-a2b5-e5ea4a951bed") : secret "dns-default-metrics-tls" not found Apr 21 16:03:26.077173 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:03:26.076823 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:03:26.077173 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:03:26.076963 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert podName:8648a8db-b8ad-409e-ae80-85c058398baf nodeName:}" failed. No retries permitted until 2026-04-21 16:04:30.07695096 +0000 UTC m=+161.821888747 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert") pod "ingress-canary-lxgkn" (UID: "8648a8db-b8ad-409e-ae80-85c058398baf") : secret "canary-serving-cert" not found Apr 21 16:03:30.154489 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:30.154457 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-nnb5j" Apr 21 16:03:52.017006 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.016975 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qwrxl"] Apr 21 16:03:52.018729 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.018712 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qwrxl" Apr 21 16:03:52.022062 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.022029 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-rss7r\"" Apr 21 16:03:52.022835 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.022819 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 16:03:52.022922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.022865 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 16:03:52.041411 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.041385 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qwrxl"] Apr 21 16:03:52.147839 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.147809 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjbmh\" (UniqueName: \"kubernetes.io/projected/f1504ffc-d02c-419c-92a1-e0f7dbab1932-kube-api-access-bjbmh\") pod \"volume-data-source-validator-7c6cbb6c87-qwrxl\" (UID: \"f1504ffc-d02c-419c-92a1-e0f7dbab1932\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qwrxl" Apr 21 16:03:52.172579 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.172554 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r7jtg"] Apr 21 16:03:52.174370 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.174352 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-r7jtg" Apr 21 16:03:52.174441 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.174390 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"] Apr 21 16:03:52.176139 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.176124 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7" Apr 21 16:03:52.185442 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.185420 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-7ndtd\"" Apr 21 16:03:52.185826 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.185808 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 16:03:52.185826 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.185817 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 16:03:52.186741 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.186721 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 16:03:52.186844 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.186770 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 16:03:52.186989 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.186977 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 16:03:52.187112 ip-10-0-142-158 kubenswrapper[2562]: I0421 
16:03:52.187095 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-x6rzt\""
Apr 21 16:03:52.187223 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.187208 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 21 16:03:52.187223 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.187216 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 21 16:03:52.195416 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.195396 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r7jtg"]
Apr 21 16:03:52.196047 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.196031 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"]
Apr 21 16:03:52.213816 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.213800 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 21 16:03:52.218336 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.218314 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 21 16:03:52.249103 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.249074 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5d3a65e-5e28-4860-a01a-277b576a947b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-994j7\" (UID: \"c5d3a65e-5e28-4860-a01a-277b576a947b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"
Apr 21 16:03:52.249103 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.249100 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjkf\" (UniqueName: \"kubernetes.io/projected/be01fbb6-f686-41d2-aaa3-1abd80d94c27-kube-api-access-qqjkf\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.249260 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.249120 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be01fbb6-f686-41d2-aaa3-1abd80d94c27-service-ca-bundle\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.249260 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.249147 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be01fbb6-f686-41d2-aaa3-1abd80d94c27-serving-cert\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.249260 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.249219 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/be01fbb6-f686-41d2-aaa3-1abd80d94c27-snapshots\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.249260 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.249243 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be01fbb6-f686-41d2-aaa3-1abd80d94c27-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.249465 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.249282 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be01fbb6-f686-41d2-aaa3-1abd80d94c27-tmp\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.249465 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.249325 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjbmh\" (UniqueName: \"kubernetes.io/projected/f1504ffc-d02c-419c-92a1-e0f7dbab1932-kube-api-access-bjbmh\") pod \"volume-data-source-validator-7c6cbb6c87-qwrxl\" (UID: \"f1504ffc-d02c-419c-92a1-e0f7dbab1932\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qwrxl"
Apr 21 16:03:52.249465 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.249385 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d3a65e-5e28-4860-a01a-277b576a947b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-994j7\" (UID: \"c5d3a65e-5e28-4860-a01a-277b576a947b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"
Apr 21 16:03:52.249465 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.249405 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsh7d\" (UniqueName: \"kubernetes.io/projected/c5d3a65e-5e28-4860-a01a-277b576a947b-kube-api-access-dsh7d\") pod \"kube-storage-version-migrator-operator-6769c5d45-994j7\" (UID: \"c5d3a65e-5e28-4860-a01a-277b576a947b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"
Apr 21 16:03:52.275707 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.275651 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjbmh\" (UniqueName: \"kubernetes.io/projected/f1504ffc-d02c-419c-92a1-e0f7dbab1932-kube-api-access-bjbmh\") pod \"volume-data-source-validator-7c6cbb6c87-qwrxl\" (UID: \"f1504ffc-d02c-419c-92a1-e0f7dbab1932\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qwrxl"
Apr 21 16:03:52.293586 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.293564 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9hksj"]
Apr 21 16:03:52.295276 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.295262 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-hg85z"]
Apr 21 16:03:52.295407 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.295390 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hksj"
Apr 21 16:03:52.296890 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.296873 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.297880 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.297861 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-lhdw2\""
Apr 21 16:03:52.304364 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.304343 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 21 16:03:52.304943 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.304925 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 21 16:03:52.305032 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.304966 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 21 16:03:52.305218 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.305206 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-6srph\""
Apr 21 16:03:52.305452 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.305439 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 21 16:03:52.310626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.310608 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9hksj"]
Apr 21 16:03:52.319441 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.319422 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-hg85z"]
Apr 21 16:03:52.321849 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.321831 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 21 16:03:52.326869 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.326846 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qwrxl"
Apr 21 16:03:52.350448 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.350426 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/be01fbb6-f686-41d2-aaa3-1abd80d94c27-snapshots\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.350570 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.350458 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be01fbb6-f686-41d2-aaa3-1abd80d94c27-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.350570 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.350516 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be01fbb6-f686-41d2-aaa3-1abd80d94c27-tmp\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.350570 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.350550 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d3a65e-5e28-4860-a01a-277b576a947b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-994j7\" (UID: \"c5d3a65e-5e28-4860-a01a-277b576a947b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"
Apr 21 16:03:52.350705 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.350580 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsh7d\" (UniqueName: \"kubernetes.io/projected/c5d3a65e-5e28-4860-a01a-277b576a947b-kube-api-access-dsh7d\") pod \"kube-storage-version-migrator-operator-6769c5d45-994j7\" (UID: \"c5d3a65e-5e28-4860-a01a-277b576a947b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"
Apr 21 16:03:52.350705 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.350621 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5d3a65e-5e28-4860-a01a-277b576a947b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-994j7\" (UID: \"c5d3a65e-5e28-4860-a01a-277b576a947b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"
Apr 21 16:03:52.350815 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.350774 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjkf\" (UniqueName: \"kubernetes.io/projected/be01fbb6-f686-41d2-aaa3-1abd80d94c27-kube-api-access-qqjkf\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.350873 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.350852 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be01fbb6-f686-41d2-aaa3-1abd80d94c27-service-ca-bundle\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.350924 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.350905 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be01fbb6-f686-41d2-aaa3-1abd80d94c27-serving-cert\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.351112 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.351090 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/be01fbb6-f686-41d2-aaa3-1abd80d94c27-snapshots\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.351341 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.351324 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be01fbb6-f686-41d2-aaa3-1abd80d94c27-tmp\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.351462 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.351444 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5d3a65e-5e28-4860-a01a-277b576a947b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-994j7\" (UID: \"c5d3a65e-5e28-4860-a01a-277b576a947b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"
Apr 21 16:03:52.352002 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.351982 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be01fbb6-f686-41d2-aaa3-1abd80d94c27-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.352304 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.352287 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be01fbb6-f686-41d2-aaa3-1abd80d94c27-service-ca-bundle\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.353300 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.353276 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be01fbb6-f686-41d2-aaa3-1abd80d94c27-serving-cert\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.353513 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.353494 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d3a65e-5e28-4860-a01a-277b576a947b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-994j7\" (UID: \"c5d3a65e-5e28-4860-a01a-277b576a947b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"
Apr 21 16:03:52.374867 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.374848 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjkf\" (UniqueName: \"kubernetes.io/projected/be01fbb6-f686-41d2-aaa3-1abd80d94c27-kube-api-access-qqjkf\") pod \"insights-operator-585dfdc468-r7jtg\" (UID: \"be01fbb6-f686-41d2-aaa3-1abd80d94c27\") " pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.378968 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.378949 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsh7d\" (UniqueName: \"kubernetes.io/projected/c5d3a65e-5e28-4860-a01a-277b576a947b-kube-api-access-dsh7d\") pod \"kube-storage-version-migrator-operator-6769c5d45-994j7\" (UID: \"c5d3a65e-5e28-4860-a01a-277b576a947b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"
Apr 21 16:03:52.452231 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.452194 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d142be23-d04a-4d93-a53c-ca2d3e8cd743-serving-cert\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.452461 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.452318 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d142be23-d04a-4d93-a53c-ca2d3e8cd743-trusted-ca\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.452461 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.452372 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d142be23-d04a-4d93-a53c-ca2d3e8cd743-config\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.452461 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.452397 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fqw5\" (UniqueName: \"kubernetes.io/projected/d142be23-d04a-4d93-a53c-ca2d3e8cd743-kube-api-access-8fqw5\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.452461 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.452445 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm9pc\" (UniqueName: \"kubernetes.io/projected/c83bbc6f-da37-44eb-9045-416521b41bcf-kube-api-access-lm9pc\") pod \"network-check-source-8894fc9bd-9hksj\" (UID: \"c83bbc6f-da37-44eb-9045-416521b41bcf\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hksj"
Apr 21 16:03:52.453676 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.453657 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qwrxl"]
Apr 21 16:03:52.456480 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:03:52.456455 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1504ffc_d02c_419c_92a1_e0f7dbab1932.slice/crio-8121150753f1583e7d3b93b28237cdd645d3f169d9433f3d736f12c481bc3e3a WatchSource:0}: Error finding container 8121150753f1583e7d3b93b28237cdd645d3f169d9433f3d736f12c481bc3e3a: Status 404 returned error can't find the container with id 8121150753f1583e7d3b93b28237cdd645d3f169d9433f3d736f12c481bc3e3a
Apr 21 16:03:52.483770 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.483749 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-r7jtg"
Apr 21 16:03:52.489291 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.489272 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"
Apr 21 16:03:52.553541 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.553503 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d142be23-d04a-4d93-a53c-ca2d3e8cd743-config\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.553677 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.553545 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fqw5\" (UniqueName: \"kubernetes.io/projected/d142be23-d04a-4d93-a53c-ca2d3e8cd743-kube-api-access-8fqw5\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.553745 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.553711 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm9pc\" (UniqueName: \"kubernetes.io/projected/c83bbc6f-da37-44eb-9045-416521b41bcf-kube-api-access-lm9pc\") pod \"network-check-source-8894fc9bd-9hksj\" (UID: \"c83bbc6f-da37-44eb-9045-416521b41bcf\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hksj"
Apr 21 16:03:52.553812 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.553767 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d142be23-d04a-4d93-a53c-ca2d3e8cd743-serving-cert\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.553887 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.553871 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d142be23-d04a-4d93-a53c-ca2d3e8cd743-trusted-ca\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.554288 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.554259 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d142be23-d04a-4d93-a53c-ca2d3e8cd743-config\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.555099 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.555076 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d142be23-d04a-4d93-a53c-ca2d3e8cd743-trusted-ca\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.556007 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.555988 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d142be23-d04a-4d93-a53c-ca2d3e8cd743-serving-cert\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.563675 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.563654 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm9pc\" (UniqueName: \"kubernetes.io/projected/c83bbc6f-da37-44eb-9045-416521b41bcf-kube-api-access-lm9pc\") pod \"network-check-source-8894fc9bd-9hksj\" (UID: \"c83bbc6f-da37-44eb-9045-416521b41bcf\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hksj"
Apr 21 16:03:52.564405 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.564382 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fqw5\" (UniqueName: \"kubernetes.io/projected/d142be23-d04a-4d93-a53c-ca2d3e8cd743-kube-api-access-8fqw5\") pod \"console-operator-9d4b6777b-hg85z\" (UID: \"d142be23-d04a-4d93-a53c-ca2d3e8cd743\") " pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.604208 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.604183 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hksj"
Apr 21 16:03:52.606626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.606604 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r7jtg"]
Apr 21 16:03:52.609283 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:03:52.609254 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe01fbb6_f686_41d2_aaa3_1abd80d94c27.slice/crio-3582b0ff4d4ea5eb9707b7167fec6418cd930927acf5e9ef2f03a8c264740430 WatchSource:0}: Error finding container 3582b0ff4d4ea5eb9707b7167fec6418cd930927acf5e9ef2f03a8c264740430: Status 404 returned error can't find the container with id 3582b0ff4d4ea5eb9707b7167fec6418cd930927acf5e9ef2f03a8c264740430
Apr 21 16:03:52.609401 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.609385 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z"
Apr 21 16:03:52.623809 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.623729 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7"]
Apr 21 16:03:52.627947 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:03:52.627919 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5d3a65e_5e28_4860_a01a_277b576a947b.slice/crio-c94570fbbd6ad96d5c19bd4986617524e834479804ba028099c5f4d773362274 WatchSource:0}: Error finding container c94570fbbd6ad96d5c19bd4986617524e834479804ba028099c5f4d773362274: Status 404 returned error can't find the container with id c94570fbbd6ad96d5c19bd4986617524e834479804ba028099c5f4d773362274
Apr 21 16:03:52.748548 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.748517 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9hksj"]
Apr 21 16:03:52.750946 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:03:52.750918 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83bbc6f_da37_44eb_9045_416521b41bcf.slice/crio-13a6c3e1c2ec630d8b9e1b09fa48d6e43d4744596b6ad2583d43092dad4bc0af WatchSource:0}: Error finding container 13a6c3e1c2ec630d8b9e1b09fa48d6e43d4744596b6ad2583d43092dad4bc0af: Status 404 returned error can't find the container with id 13a6c3e1c2ec630d8b9e1b09fa48d6e43d4744596b6ad2583d43092dad4bc0af
Apr 21 16:03:52.769740 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:52.769716 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-hg85z"]
Apr 21 16:03:52.774653 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:03:52.774631 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd142be23_d04a_4d93_a53c_ca2d3e8cd743.slice/crio-d79a1048c890d4aa8c801448eaf41240d08915982de9bd2e5a91e1c7cb5daa82 WatchSource:0}: Error finding container d79a1048c890d4aa8c801448eaf41240d08915982de9bd2e5a91e1c7cb5daa82: Status 404 returned error can't find the container with id d79a1048c890d4aa8c801448eaf41240d08915982de9bd2e5a91e1c7cb5daa82
Apr 21 16:03:53.251109 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:53.251056 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r7jtg" event={"ID":"be01fbb6-f686-41d2-aaa3-1abd80d94c27","Type":"ContainerStarted","Data":"3582b0ff4d4ea5eb9707b7167fec6418cd930927acf5e9ef2f03a8c264740430"}
Apr 21 16:03:53.253067 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:53.253039 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hksj" event={"ID":"c83bbc6f-da37-44eb-9045-416521b41bcf","Type":"ContainerStarted","Data":"2759461a79a29bfd1ae7d6c43578149c41f1bc89de37e6c3cac9fe564ccf56c2"}
Apr 21 16:03:53.253185 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:53.253074 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hksj" event={"ID":"c83bbc6f-da37-44eb-9045-416521b41bcf","Type":"ContainerStarted","Data":"13a6c3e1c2ec630d8b9e1b09fa48d6e43d4744596b6ad2583d43092dad4bc0af"}
Apr 21 16:03:53.254864 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:53.254828 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7" event={"ID":"c5d3a65e-5e28-4860-a01a-277b576a947b","Type":"ContainerStarted","Data":"c94570fbbd6ad96d5c19bd4986617524e834479804ba028099c5f4d773362274"}
Apr 21 16:03:53.256389 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:53.256364 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" event={"ID":"d142be23-d04a-4d93-a53c-ca2d3e8cd743","Type":"ContainerStarted","Data":"d79a1048c890d4aa8c801448eaf41240d08915982de9bd2e5a91e1c7cb5daa82"}
Apr 21 16:03:53.257572 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:53.257547 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qwrxl" event={"ID":"f1504ffc-d02c-419c-92a1-e0f7dbab1932","Type":"ContainerStarted","Data":"8121150753f1583e7d3b93b28237cdd645d3f169d9433f3d736f12c481bc3e3a"}
Apr 21 16:03:53.287857 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:53.286518 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hksj" podStartSLOduration=1.286501386 podStartE2EDuration="1.286501386s" podCreationTimestamp="2026-04-21 16:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:03:53.286429752 +0000 UTC m=+125.031367558" watchObservedRunningTime="2026-04-21 16:03:53.286501386 +0000 UTC m=+125.031439192"
Apr 21 16:03:56.269833 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:56.269773 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7" event={"ID":"c5d3a65e-5e28-4860-a01a-277b576a947b","Type":"ContainerStarted","Data":"54dcee25c6fd6b812d1cdace6e1039ab26e4dfd2c44ae7d8773ddd422e909600"}
Apr 21 16:03:56.271310 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:56.271291 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/0.log"
Apr 21 16:03:56.271426 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:56.271350 2562 generic.go:358] "Generic (PLEG): container finished" podID="d142be23-d04a-4d93-a53c-ca2d3e8cd743" containerID="c268c9878a200450bfd5aaf364792b715496c689dd8b614fe0b6d2be50a2c18f" exitCode=255
Apr 21 16:03:56.271426 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:56.271410 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" event={"ID":"d142be23-d04a-4d93-a53c-ca2d3e8cd743","Type":"ContainerDied","Data":"c268c9878a200450bfd5aaf364792b715496c689dd8b614fe0b6d2be50a2c18f"}
Apr 21 16:03:56.271645 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:56.271623 2562 scope.go:117] "RemoveContainer" containerID="c268c9878a200450bfd5aaf364792b715496c689dd8b614fe0b6d2be50a2c18f"
Apr 21 16:03:56.272848 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:56.272820 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qwrxl" event={"ID":"f1504ffc-d02c-419c-92a1-e0f7dbab1932","Type":"ContainerStarted","Data":"1e9bae363bc3354300082bd8fb1fe73f9c19e1877abcac55183d946c4e25e028"}
Apr 21 16:03:56.274202 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:56.274169 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r7jtg" event={"ID":"be01fbb6-f686-41d2-aaa3-1abd80d94c27","Type":"ContainerStarted","Data":"ad0843344a58ad3a93df68e4e7c1623a9a7ac0db80e5a83ba97251d68f5d9091"}
Apr 21 16:03:56.300881 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:56.300843 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7" podStartSLOduration=1.195865676 podStartE2EDuration="4.300832565s" podCreationTimestamp="2026-04-21 16:03:52 +0000 UTC" firstStartedPulling="2026-04-21 16:03:52.63010032 +0000 UTC m=+124.375038109" lastFinishedPulling="2026-04-21 16:03:55.735067201 +0000 UTC m=+127.480004998" observedRunningTime="2026-04-21 16:03:56.300769792 +0000 UTC m=+128.045707598" watchObservedRunningTime="2026-04-21 16:03:56.300832565 +0000 UTC m=+128.045770369"
Apr 21 16:03:56.329609 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:56.329570 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qwrxl" podStartSLOduration=3.543067379 podStartE2EDuration="5.329558342s" podCreationTimestamp="2026-04-21 16:03:51 +0000 UTC" firstStartedPulling="2026-04-21 16:03:52.458158303 +0000 UTC m=+124.203096087" lastFinishedPulling="2026-04-21 16:03:54.244649267 +0000 UTC m=+125.989587050" observedRunningTime="2026-04-21 16:03:56.329002322 +0000 UTC m=+128.073940127" watchObservedRunningTime="2026-04-21 16:03:56.329558342 +0000 UTC m=+128.074496194"
Apr 21 16:03:56.424241 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:56.424193 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-r7jtg" podStartSLOduration=1.305981051 podStartE2EDuration="4.424179201s" podCreationTimestamp="2026-04-21 16:03:52 +0000 UTC" firstStartedPulling="2026-04-21 16:03:52.611250367 +0000 UTC m=+124.356188153" lastFinishedPulling="2026-04-21 16:03:55.729448508 +0000 UTC m=+127.474386303" observedRunningTime="2026-04-21 16:03:56.423535871 +0000 UTC m=+128.168473677" watchObservedRunningTime="2026-04-21 16:03:56.424179201 +0000 UTC m=+128.169117071"
Apr 21 16:03:57.279130 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.279096 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log"
Apr 21 16:03:57.279573 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.279557 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/0.log"
Apr 21 16:03:57.279626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.279592 2562 generic.go:358] "Generic (PLEG): container finished" podID="d142be23-d04a-4d93-a53c-ca2d3e8cd743" containerID="6072051082fdca3a0c1cd19e6d24244aa5684bf9c8a0e55741f5b532913ad479" exitCode=255
Apr 21 16:03:57.279736 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.279706 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" event={"ID":"d142be23-d04a-4d93-a53c-ca2d3e8cd743","Type":"ContainerDied","Data":"6072051082fdca3a0c1cd19e6d24244aa5684bf9c8a0e55741f5b532913ad479"}
Apr 21 16:03:57.279899 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.279753 2562 scope.go:117] "RemoveContainer" containerID="c268c9878a200450bfd5aaf364792b715496c689dd8b614fe0b6d2be50a2c18f"
Apr 21 16:03:57.280020 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.280002 2562 scope.go:117] "RemoveContainer" containerID="6072051082fdca3a0c1cd19e6d24244aa5684bf9c8a0e55741f5b532913ad479"
Apr 21 16:03:57.280252 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:03:57.280230 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-hg85z_openshift-console-operator(d142be23-d04a-4d93-a53c-ca2d3e8cd743)\"" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" podUID="d142be23-d04a-4d93-a53c-ca2d3e8cd743"
Apr 21 16:03:57.458294 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.458262 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h"]
Apr 21 16:03:57.460545 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.460529 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h"
Apr 21 16:03:57.464334 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.464315 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 21 16:03:57.466916 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.466597 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-6bqkx\""
Apr 21 16:03:57.467683 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.467663 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 21 16:03:57.484140 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.484116 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h"]
Apr 21 16:03:57.596398 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.596339 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzmj9\" (UniqueName: \"kubernetes.io/projected/72e648ce-5738-4b50-b6fe-add8dfdcb823-kube-api-access-gzmj9\") pod \"migrator-74bb7799d9-bqg6h\" (UID: \"72e648ce-5738-4b50-b6fe-add8dfdcb823\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h"
Apr 21 16:03:57.697165 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.697141 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzmj9\" (UniqueName: \"kubernetes.io/projected/72e648ce-5738-4b50-b6fe-add8dfdcb823-kube-api-access-gzmj9\") pod \"migrator-74bb7799d9-bqg6h\" (UID: \"72e648ce-5738-4b50-b6fe-add8dfdcb823\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h"
Apr 21 16:03:57.719068 ip-10-0-142-158
kubenswrapper[2562]: I0421 16:03:57.719039 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzmj9\" (UniqueName: \"kubernetes.io/projected/72e648ce-5738-4b50-b6fe-add8dfdcb823-kube-api-access-gzmj9\") pod \"migrator-74bb7799d9-bqg6h\" (UID: \"72e648ce-5738-4b50-b6fe-add8dfdcb823\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h" Apr 21 16:03:57.769229 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.769199 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h" Apr 21 16:03:57.895610 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:57.895544 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h"] Apr 21 16:03:57.898720 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:03:57.898685 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72e648ce_5738_4b50_b6fe_add8dfdcb823.slice/crio-ff0117697ce11f28f8e8d8e1bc3cf9c51e0b343c0d81df3b6438dffef77a84f4 WatchSource:0}: Error finding container ff0117697ce11f28f8e8d8e1bc3cf9c51e0b343c0d81df3b6438dffef77a84f4: Status 404 returned error can't find the container with id ff0117697ce11f28f8e8d8e1bc3cf9c51e0b343c0d81df3b6438dffef77a84f4 Apr 21 16:03:58.201195 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.201124 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5949g"] Apr 21 16:03:58.203614 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.203598 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5949g" Apr 21 16:03:58.223948 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.223929 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 16:03:58.224632 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.224617 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 16:03:58.227858 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.227842 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-cztpt\"" Apr 21 16:03:58.229623 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.229610 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 16:03:58.234522 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.234506 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 16:03:58.253406 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.253379 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5949g"] Apr 21 16:03:58.283575 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.283551 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h" event={"ID":"72e648ce-5738-4b50-b6fe-add8dfdcb823","Type":"ContainerStarted","Data":"ff0117697ce11f28f8e8d8e1bc3cf9c51e0b343c0d81df3b6438dffef77a84f4"} Apr 21 16:03:58.284823 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.284807 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:03:58.285129 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:03:58.285116 2562 scope.go:117] "RemoveContainer" containerID="6072051082fdca3a0c1cd19e6d24244aa5684bf9c8a0e55741f5b532913ad479" Apr 21 16:03:58.285277 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:03:58.285262 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-hg85z_openshift-console-operator(d142be23-d04a-4d93-a53c-ca2d3e8cd743)\"" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" podUID="d142be23-d04a-4d93-a53c-ca2d3e8cd743" Apr 21 16:03:58.300744 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.300721 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhxrf\" (UniqueName: \"kubernetes.io/projected/955977ef-3b78-4de3-ae8c-0dd086197eec-kube-api-access-fhxrf\") pod \"service-ca-865cb79987-5949g\" (UID: \"955977ef-3b78-4de3-ae8c-0dd086197eec\") " pod="openshift-service-ca/service-ca-865cb79987-5949g" Apr 21 16:03:58.300840 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.300746 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/955977ef-3b78-4de3-ae8c-0dd086197eec-signing-key\") pod \"service-ca-865cb79987-5949g\" (UID: \"955977ef-3b78-4de3-ae8c-0dd086197eec\") " pod="openshift-service-ca/service-ca-865cb79987-5949g" Apr 21 16:03:58.300840 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.300763 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/955977ef-3b78-4de3-ae8c-0dd086197eec-signing-cabundle\") pod \"service-ca-865cb79987-5949g\" (UID: \"955977ef-3b78-4de3-ae8c-0dd086197eec\") " pod="openshift-service-ca/service-ca-865cb79987-5949g" Apr 21 16:03:58.402137 
ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.402110 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhxrf\" (UniqueName: \"kubernetes.io/projected/955977ef-3b78-4de3-ae8c-0dd086197eec-kube-api-access-fhxrf\") pod \"service-ca-865cb79987-5949g\" (UID: \"955977ef-3b78-4de3-ae8c-0dd086197eec\") " pod="openshift-service-ca/service-ca-865cb79987-5949g" Apr 21 16:03:58.402137 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.402139 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/955977ef-3b78-4de3-ae8c-0dd086197eec-signing-key\") pod \"service-ca-865cb79987-5949g\" (UID: \"955977ef-3b78-4de3-ae8c-0dd086197eec\") " pod="openshift-service-ca/service-ca-865cb79987-5949g" Apr 21 16:03:58.402307 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.402156 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/955977ef-3b78-4de3-ae8c-0dd086197eec-signing-cabundle\") pod \"service-ca-865cb79987-5949g\" (UID: \"955977ef-3b78-4de3-ae8c-0dd086197eec\") " pod="openshift-service-ca/service-ca-865cb79987-5949g" Apr 21 16:03:58.403157 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.403122 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/955977ef-3b78-4de3-ae8c-0dd086197eec-signing-cabundle\") pod \"service-ca-865cb79987-5949g\" (UID: \"955977ef-3b78-4de3-ae8c-0dd086197eec\") " pod="openshift-service-ca/service-ca-865cb79987-5949g" Apr 21 16:03:58.405053 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.405032 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/955977ef-3b78-4de3-ae8c-0dd086197eec-signing-key\") pod \"service-ca-865cb79987-5949g\" (UID: \"955977ef-3b78-4de3-ae8c-0dd086197eec\") " 
pod="openshift-service-ca/service-ca-865cb79987-5949g" Apr 21 16:03:58.417081 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.417057 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhxrf\" (UniqueName: \"kubernetes.io/projected/955977ef-3b78-4de3-ae8c-0dd086197eec-kube-api-access-fhxrf\") pod \"service-ca-865cb79987-5949g\" (UID: \"955977ef-3b78-4de3-ae8c-0dd086197eec\") " pod="openshift-service-ca/service-ca-865cb79987-5949g" Apr 21 16:03:58.512399 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.512374 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5949g" Apr 21 16:03:58.604669 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.604633 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:03:58.605149 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:03:58.604892 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 16:03:58.605149 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:03:58.604970 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs podName:e022d7cd-e433-4f58-8b33-7c830d23f95c nodeName:}" failed. No retries permitted until 2026-04-21 16:06:00.60494812 +0000 UTC m=+252.349885904 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs") pod "network-metrics-daemon-rg8v9" (UID: "e022d7cd-e433-4f58-8b33-7c830d23f95c") : secret "metrics-daemon-secret" not found Apr 21 16:03:58.654641 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:58.654577 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5949g"] Apr 21 16:03:58.657817 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:03:58.657775 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955977ef_3b78_4de3_ae8c_0dd086197eec.slice/crio-20a0ae7dc976bf3a35886bbfb0fb34b1efdeed8cc17daf265f2d19a91dc863d3 WatchSource:0}: Error finding container 20a0ae7dc976bf3a35886bbfb0fb34b1efdeed8cc17daf265f2d19a91dc863d3: Status 404 returned error can't find the container with id 20a0ae7dc976bf3a35886bbfb0fb34b1efdeed8cc17daf265f2d19a91dc863d3 Apr 21 16:03:59.171508 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:59.171477 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4dmnb_1a05e8cf-847c-48cc-802b-171bcb5dea76/dns-node-resolver/0.log" Apr 21 16:03:59.289804 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:59.289681 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h" event={"ID":"72e648ce-5738-4b50-b6fe-add8dfdcb823","Type":"ContainerStarted","Data":"1f166cad296dab40de1f2d731894bbcfd8c8bc04405dc92b9dff00fa49946116"} Apr 21 16:03:59.289804 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:59.289723 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h" event={"ID":"72e648ce-5738-4b50-b6fe-add8dfdcb823","Type":"ContainerStarted","Data":"405a3335b69233549d8090b8836ee14700fd524c95d5d5f5aa1444c696742b1c"} Apr 21 
16:03:59.290715 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:59.290673 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5949g" event={"ID":"955977ef-3b78-4de3-ae8c-0dd086197eec","Type":"ContainerStarted","Data":"20a0ae7dc976bf3a35886bbfb0fb34b1efdeed8cc17daf265f2d19a91dc863d3"} Apr 21 16:03:59.337815 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:03:59.337753 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-bqg6h" podStartSLOduration=1.210796334 podStartE2EDuration="2.337741442s" podCreationTimestamp="2026-04-21 16:03:57 +0000 UTC" firstStartedPulling="2026-04-21 16:03:57.900462467 +0000 UTC m=+129.645400250" lastFinishedPulling="2026-04-21 16:03:59.02740757 +0000 UTC m=+130.772345358" observedRunningTime="2026-04-21 16:03:59.333804459 +0000 UTC m=+131.078742259" watchObservedRunningTime="2026-04-21 16:03:59.337741442 +0000 UTC m=+131.082679248" Apr 21 16:04:00.172654 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:00.172624 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-chrww_511124f1-f198-4d6c-9713-d6f1375957e5/node-ca/0.log" Apr 21 16:04:01.299575 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:01.299536 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5949g" event={"ID":"955977ef-3b78-4de3-ae8c-0dd086197eec","Type":"ContainerStarted","Data":"6a7c8a0587fe242a120f284b796dfecdad74a09f8538f8a455e20b540e52c77f"} Apr 21 16:04:02.610232 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:02.610192 2562 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" Apr 21 16:04:02.610232 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:02.610239 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" Apr 21 16:04:02.610721 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:02.610675 2562 scope.go:117] "RemoveContainer" containerID="6072051082fdca3a0c1cd19e6d24244aa5684bf9c8a0e55741f5b532913ad479" Apr 21 16:04:02.610914 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:04:02.610890 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-hg85z_openshift-console-operator(d142be23-d04a-4d93-a53c-ca2d3e8cd743)\"" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" podUID="d142be23-d04a-4d93-a53c-ca2d3e8cd743" Apr 21 16:04:16.906698 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:16.906662 2562 scope.go:117] "RemoveContainer" containerID="6072051082fdca3a0c1cd19e6d24244aa5684bf9c8a0e55741f5b532913ad479" Apr 21 16:04:17.341832 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:17.341802 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:04:17.341990 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:17.341866 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" event={"ID":"d142be23-d04a-4d93-a53c-ca2d3e8cd743","Type":"ContainerStarted","Data":"43642bd293c0dcae2f1d3132d732e4b5a43e4cf13230a0ef0e6933e81732b62a"} Apr 21 16:04:17.342208 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:17.342185 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" Apr 21 16:04:17.366123 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:17.366081 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca/service-ca-865cb79987-5949g" podStartSLOduration=17.760640141 podStartE2EDuration="19.366069834s" podCreationTimestamp="2026-04-21 16:03:58 +0000 UTC" firstStartedPulling="2026-04-21 16:03:58.659689184 +0000 UTC m=+130.404626970" lastFinishedPulling="2026-04-21 16:04:00.265118877 +0000 UTC m=+132.010056663" observedRunningTime="2026-04-21 16:04:01.354101546 +0000 UTC m=+133.099039351" watchObservedRunningTime="2026-04-21 16:04:17.366069834 +0000 UTC m=+149.111007638" Apr 21 16:04:17.366339 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:17.366308 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" podStartSLOduration=22.407452526 podStartE2EDuration="25.366302107s" podCreationTimestamp="2026-04-21 16:03:52 +0000 UTC" firstStartedPulling="2026-04-21 16:03:52.776376488 +0000 UTC m=+124.521314271" lastFinishedPulling="2026-04-21 16:03:55.735226069 +0000 UTC m=+127.480163852" observedRunningTime="2026-04-21 16:04:17.365024824 +0000 UTC m=+149.109962623" watchObservedRunningTime="2026-04-21 16:04:17.366302107 +0000 UTC m=+149.111239911" Apr 21 16:04:17.546927 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:17.546897 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-hg85z" Apr 21 16:04:25.162348 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:04:25.162303 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-lxgkn" podUID="8648a8db-b8ad-409e-ae80-85c058398baf" Apr 21 16:04:25.193726 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:04:25.193692 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-dns/dns-default-p8j5k" podUID="50caee65-e2ab-4233-a2b5-e5ea4a951bed" Apr 21 16:04:25.366242 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:25.363987 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lxgkn" Apr 21 16:04:25.929363 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:04:25.929326 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rg8v9" podUID="e022d7cd-e433-4f58-8b33-7c830d23f95c" Apr 21 16:04:26.958459 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:26.958429 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d"] Apr 21 16:04:26.960936 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:26.960916 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d" Apr 21 16:04:26.962329 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:26.962305 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-sstpk"] Apr 21 16:04:26.964179 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:26.964158 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-sstpk" Apr 21 16:04:26.964393 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:26.964371 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 16:04:26.964630 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:26.964617 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-lmmkq\"" Apr 21 16:04:26.966974 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:26.966953 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 16:04:26.968087 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:26.968072 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 16:04:26.976464 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:26.976446 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-qpmqj\"" Apr 21 16:04:26.990514 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:26.990496 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-sstpk"] Apr 21 16:04:26.992187 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:26.992166 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d"] Apr 21 16:04:27.007428 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.007405 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a2c7f21d-b221-4b60-8736-1cf4fb90d7eb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8b82d\" (UID: \"a2c7f21d-b221-4b60-8736-1cf4fb90d7eb\") 
" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d" Apr 21 16:04:27.007589 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.007444 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fkj4\" (UniqueName: \"kubernetes.io/projected/05d9b5c2-543c-4bc0-a92a-c8433467bc7a-kube-api-access-2fkj4\") pod \"downloads-6bcc868b7-sstpk\" (UID: \"05d9b5c2-543c-4bc0-a92a-c8433467bc7a\") " pod="openshift-console/downloads-6bcc868b7-sstpk" Apr 21 16:04:27.065458 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.065431 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-h4qvj"] Apr 21 16:04:27.067869 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.067854 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.075645 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.075620 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rdjm8\"" Apr 21 16:04:27.075739 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.075652 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 16:04:27.075990 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.075971 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 16:04:27.087200 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.087174 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h4qvj"] Apr 21 16:04:27.108002 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.107976 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msw69\" 
(UniqueName: \"kubernetes.io/projected/d84cf002-59f6-43e4-991d-f3cae3707de3-kube-api-access-msw69\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.108104 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.108015 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a2c7f21d-b221-4b60-8736-1cf4fb90d7eb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8b82d\" (UID: \"a2c7f21d-b221-4b60-8736-1cf4fb90d7eb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d" Apr 21 16:04:27.108104 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.108047 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d84cf002-59f6-43e4-991d-f3cae3707de3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.108104 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.108083 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d84cf002-59f6-43e4-991d-f3cae3707de3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.108227 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.108173 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d84cf002-59f6-43e4-991d-f3cae3707de3-data-volume\") pod \"insights-runtime-extractor-h4qvj\" (UID: 
\"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.108227 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.108203 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fkj4\" (UniqueName: \"kubernetes.io/projected/05d9b5c2-543c-4bc0-a92a-c8433467bc7a-kube-api-access-2fkj4\") pod \"downloads-6bcc868b7-sstpk\" (UID: \"05d9b5c2-543c-4bc0-a92a-c8433467bc7a\") " pod="openshift-console/downloads-6bcc868b7-sstpk" Apr 21 16:04:27.108308 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.108226 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d84cf002-59f6-43e4-991d-f3cae3707de3-crio-socket\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.110318 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.110299 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a2c7f21d-b221-4b60-8736-1cf4fb90d7eb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8b82d\" (UID: \"a2c7f21d-b221-4b60-8736-1cf4fb90d7eb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d" Apr 21 16:04:27.133772 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.133747 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fkj4\" (UniqueName: \"kubernetes.io/projected/05d9b5c2-543c-4bc0-a92a-c8433467bc7a-kube-api-access-2fkj4\") pod \"downloads-6bcc868b7-sstpk\" (UID: \"05d9b5c2-543c-4bc0-a92a-c8433467bc7a\") " pod="openshift-console/downloads-6bcc868b7-sstpk" Apr 21 16:04:27.191275 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.191246 2562 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-d5497b59c-rnrbh"] Apr 21 16:04:27.193207 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.193191 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.197662 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.197640 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 16:04:27.197662 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.197653 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 16:04:27.197832 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.197740 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 16:04:27.198268 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.198253 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vqm4s\"" Apr 21 16:04:27.204451 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.204432 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 16:04:27.209143 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209099 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a7cca86f-2296-47d4-9cc4-403c90fded3d-registry-tls\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.209143 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209135 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d84cf002-59f6-43e4-991d-f3cae3707de3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.209241 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209162 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d84cf002-59f6-43e4-991d-f3cae3707de3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.209296 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209276 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7cca86f-2296-47d4-9cc4-403c90fded3d-bound-sa-token\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.209331 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209319 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a7cca86f-2296-47d4-9cc4-403c90fded3d-ca-trust-extracted\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.209370 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209354 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d84cf002-59f6-43e4-991d-f3cae3707de3-data-volume\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " 
pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.209414 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209388 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a7cca86f-2296-47d4-9cc4-403c90fded3d-installation-pull-secrets\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.209453 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209420 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d84cf002-59f6-43e4-991d-f3cae3707de3-crio-socket\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.209500 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209455 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a7cca86f-2296-47d4-9cc4-403c90fded3d-registry-certificates\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.209569 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209547 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvnt8\" (UniqueName: \"kubernetes.io/projected/a7cca86f-2296-47d4-9cc4-403c90fded3d-kube-api-access-tvnt8\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.209641 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209626 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a7cca86f-2296-47d4-9cc4-403c90fded3d-image-registry-private-configuration\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.209702 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209638 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d84cf002-59f6-43e4-991d-f3cae3707de3-data-volume\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.209702 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209552 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d84cf002-59f6-43e4-991d-f3cae3707de3-crio-socket\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.209702 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209676 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msw69\" (UniqueName: \"kubernetes.io/projected/d84cf002-59f6-43e4-991d-f3cae3707de3-kube-api-access-msw69\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.209702 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209682 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d84cf002-59f6-43e4-991d-f3cae3707de3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") 
" pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.209919 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.209703 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7cca86f-2296-47d4-9cc4-403c90fded3d-trusted-ca\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.211501 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.211475 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d84cf002-59f6-43e4-991d-f3cae3707de3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.213504 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.213482 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d5497b59c-rnrbh"] Apr 21 16:04:27.236732 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.236705 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msw69\" (UniqueName: \"kubernetes.io/projected/d84cf002-59f6-43e4-991d-f3cae3707de3-kube-api-access-msw69\") pod \"insights-runtime-extractor-h4qvj\" (UID: \"d84cf002-59f6-43e4-991d-f3cae3707de3\") " pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.271492 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.271474 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d" Apr 21 16:04:27.276149 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.276132 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-sstpk" Apr 21 16:04:27.310311 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.310281 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a7cca86f-2296-47d4-9cc4-403c90fded3d-image-registry-private-configuration\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.310434 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.310330 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7cca86f-2296-47d4-9cc4-403c90fded3d-trusted-ca\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.310528 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.310507 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a7cca86f-2296-47d4-9cc4-403c90fded3d-registry-tls\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.310769 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.310700 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7cca86f-2296-47d4-9cc4-403c90fded3d-bound-sa-token\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.310886 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.310817 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/a7cca86f-2296-47d4-9cc4-403c90fded3d-ca-trust-extracted\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.310946 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.310895 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a7cca86f-2296-47d4-9cc4-403c90fded3d-installation-pull-secrets\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.311001 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.310965 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a7cca86f-2296-47d4-9cc4-403c90fded3d-registry-certificates\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.311054 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.311032 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvnt8\" (UniqueName: \"kubernetes.io/projected/a7cca86f-2296-47d4-9cc4-403c90fded3d-kube-api-access-tvnt8\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.311242 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.311215 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a7cca86f-2296-47d4-9cc4-403c90fded3d-ca-trust-extracted\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " 
pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.311469 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.311444 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a7cca86f-2296-47d4-9cc4-403c90fded3d-trusted-ca\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.311814 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.311763 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a7cca86f-2296-47d4-9cc4-403c90fded3d-registry-certificates\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.313764 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.313744 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a7cca86f-2296-47d4-9cc4-403c90fded3d-installation-pull-secrets\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.313921 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.313899 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a7cca86f-2296-47d4-9cc4-403c90fded3d-image-registry-private-configuration\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.314058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.314038 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/a7cca86f-2296-47d4-9cc4-403c90fded3d-registry-tls\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.327720 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.327672 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a7cca86f-2296-47d4-9cc4-403c90fded3d-bound-sa-token\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.337566 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.337403 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvnt8\" (UniqueName: \"kubernetes.io/projected/a7cca86f-2296-47d4-9cc4-403c90fded3d-kube-api-access-tvnt8\") pod \"image-registry-d5497b59c-rnrbh\" (UID: \"a7cca86f-2296-47d4-9cc4-403c90fded3d\") " pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.376777 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.376750 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-h4qvj" Apr 21 16:04:27.439866 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.439840 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d"] Apr 21 16:04:27.441157 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:27.441130 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c7f21d_b221_4b60_8736_1cf4fb90d7eb.slice/crio-c0583f9bf91be7e98613044754bb753c10df8a619009bca657b9a5608dc1578e WatchSource:0}: Error finding container c0583f9bf91be7e98613044754bb753c10df8a619009bca657b9a5608dc1578e: Status 404 returned error can't find the container with id c0583f9bf91be7e98613044754bb753c10df8a619009bca657b9a5608dc1578e Apr 21 16:04:27.471003 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:27.470976 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05d9b5c2_543c_4bc0_a92a_c8433467bc7a.slice/crio-f540e4351800ace5979002bd4ac80d23e4f29516c7c6914375c0a2cfe4d296b9 WatchSource:0}: Error finding container f540e4351800ace5979002bd4ac80d23e4f29516c7c6914375c0a2cfe4d296b9: Status 404 returned error can't find the container with id f540e4351800ace5979002bd4ac80d23e4f29516c7c6914375c0a2cfe4d296b9 Apr 21 16:04:27.473033 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.473007 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-sstpk"] Apr 21 16:04:27.502454 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.502427 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:27.538222 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.538194 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-h4qvj"] Apr 21 16:04:27.541571 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:27.541548 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd84cf002_59f6_43e4_991d_f3cae3707de3.slice/crio-13860d3c66806905ac0cf6cd433110fcc8b7e0d838a1d6e0347b8ffc90128565 WatchSource:0}: Error finding container 13860d3c66806905ac0cf6cd433110fcc8b7e0d838a1d6e0347b8ffc90128565: Status 404 returned error can't find the container with id 13860d3c66806905ac0cf6cd433110fcc8b7e0d838a1d6e0347b8ffc90128565 Apr 21 16:04:27.643669 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:27.643641 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d5497b59c-rnrbh"] Apr 21 16:04:27.646214 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:27.646188 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7cca86f_2296_47d4_9cc4_403c90fded3d.slice/crio-d5ed57c397d33bfac88b5c4dc37e85f8206dccb8e19fcfef95b28a0350cd60f2 WatchSource:0}: Error finding container d5ed57c397d33bfac88b5c4dc37e85f8206dccb8e19fcfef95b28a0350cd60f2: Status 404 returned error can't find the container with id d5ed57c397d33bfac88b5c4dc37e85f8206dccb8e19fcfef95b28a0350cd60f2 Apr 21 16:04:28.374421 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:28.374382 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4qvj" event={"ID":"d84cf002-59f6-43e4-991d-f3cae3707de3","Type":"ContainerStarted","Data":"83925ad503070e2cb99d7dfeb5aada9dbe20c43f0cae73dd481feff1401f89a9"} Apr 21 16:04:28.374421 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:04:28.374430 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4qvj" event={"ID":"d84cf002-59f6-43e4-991d-f3cae3707de3","Type":"ContainerStarted","Data":"aa602104e58c99831b0a69a5b6747e584caac498d0c6d041737ee91c16bc7934"} Apr 21 16:04:28.374939 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:28.374445 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4qvj" event={"ID":"d84cf002-59f6-43e4-991d-f3cae3707de3","Type":"ContainerStarted","Data":"13860d3c66806905ac0cf6cd433110fcc8b7e0d838a1d6e0347b8ffc90128565"} Apr 21 16:04:28.375713 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:28.375685 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-sstpk" event={"ID":"05d9b5c2-543c-4bc0-a92a-c8433467bc7a","Type":"ContainerStarted","Data":"f540e4351800ace5979002bd4ac80d23e4f29516c7c6914375c0a2cfe4d296b9"} Apr 21 16:04:28.377388 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:28.377357 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" event={"ID":"a7cca86f-2296-47d4-9cc4-403c90fded3d","Type":"ContainerStarted","Data":"fe3c2845b70c64f859e7b05361704531eb3876f5c45b0874672b99236b5d5193"} Apr 21 16:04:28.377481 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:28.377392 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" event={"ID":"a7cca86f-2296-47d4-9cc4-403c90fded3d","Type":"ContainerStarted","Data":"d5ed57c397d33bfac88b5c4dc37e85f8206dccb8e19fcfef95b28a0350cd60f2"} Apr 21 16:04:28.377542 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:28.377489 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:28.378575 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:28.378548 
2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d" event={"ID":"a2c7f21d-b221-4b60-8736-1cf4fb90d7eb","Type":"ContainerStarted","Data":"c0583f9bf91be7e98613044754bb753c10df8a619009bca657b9a5608dc1578e"} Apr 21 16:04:28.423146 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:28.422894 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" podStartSLOduration=1.422878527 podStartE2EDuration="1.422878527s" podCreationTimestamp="2026-04-21 16:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:04:28.421126745 +0000 UTC m=+160.166064549" watchObservedRunningTime="2026-04-21 16:04:28.422878527 +0000 UTC m=+160.167816310" Apr 21 16:04:29.384148 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:29.384107 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d" event={"ID":"a2c7f21d-b221-4b60-8736-1cf4fb90d7eb","Type":"ContainerStarted","Data":"dbd3bcd30a04ee061c3ad3cdc1fa6fe16a43e495b5e0be3e475f3efd073a3c9d"} Apr 21 16:04:29.384601 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:29.384412 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d" Apr 21 16:04:29.390577 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:29.390528 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d" Apr 21 16:04:29.422817 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:29.422728 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8b82d" 
podStartSLOduration=2.074213176 podStartE2EDuration="3.422714568s" podCreationTimestamp="2026-04-21 16:04:26 +0000 UTC" firstStartedPulling="2026-04-21 16:04:27.443236733 +0000 UTC m=+159.188174517" lastFinishedPulling="2026-04-21 16:04:28.791738113 +0000 UTC m=+160.536675909" observedRunningTime="2026-04-21 16:04:29.421694038 +0000 UTC m=+161.166631843" watchObservedRunningTime="2026-04-21 16:04:29.422714568 +0000 UTC m=+161.167652370" Apr 21 16:04:30.136696 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:30.136662 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k" Apr 21 16:04:30.136892 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:30.136749 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn" Apr 21 16:04:30.139163 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:30.139135 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50caee65-e2ab-4233-a2b5-e5ea4a951bed-metrics-tls\") pod \"dns-default-p8j5k\" (UID: \"50caee65-e2ab-4233-a2b5-e5ea4a951bed\") " pod="openshift-dns/dns-default-p8j5k" Apr 21 16:04:30.139393 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:30.139375 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8648a8db-b8ad-409e-ae80-85c058398baf-cert\") pod \"ingress-canary-lxgkn\" (UID: \"8648a8db-b8ad-409e-ae80-85c058398baf\") " pod="openshift-ingress-canary/ingress-canary-lxgkn" Apr 21 16:04:30.173337 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:04:30.173295 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76wnm\"" Apr 21 16:04:30.175347 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:30.175329 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lxgkn" Apr 21 16:04:30.322418 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:30.322389 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lxgkn"] Apr 21 16:04:30.397543 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:30.397479 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8648a8db_b8ad_409e_ae80_85c058398baf.slice/crio-257cd0bdc6b2aaea77c886117e10ccf92848bdb34191c0bfe009a26fcc78bde2 WatchSource:0}: Error finding container 257cd0bdc6b2aaea77c886117e10ccf92848bdb34191c0bfe009a26fcc78bde2: Status 404 returned error can't find the container with id 257cd0bdc6b2aaea77c886117e10ccf92848bdb34191c0bfe009a26fcc78bde2 Apr 21 16:04:31.390444 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:31.390407 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lxgkn" event={"ID":"8648a8db-b8ad-409e-ae80-85c058398baf","Type":"ContainerStarted","Data":"257cd0bdc6b2aaea77c886117e10ccf92848bdb34191c0bfe009a26fcc78bde2"} Apr 21 16:04:31.392762 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:31.392729 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-h4qvj" event={"ID":"d84cf002-59f6-43e4-991d-f3cae3707de3","Type":"ContainerStarted","Data":"89518765ec6c67d602ca83c0ce3baf50a792533cd07bf4d80ebb6cce2d61e554"} Apr 21 16:04:32.087979 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.087924 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-runtime-extractor-h4qvj" podStartSLOduration=2.262746187 podStartE2EDuration="5.087902882s" podCreationTimestamp="2026-04-21 16:04:27 +0000 UTC" firstStartedPulling="2026-04-21 16:04:27.605869973 +0000 UTC m=+159.350807756" lastFinishedPulling="2026-04-21 16:04:30.431026663 +0000 UTC m=+162.175964451" observedRunningTime="2026-04-21 16:04:31.439410437 +0000 UTC m=+163.184348261" watchObservedRunningTime="2026-04-21 16:04:32.087902882 +0000 UTC m=+163.832840692" Apr 21 16:04:32.089027 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.089001 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68bfcc5c9f-s2l52"] Apr 21 16:04:32.118645 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.118439 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68bfcc5c9f-s2l52"] Apr 21 16:04:32.118645 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.118576 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68bfcc5c9f-s2l52" Apr 21 16:04:32.123155 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.123130 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 16:04:32.123559 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.123520 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-9tkts\"" Apr 21 16:04:32.123833 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.123815 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 16:04:32.125106 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.124532 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 16:04:32.125106 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.124567 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 16:04:32.125106 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.124820 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 16:04:32.131411 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.131386 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 16:04:32.157581 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.157548 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-oauth-config\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52" Apr 21 16:04:32.157759 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:04:32.157615 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-service-ca\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.157759 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.157654 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-config\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.157759 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.157681 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-serving-cert\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.157981 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.157774 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-trusted-ca-bundle\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.157981 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.157814 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88mt8\" (UniqueName: \"kubernetes.io/projected/29a2cb07-76ea-41df-a7e6-f42b090d7d77-kube-api-access-88mt8\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.157981 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.157843 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-oauth-serving-cert\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.259143 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.259114 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-trusted-ca-bundle\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.259143 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.259148 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88mt8\" (UniqueName: \"kubernetes.io/projected/29a2cb07-76ea-41df-a7e6-f42b090d7d77-kube-api-access-88mt8\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.259360 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.259172 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-oauth-serving-cert\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.259360 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.259208 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-oauth-config\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.259360 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.259252 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-service-ca\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.259360 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.259307 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-config\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.259360 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.259332 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-serving-cert\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.259967 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.259918 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-oauth-serving-cert\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.260420 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.260393 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-config\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.260822 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.260803 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-service-ca\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.260999 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.260978 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-trusted-ca-bundle\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.261998 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.261977 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-oauth-config\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.262346 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.262326 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-serving-cert\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.273521 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.273479 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mt8\" (UniqueName: \"kubernetes.io/projected/29a2cb07-76ea-41df-a7e6-f42b090d7d77-kube-api-access-88mt8\") pod \"console-68bfcc5c9f-s2l52\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.397569 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.397484 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lxgkn" event={"ID":"8648a8db-b8ad-409e-ae80-85c058398baf","Type":"ContainerStarted","Data":"bba263f36003875997802b9ad011812f93a6c648f1cd238c4b8e8ea00157a17c"}
Apr 21 16:04:32.423898 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.423850 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lxgkn" podStartSLOduration=128.571251406 podStartE2EDuration="2m10.423832315s" podCreationTimestamp="2026-04-21 16:02:22 +0000 UTC" firstStartedPulling="2026-04-21 16:04:30.42607933 +0000 UTC m=+162.171017117" lastFinishedPulling="2026-04-21 16:04:32.278660241 +0000 UTC m=+164.023598026" observedRunningTime="2026-04-21 16:04:32.423509 +0000 UTC m=+164.168446803" watchObservedRunningTime="2026-04-21 16:04:32.423832315 +0000 UTC m=+164.168770120"
Apr 21 16:04:32.432361 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.432334 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68bfcc5c9f-s2l52"
Apr 21 16:04:32.576562 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:32.576522 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68bfcc5c9f-s2l52"]
Apr 21 16:04:32.581333 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:32.581301 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29a2cb07_76ea_41df_a7e6_f42b090d7d77.slice/crio-3f08b0f2d1e3488d2292ebf4a10c2aefb80f9d33e1536a8ecc5894c3c696c37c WatchSource:0}: Error finding container 3f08b0f2d1e3488d2292ebf4a10c2aefb80f9d33e1536a8ecc5894c3c696c37c: Status 404 returned error can't find the container with id 3f08b0f2d1e3488d2292ebf4a10c2aefb80f9d33e1536a8ecc5894c3c696c37c
Apr 21 16:04:33.401634 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:33.401591 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68bfcc5c9f-s2l52" event={"ID":"29a2cb07-76ea-41df-a7e6-f42b090d7d77","Type":"ContainerStarted","Data":"3f08b0f2d1e3488d2292ebf4a10c2aefb80f9d33e1536a8ecc5894c3c696c37c"}
Apr 21 16:04:36.127901 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.127868 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jbm6m"]
Apr 21 16:04:36.131640 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.131614 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.139639 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.139616 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 16:04:36.198024 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.197954 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/72dfee7f-8b16-43e2-860b-0ef8b4a63261-root\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.198024 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.198002 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.198249 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.198046 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-accelerators-collector-config\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.198249 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.198108 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-tls\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.198249 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.198135 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/72dfee7f-8b16-43e2-860b-0ef8b4a63261-metrics-client-ca\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.198249 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.198200 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-textfile\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.198249 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.198225 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxm2f\" (UniqueName: \"kubernetes.io/projected/72dfee7f-8b16-43e2-860b-0ef8b4a63261-kube-api-access-nxm2f\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.198491 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.198259 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72dfee7f-8b16-43e2-860b-0ef8b4a63261-sys\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.198491 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.198284 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-wtmp\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.226961 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.226750 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 16:04:36.226961 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.226777 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 16:04:36.227473 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.227447 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mt6fm\""
Apr 21 16:04:36.234728 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.234560 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 16:04:36.243392 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.243362 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 16:04:36.254825 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.254805 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 16:04:36.299049 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299027 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-tls\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299169 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299066 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/72dfee7f-8b16-43e2-860b-0ef8b4a63261-metrics-client-ca\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299169 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299121 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-textfile\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299169 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299146 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxm2f\" (UniqueName: \"kubernetes.io/projected/72dfee7f-8b16-43e2-860b-0ef8b4a63261-kube-api-access-nxm2f\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299320 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:04:36.299179 2562 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 21 16:04:36.299320 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:04:36.299247 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-tls podName:72dfee7f-8b16-43e2-860b-0ef8b4a63261 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:36.799226644 +0000 UTC m=+168.544164443 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-tls") pod "node-exporter-jbm6m" (UID: "72dfee7f-8b16-43e2-860b-0ef8b4a63261") : secret "node-exporter-tls" not found
Apr 21 16:04:36.299443 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299350 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72dfee7f-8b16-43e2-860b-0ef8b4a63261-sys\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299443 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299377 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-wtmp\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299443 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299406 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/72dfee7f-8b16-43e2-860b-0ef8b4a63261-root\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299443 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299431 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299629 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299455 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-accelerators-collector-config\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299629 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299503 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-textfile\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299629 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299552 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-wtmp\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299629 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299600 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72dfee7f-8b16-43e2-860b-0ef8b4a63261-sys\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.299994 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.299637 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/72dfee7f-8b16-43e2-860b-0ef8b4a63261-root\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.300058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.300040 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/72dfee7f-8b16-43e2-860b-0ef8b4a63261-metrics-client-ca\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.300118 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.300061 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-accelerators-collector-config\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.302770 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.302500 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.326986 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.326930 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxm2f\" (UniqueName: \"kubernetes.io/projected/72dfee7f-8b16-43e2-860b-0ef8b4a63261-kube-api-access-nxm2f\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.413318 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.413246 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68bfcc5c9f-s2l52" event={"ID":"29a2cb07-76ea-41df-a7e6-f42b090d7d77","Type":"ContainerStarted","Data":"ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811"}
Apr 21 16:04:36.586349 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.586291 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68bfcc5c9f-s2l52" podStartSLOduration=1.633153864 podStartE2EDuration="4.58627498s" podCreationTimestamp="2026-04-21 16:04:32 +0000 UTC" firstStartedPulling="2026-04-21 16:04:32.58374311 +0000 UTC m=+164.328680892" lastFinishedPulling="2026-04-21 16:04:35.536864225 +0000 UTC m=+167.281802008" observedRunningTime="2026-04-21 16:04:36.585519545 +0000 UTC m=+168.330457348" watchObservedRunningTime="2026-04-21 16:04:36.58627498 +0000 UTC m=+168.331212785"
Apr 21 16:04:36.803800 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.803755 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-tls\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.806387 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.806361 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/72dfee7f-8b16-43e2-860b-0ef8b4a63261-node-exporter-tls\") pod \"node-exporter-jbm6m\" (UID: \"72dfee7f-8b16-43e2-860b-0ef8b4a63261\") " pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:36.906863 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:36.906834 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9"
Apr 21 16:04:37.044718 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.044674 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jbm6m"
Apr 21 16:04:37.045822 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.045416 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 16:04:37.048547 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.048528 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.058950 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.058824 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 21 16:04:37.058950 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.058836 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6hzrx\""
Apr 21 16:04:37.059409 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.059209 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 21 16:04:37.059409 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.059224 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 21 16:04:37.059544 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.059526 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 21 16:04:37.059544 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.059538 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 21 16:04:37.059669 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.059646 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 21 16:04:37.059776 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.059654 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 21 16:04:37.065160 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.065142 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 21 16:04:37.069073 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.069055 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 21 16:04:37.078990 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.078112 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 16:04:37.106211 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106182 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106332 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106235 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106332 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106291 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106429 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106356 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-out\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106429 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106391 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106429 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106416 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29hzz\" (UniqueName: \"kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-kube-api-access-29hzz\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106561 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106450 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106561 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106472 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-web-config\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106561 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106502 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106697 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106614 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-volume\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106697 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106650 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106777 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106726 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.106777 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.106752 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207474 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207430 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207482 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-out\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207512 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207537 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29hzz\" (UniqueName: \"kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-kube-api-access-29hzz\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207569 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207624 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-web-config\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207654 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207704 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-volume\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207728 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207826 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207852 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207881 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 16:04:37.207944 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.207913 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName:
\"kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.208613 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.208587 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.209089 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.209022 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.210579 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.210549 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.211429 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.211382 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.211531 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.211481 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.212287 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.212200 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.212287 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.212251 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.212438 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.212288 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.212885 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.212844 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-out\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.212964 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.212882 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.213662 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.213642 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-volume\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.213797 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.213762 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-web-config\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.220570 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.220548 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29hzz\" (UniqueName: \"kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-kube-api-access-29hzz\") pod \"alertmanager-main-0\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:37.360380 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:37.360270 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:04:38.908844 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:38.908810 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-p8j5k" Apr 21 16:04:38.911818 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:38.911771 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2l66p\"" Apr 21 16:04:38.919537 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:38.919518 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p8j5k" Apr 21 16:04:40.790353 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.790317 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5bfd5c857b-fhtvj"] Apr 21 16:04:40.793544 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.793519 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.796109 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.796083 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 16:04:40.796315 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.796129 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 16:04:40.796436 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.796421 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 16:04:40.797314 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.797284 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-lszqb\"" Apr 21 16:04:40.797453 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.797360 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-ap4ogfij8c1jm\"" Apr 21 16:04:40.797453 
ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.797376 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 16:04:40.804875 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.804848 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5bfd5c857b-fhtvj"] Apr 21 16:04:40.846252 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.846223 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f90b4ba6-859d-43df-8063-4b7311a0faaa-secret-metrics-server-tls\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.846355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.846266 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f90b4ba6-859d-43df-8063-4b7311a0faaa-audit-log\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.846355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.846298 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f90b4ba6-859d-43df-8063-4b7311a0faaa-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.846355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.846332 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn85n\" (UniqueName: 
\"kubernetes.io/projected/f90b4ba6-859d-43df-8063-4b7311a0faaa-kube-api-access-gn85n\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.846511 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.846355 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f90b4ba6-859d-43df-8063-4b7311a0faaa-client-ca-bundle\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.846511 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.846389 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f90b4ba6-859d-43df-8063-4b7311a0faaa-metrics-server-audit-profiles\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.846511 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.846441 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f90b4ba6-859d-43df-8063-4b7311a0faaa-secret-metrics-server-client-certs\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.947347 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.947299 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f90b4ba6-859d-43df-8063-4b7311a0faaa-secret-metrics-server-tls\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: 
\"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.947347 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.947351 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f90b4ba6-859d-43df-8063-4b7311a0faaa-audit-log\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.947596 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.947378 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f90b4ba6-859d-43df-8063-4b7311a0faaa-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.947596 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.947409 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gn85n\" (UniqueName: \"kubernetes.io/projected/f90b4ba6-859d-43df-8063-4b7311a0faaa-kube-api-access-gn85n\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.947596 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.947457 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f90b4ba6-859d-43df-8063-4b7311a0faaa-client-ca-bundle\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.947596 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.947484 2562 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f90b4ba6-859d-43df-8063-4b7311a0faaa-metrics-server-audit-profiles\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.947596 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.947518 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f90b4ba6-859d-43df-8063-4b7311a0faaa-secret-metrics-server-client-certs\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.948551 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.948526 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f90b4ba6-859d-43df-8063-4b7311a0faaa-audit-log\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.949010 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.948981 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f90b4ba6-859d-43df-8063-4b7311a0faaa-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.949298 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.949272 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f90b4ba6-859d-43df-8063-4b7311a0faaa-metrics-server-audit-profiles\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: 
\"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.950192 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.950172 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f90b4ba6-859d-43df-8063-4b7311a0faaa-secret-metrics-server-client-certs\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.950414 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.950393 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f90b4ba6-859d-43df-8063-4b7311a0faaa-secret-metrics-server-tls\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.950872 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.950849 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f90b4ba6-859d-43df-8063-4b7311a0faaa-client-ca-bundle\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:40.959623 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:40.959601 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn85n\" (UniqueName: \"kubernetes.io/projected/f90b4ba6-859d-43df-8063-4b7311a0faaa-kube-api-access-gn85n\") pod \"metrics-server-5bfd5c857b-fhtvj\" (UID: \"f90b4ba6-859d-43df-8063-4b7311a0faaa\") " pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:41.105504 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.105422 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:04:41.512180 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.512149 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dd6c6bb55-p2wvg"] Apr 21 16:04:41.515327 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.515292 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.532987 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.532962 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dd6c6bb55-p2wvg"] Apr 21 16:04:41.654311 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.654278 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-service-ca\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.654487 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.654334 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-oauth-serving-cert\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.654487 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.654358 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-oauth-config\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.654487 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:04:41.654412 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-console-config\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.654487 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.654471 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8zsq\" (UniqueName: \"kubernetes.io/projected/4048a878-2e43-4099-8919-7a59e04e0dc0-kube-api-access-s8zsq\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.654696 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.654501 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-trusted-ca-bundle\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.654696 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.654525 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-serving-cert\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.755604 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.755566 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8zsq\" (UniqueName: \"kubernetes.io/projected/4048a878-2e43-4099-8919-7a59e04e0dc0-kube-api-access-s8zsq\") pod 
\"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.755765 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.755609 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-trusted-ca-bundle\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.755765 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.755640 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-serving-cert\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.755765 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.755696 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-service-ca\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.755765 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.755753 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-oauth-serving-cert\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.756020 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.755800 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-oauth-config\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.756020 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.755828 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-console-config\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.756628 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.756591 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-trusted-ca-bundle\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.756834 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.756810 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-console-config\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.757043 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.757020 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-oauth-serving-cert\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.758058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.757997 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-service-ca\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.758472 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.758450 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-serving-cert\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.758594 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.758571 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-oauth-config\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.766158 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.766093 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8zsq\" (UniqueName: \"kubernetes.io/projected/4048a878-2e43-4099-8919-7a59e04e0dc0-kube-api-access-s8zsq\") pod \"console-6dd6c6bb55-p2wvg\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:41.826002 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:41.825976 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:42.111488 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.111408 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 16:04:42.114735 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.114712 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.118414 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.118391 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 16:04:42.118528 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.118390 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 16:04:42.118691 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.118666 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6q6tav1q5175i\"" Apr 21 16:04:42.119073 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.119026 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 16:04:42.119073 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.119038 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 16:04:42.119465 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.119445 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 16:04:42.119635 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.119617 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-z42vl\"" Apr 21 16:04:42.119820 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.119665 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 16:04:42.119820 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.119674 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 16:04:42.119820 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.119698 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 16:04:42.120148 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.120123 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 16:04:42.120255 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.120244 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 16:04:42.120369 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.120334 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 16:04:42.121166 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.121149 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 16:04:42.135095 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.135059 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 16:04:42.261766 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.261730 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-config-out\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.261957 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.261800 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.261957 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.261929 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262071 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.261974 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262071 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262028 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262227 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262076 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262227 
ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262148 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-config\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262227 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262172 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262227 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262196 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262403 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262226 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262403 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262254 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262403 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262279 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262403 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262346 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262602 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262401 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8245\" (UniqueName: \"kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-kube-api-access-q8245\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262602 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262433 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262602 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262460 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262602 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262519 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.262602 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.262562 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.363626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.363554 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8245\" (UniqueName: \"kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-kube-api-access-q8245\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.363626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.363589 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.363626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.363607 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.363912 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.363725 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.363912 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.363802 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.363912 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.363831 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-config-out\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.363912 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.363864 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-thanos-sidecar-tls\") 
pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.364106 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.363916 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.364106 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.363947 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.364106 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.363979 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.364106 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.364010 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.364106 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.364062 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-config\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.364106 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.364086 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.364106 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.364108 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.364435 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.364144 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.364435 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.364304 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.364861 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.364559 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.367354 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.367103 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-config\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.367467 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.367361 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.367629 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.367604 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.367732 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.367663 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-web-config\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.367732 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.367701 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.367879 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.367733 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.367879 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.367740 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.367879 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.367745 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.368074 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.367968 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.369147 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.368491 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.369147 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.368624 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.369147 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.368864 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-config-out\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.369147 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.369043 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.369147 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.369110 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.369845 ip-10-0-142-158 kubenswrapper[2562]: 
I0421 16:04:42.369772 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.370394 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.370370 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-web-config\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.370659 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.370637 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.371021 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.371004 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.384967 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.384944 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8245\" (UniqueName: \"kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-kube-api-access-q8245\") pod \"prometheus-k8s-0\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.427125 
ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.427092 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:42.432480 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.432460 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68bfcc5c9f-s2l52" Apr 21 16:04:42.432566 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.432542 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68bfcc5c9f-s2l52" Apr 21 16:04:42.437753 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:42.437735 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68bfcc5c9f-s2l52" Apr 21 16:04:43.439871 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:43.439845 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68bfcc5c9f-s2l52" Apr 21 16:04:43.769742 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:43.769703 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72dfee7f_8b16_43e2_860b_0ef8b4a63261.slice/crio-85d9cfc3f44a70706f6a8865adc3634a36bc2966e2724e682773015155467c2e WatchSource:0}: Error finding container 85d9cfc3f44a70706f6a8865adc3634a36bc2966e2724e682773015155467c2e: Status 404 returned error can't find the container with id 85d9cfc3f44a70706f6a8865adc3634a36bc2966e2724e682773015155467c2e Apr 21 16:04:43.984241 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:43.983922 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 16:04:43.986109 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:43.986061 2562 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f383819_ded1_46ea_b736_11b2ae87bba7.slice/crio-2d5c7d67423adef1e9e25c01e95b5c8b11b61b1cdf764ea5c0dd99252f47f670 WatchSource:0}: Error finding container 2d5c7d67423adef1e9e25c01e95b5c8b11b61b1cdf764ea5c0dd99252f47f670: Status 404 returned error can't find the container with id 2d5c7d67423adef1e9e25c01e95b5c8b11b61b1cdf764ea5c0dd99252f47f670 Apr 21 16:04:43.991536 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:43.991472 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 16:04:43.993962 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:43.993941 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8066f9f3_f9e7_4d74_9413_bf9a14e14d7f.slice/crio-f5226762ce68b9d3a1f1fcac6170e93cac0b9a17b0fb79be390b7c718e2b38c1 WatchSource:0}: Error finding container f5226762ce68b9d3a1f1fcac6170e93cac0b9a17b0fb79be390b7c718e2b38c1: Status 404 returned error can't find the container with id f5226762ce68b9d3a1f1fcac6170e93cac0b9a17b0fb79be390b7c718e2b38c1 Apr 21 16:04:44.040554 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.040533 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dd6c6bb55-p2wvg"] Apr 21 16:04:44.042405 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:44.042378 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4048a878_2e43_4099_8919_7a59e04e0dc0.slice/crio-fef6796b4a880440a2fe6bc3b9b4c6279a14fc0698239f0a67707076f3d94760 WatchSource:0}: Error finding container fef6796b4a880440a2fe6bc3b9b4c6279a14fc0698239f0a67707076f3d94760: Status 404 returned error can't find the container with id fef6796b4a880440a2fe6bc3b9b4c6279a14fc0698239f0a67707076f3d94760 Apr 21 16:04:44.220467 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.220406 2562 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p8j5k"] Apr 21 16:04:44.224010 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:44.223966 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50caee65_e2ab_4233_a2b5_e5ea4a951bed.slice/crio-c497e37a53c13a573523d8a2792c2c6a50eaa7638026b30988cda2d5f44581c3 WatchSource:0}: Error finding container c497e37a53c13a573523d8a2792c2c6a50eaa7638026b30988cda2d5f44581c3: Status 404 returned error can't find the container with id c497e37a53c13a573523d8a2792c2c6a50eaa7638026b30988cda2d5f44581c3 Apr 21 16:04:44.236819 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.236763 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5bfd5c857b-fhtvj"] Apr 21 16:04:44.241026 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:04:44.240873 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf90b4ba6_859d_43df_8063_4b7311a0faaa.slice/crio-f641b64fddbef345c52cffbf394c34bfcc16bb3f804654675dd8dd8529a57926 WatchSource:0}: Error finding container f641b64fddbef345c52cffbf394c34bfcc16bb3f804654675dd8dd8529a57926: Status 404 returned error can't find the container with id f641b64fddbef345c52cffbf394c34bfcc16bb3f804654675dd8dd8529a57926 Apr 21 16:04:44.440804 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.440695 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p8j5k" event={"ID":"50caee65-e2ab-4233-a2b5-e5ea4a951bed","Type":"ContainerStarted","Data":"c497e37a53c13a573523d8a2792c2c6a50eaa7638026b30988cda2d5f44581c3"} Apr 21 16:04:44.442857 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.442810 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-sstpk" 
event={"ID":"05d9b5c2-543c-4bc0-a92a-c8433467bc7a","Type":"ContainerStarted","Data":"6ba9b63136ecf44bde9dc1c1d103e9a0fe0ec4177c6110ba2a612d3f25d9917c"} Apr 21 16:04:44.443254 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.443234 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-sstpk" Apr 21 16:04:44.444821 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.444777 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dd6c6bb55-p2wvg" event={"ID":"4048a878-2e43-4099-8919-7a59e04e0dc0","Type":"ContainerStarted","Data":"fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17"} Apr 21 16:04:44.445221 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.444829 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dd6c6bb55-p2wvg" event={"ID":"4048a878-2e43-4099-8919-7a59e04e0dc0","Type":"ContainerStarted","Data":"fef6796b4a880440a2fe6bc3b9b4c6279a14fc0698239f0a67707076f3d94760"} Apr 21 16:04:44.446446 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.446425 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerStarted","Data":"2d5c7d67423adef1e9e25c01e95b5c8b11b61b1cdf764ea5c0dd99252f47f670"} Apr 21 16:04:44.447627 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.447603 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerStarted","Data":"f5226762ce68b9d3a1f1fcac6170e93cac0b9a17b0fb79be390b7c718e2b38c1"} Apr 21 16:04:44.448740 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.448715 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" 
event={"ID":"f90b4ba6-859d-43df-8063-4b7311a0faaa","Type":"ContainerStarted","Data":"f641b64fddbef345c52cffbf394c34bfcc16bb3f804654675dd8dd8529a57926"} Apr 21 16:04:44.450293 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.450178 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jbm6m" event={"ID":"72dfee7f-8b16-43e2-860b-0ef8b4a63261","Type":"ContainerStarted","Data":"85d9cfc3f44a70706f6a8865adc3634a36bc2966e2724e682773015155467c2e"} Apr 21 16:04:44.458609 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.458586 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-sstpk" Apr 21 16:04:44.463994 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.463948 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-sstpk" podStartSLOduration=2.09832553 podStartE2EDuration="18.463933925s" podCreationTimestamp="2026-04-21 16:04:26 +0000 UTC" firstStartedPulling="2026-04-21 16:04:27.472924404 +0000 UTC m=+159.217862187" lastFinishedPulling="2026-04-21 16:04:43.838532782 +0000 UTC m=+175.583470582" observedRunningTime="2026-04-21 16:04:44.4619064 +0000 UTC m=+176.206844205" watchObservedRunningTime="2026-04-21 16:04:44.463933925 +0000 UTC m=+176.208871731" Apr 21 16:04:44.516356 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:44.516061 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dd6c6bb55-p2wvg" podStartSLOduration=3.516041525 podStartE2EDuration="3.516041525s" podCreationTimestamp="2026-04-21 16:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:04:44.514602287 +0000 UTC m=+176.259540090" watchObservedRunningTime="2026-04-21 16:04:44.516041525 +0000 UTC m=+176.260979333" Apr 21 16:04:45.456956 ip-10-0-142-158 kubenswrapper[2562]: I0421 
16:04:45.456887 2562 generic.go:358] "Generic (PLEG): container finished" podID="72dfee7f-8b16-43e2-860b-0ef8b4a63261" containerID="0aa0c76d0b25c74b3d54fed50a0d15c0d308a5c4c4368658bd4d9393dddad67c" exitCode=0 Apr 21 16:04:45.457651 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:45.457213 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jbm6m" event={"ID":"72dfee7f-8b16-43e2-860b-0ef8b4a63261","Type":"ContainerDied","Data":"0aa0c76d0b25c74b3d54fed50a0d15c0d308a5c4c4368658bd4d9393dddad67c"} Apr 21 16:04:46.462862 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:46.462820 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jbm6m" event={"ID":"72dfee7f-8b16-43e2-860b-0ef8b4a63261","Type":"ContainerStarted","Data":"7d20c85a9ad68c03e7ac13ee8e18897009c6d05932f8d67f5f992fb5f5bcaf28"} Apr 21 16:04:47.467843 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:47.467809 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerID="a17df501fc0af1c21c4f6bff33b0dac7b269e8d65249a4c9e814c982564f3a94" exitCode=0 Apr 21 16:04:47.468220 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:47.467891 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerDied","Data":"a17df501fc0af1c21c4f6bff33b0dac7b269e8d65249a4c9e814c982564f3a94"} Apr 21 16:04:47.469667 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:47.469560 2562 generic.go:358] "Generic (PLEG): container finished" podID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerID="ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c" exitCode=0 Apr 21 16:04:47.469667 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:47.469648 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerDied","Data":"ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c"} Apr 21 16:04:47.471735 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:47.471713 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" event={"ID":"f90b4ba6-859d-43df-8063-4b7311a0faaa","Type":"ContainerStarted","Data":"e35cc52bedfe782a16a6f92dc2db839aeb6a4dcbcd0d9eb5e3a3ff16ce06eb26"} Apr 21 16:04:47.473983 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:47.473896 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jbm6m" event={"ID":"72dfee7f-8b16-43e2-860b-0ef8b4a63261","Type":"ContainerStarted","Data":"b92ed4bdad706fa928ea2b35f06805027ac29164ff0b84ed499b8078a0e51ffb"} Apr 21 16:04:47.600286 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:47.599563 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" podStartSLOduration=4.65583837 podStartE2EDuration="7.599544051s" podCreationTimestamp="2026-04-21 16:04:40 +0000 UTC" firstStartedPulling="2026-04-21 16:04:44.243067539 +0000 UTC m=+175.988005323" lastFinishedPulling="2026-04-21 16:04:47.186773207 +0000 UTC m=+178.931711004" observedRunningTime="2026-04-21 16:04:47.598639527 +0000 UTC m=+179.343577350" watchObservedRunningTime="2026-04-21 16:04:47.599544051 +0000 UTC m=+179.344481855" Apr 21 16:04:47.631710 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:47.631664 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jbm6m" podStartSLOduration=10.912182201 podStartE2EDuration="11.631647683s" podCreationTimestamp="2026-04-21 16:04:36 +0000 UTC" firstStartedPulling="2026-04-21 16:04:43.776737294 +0000 UTC m=+175.521675080" lastFinishedPulling="2026-04-21 16:04:44.496202766 +0000 UTC m=+176.241140562" 
observedRunningTime="2026-04-21 16:04:47.63044832 +0000 UTC m=+179.375386127" watchObservedRunningTime="2026-04-21 16:04:47.631647683 +0000 UTC m=+179.376585488" Apr 21 16:04:48.482994 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:48.482689 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p8j5k" event={"ID":"50caee65-e2ab-4233-a2b5-e5ea4a951bed","Type":"ContainerStarted","Data":"00c62468b12c8d50e63c7060a8b80032c280209ea115a72828b5c8dc29e9b843"} Apr 21 16:04:48.482994 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:48.482734 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p8j5k" event={"ID":"50caee65-e2ab-4233-a2b5-e5ea4a951bed","Type":"ContainerStarted","Data":"4c0915bdbc8c1c0995bd836bbe6b1d5853464e401c1b3b26a42c37938d35f176"} Apr 21 16:04:48.483685 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:48.483277 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p8j5k" Apr 21 16:04:49.393346 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:49.393292 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-d5497b59c-rnrbh" Apr 21 16:04:49.423915 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:49.423860 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p8j5k" podStartSLOduration=144.458681948 podStartE2EDuration="2m27.423844768s" podCreationTimestamp="2026-04-21 16:02:22 +0000 UTC" firstStartedPulling="2026-04-21 16:04:44.227154608 +0000 UTC m=+175.972092391" lastFinishedPulling="2026-04-21 16:04:47.192317428 +0000 UTC m=+178.937255211" observedRunningTime="2026-04-21 16:04:48.503163518 +0000 UTC m=+180.248101324" watchObservedRunningTime="2026-04-21 16:04:49.423844768 +0000 UTC m=+181.168782572" Apr 21 16:04:51.501213 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:51.501171 2562 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerStarted","Data":"ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd"} Apr 21 16:04:51.826557 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:51.826173 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:51.826557 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:51.826211 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:51.834590 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:51.834567 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:52.508187 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:52.508148 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerStarted","Data":"af8c2e47412bd1c3ecb45ed728d1c5a52f53f87f76d50cc410a6aef42574ad14"} Apr 21 16:04:52.508686 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:52.508195 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerStarted","Data":"ff6d3335c288e65953e62a25e761818fbbcab1e9a5a721517080a7943255e266"} Apr 21 16:04:52.512206 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:52.512172 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerStarted","Data":"c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034"} Apr 21 16:04:52.512436 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:52.512215 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerStarted","Data":"79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578"} Apr 21 16:04:52.512436 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:52.512232 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerStarted","Data":"053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321"} Apr 21 16:04:52.512436 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:52.512243 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerStarted","Data":"8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45"} Apr 21 16:04:52.517701 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:52.517517 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:04:52.570513 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:52.570485 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68bfcc5c9f-s2l52"] Apr 21 16:04:54.521156 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:54.521081 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerStarted","Data":"2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c"} Apr 21 16:04:54.548615 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:54.548580 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=7.288059839 podStartE2EDuration="17.548566584s" podCreationTimestamp="2026-04-21 16:04:37 +0000 UTC" firstStartedPulling="2026-04-21 16:04:43.996327283 +0000 UTC 
m=+175.741265069" lastFinishedPulling="2026-04-21 16:04:54.256834015 +0000 UTC m=+186.001771814" observedRunningTime="2026-04-21 16:04:54.548109995 +0000 UTC m=+186.293047800" watchObservedRunningTime="2026-04-21 16:04:54.548566584 +0000 UTC m=+186.293504387" Apr 21 16:04:55.527045 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:55.527007 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerStarted","Data":"24c1eac3769c78bb99984761a98087b4221c6d076b373a0b9c44912c3bcb2c27"} Apr 21 16:04:55.527466 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:55.527054 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerStarted","Data":"e50fa9f11edb7346b20780424d6c44a3288391773856f7734a158ed09a8c8218"} Apr 21 16:04:55.527466 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:55.527068 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerStarted","Data":"7fbf74cfa42fa3552749fc15387d46a7d8543d4c04bc446b24271d5dc58dc0c7"} Apr 21 16:04:55.527466 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:55.527080 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerStarted","Data":"72ace214cd111c73b01914be1650b2f2c9a047b8eb24f5532e0fbb7ab783d291"} Apr 21 16:04:55.563180 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:55.563126 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.933528948 podStartE2EDuration="13.563107661s" podCreationTimestamp="2026-04-21 16:04:42 +0000 UTC" firstStartedPulling="2026-04-21 16:04:43.989682999 +0000 UTC m=+175.734620791" 
lastFinishedPulling="2026-04-21 16:04:54.61926172 +0000 UTC m=+186.364199504" observedRunningTime="2026-04-21 16:04:55.560960767 +0000 UTC m=+187.305898594" watchObservedRunningTime="2026-04-21 16:04:55.563107661 +0000 UTC m=+187.308045467" Apr 21 16:04:57.427269 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:57.427240 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:04:59.495927 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:04:59.495897 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p8j5k" Apr 21 16:05:01.106509 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:01.106480 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:05:01.106509 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:01.106518 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:05:07.560291 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:07.560257 2562 generic.go:358] "Generic (PLEG): container finished" podID="c5d3a65e-5e28-4860-a01a-277b576a947b" containerID="54dcee25c6fd6b812d1cdace6e1039ab26e4dfd2c44ae7d8773ddd422e909600" exitCode=0 Apr 21 16:05:07.560688 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:07.560331 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7" event={"ID":"c5d3a65e-5e28-4860-a01a-277b576a947b","Type":"ContainerDied","Data":"54dcee25c6fd6b812d1cdace6e1039ab26e4dfd2c44ae7d8773ddd422e909600"} Apr 21 16:05:07.560688 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:07.560617 2562 scope.go:117] "RemoveContainer" containerID="54dcee25c6fd6b812d1cdace6e1039ab26e4dfd2c44ae7d8773ddd422e909600" Apr 21 16:05:08.565205 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:05:08.565169 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-994j7" event={"ID":"c5d3a65e-5e28-4860-a01a-277b576a947b","Type":"ContainerStarted","Data":"0b85fbfdfc517360b0d571cc31a14db1c6c972bccffdfbd43561b1ad232dd9fd"} Apr 21 16:05:17.598680 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:17.598622 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68bfcc5c9f-s2l52" podUID="29a2cb07-76ea-41df-a7e6-f42b090d7d77" containerName="console" containerID="cri-o://ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811" gracePeriod=15 Apr 21 16:05:17.881887 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:17.868561 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68bfcc5c9f-s2l52_29a2cb07-76ea-41df-a7e6-f42b090d7d77/console/0.log" Apr 21 16:05:17.881887 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:17.868652 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68bfcc5c9f-s2l52" Apr 21 16:05:18.002105 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.002055 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-oauth-serving-cert\") pod \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " Apr 21 16:05:18.002300 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.002130 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88mt8\" (UniqueName: \"kubernetes.io/projected/29a2cb07-76ea-41df-a7e6-f42b090d7d77-kube-api-access-88mt8\") pod \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " Apr 21 16:05:18.002300 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.002196 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-serving-cert\") pod \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " Apr 21 16:05:18.002300 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.002240 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-oauth-config\") pod \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " Apr 21 16:05:18.002300 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.002271 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-service-ca\") pod \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " Apr 21 16:05:18.002487 
ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.002307 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-trusted-ca-bundle\") pod \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " Apr 21 16:05:18.002487 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.002334 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-config\") pod \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\" (UID: \"29a2cb07-76ea-41df-a7e6-f42b090d7d77\") " Apr 21 16:05:18.002999 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.002967 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-config" (OuterVolumeSpecName: "console-config") pod "29a2cb07-76ea-41df-a7e6-f42b090d7d77" (UID: "29a2cb07-76ea-41df-a7e6-f42b090d7d77"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:05:18.003362 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.003338 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "29a2cb07-76ea-41df-a7e6-f42b090d7d77" (UID: "29a2cb07-76ea-41df-a7e6-f42b090d7d77"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:05:18.006996 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.003978 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "29a2cb07-76ea-41df-a7e6-f42b090d7d77" (UID: "29a2cb07-76ea-41df-a7e6-f42b090d7d77"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:05:18.006996 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.003987 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-service-ca" (OuterVolumeSpecName: "service-ca") pod "29a2cb07-76ea-41df-a7e6-f42b090d7d77" (UID: "29a2cb07-76ea-41df-a7e6-f42b090d7d77"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:05:18.007515 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.007312 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "29a2cb07-76ea-41df-a7e6-f42b090d7d77" (UID: "29a2cb07-76ea-41df-a7e6-f42b090d7d77"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:18.008034 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.007994 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "29a2cb07-76ea-41df-a7e6-f42b090d7d77" (UID: "29a2cb07-76ea-41df-a7e6-f42b090d7d77"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:18.009753 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.009728 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a2cb07-76ea-41df-a7e6-f42b090d7d77-kube-api-access-88mt8" (OuterVolumeSpecName: "kube-api-access-88mt8") pod "29a2cb07-76ea-41df-a7e6-f42b090d7d77" (UID: "29a2cb07-76ea-41df-a7e6-f42b090d7d77"). InnerVolumeSpecName "kube-api-access-88mt8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:05:18.103571 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.103490 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-88mt8\" (UniqueName: \"kubernetes.io/projected/29a2cb07-76ea-41df-a7e6-f42b090d7d77-kube-api-access-88mt8\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:18.103571 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.103519 2562 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-serving-cert\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:18.103571 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.103528 2562 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-oauth-config\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:18.103571 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.103538 2562 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-service-ca\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:18.103571 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.103547 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-trusted-ca-bundle\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:18.103571 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.103555 2562 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-console-config\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:18.103571 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.103564 2562 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/29a2cb07-76ea-41df-a7e6-f42b090d7d77-oauth-serving-cert\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:18.596732 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.596705 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68bfcc5c9f-s2l52_29a2cb07-76ea-41df-a7e6-f42b090d7d77/console/0.log" Apr 21 16:05:18.596917 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.596748 2562 generic.go:358] "Generic (PLEG): container finished" podID="29a2cb07-76ea-41df-a7e6-f42b090d7d77" containerID="ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811" exitCode=2 Apr 21 16:05:18.596917 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.596845 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68bfcc5c9f-s2l52" event={"ID":"29a2cb07-76ea-41df-a7e6-f42b090d7d77","Type":"ContainerDied","Data":"ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811"} Apr 21 16:05:18.596917 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.596862 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68bfcc5c9f-s2l52" Apr 21 16:05:18.596917 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.596887 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68bfcc5c9f-s2l52" event={"ID":"29a2cb07-76ea-41df-a7e6-f42b090d7d77","Type":"ContainerDied","Data":"3f08b0f2d1e3488d2292ebf4a10c2aefb80f9d33e1536a8ecc5894c3c696c37c"} Apr 21 16:05:18.596917 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.596906 2562 scope.go:117] "RemoveContainer" containerID="ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811" Apr 21 16:05:18.605652 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.605489 2562 scope.go:117] "RemoveContainer" containerID="ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811" Apr 21 16:05:18.605896 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:05:18.605810 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811\": container with ID starting with ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811 not found: ID does not exist" containerID="ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811" Apr 21 16:05:18.605896 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.605846 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811"} err="failed to get container status \"ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811\": rpc error: code = NotFound desc = could not find container \"ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811\": container with ID starting with ca3b2259ba21a6b1c314acdc270278918b5df86595146a665b8728b00a902811 not found: ID does not exist" Apr 21 16:05:18.621712 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.621686 2562 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68bfcc5c9f-s2l52"] Apr 21 16:05:18.631861 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.631839 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68bfcc5c9f-s2l52"] Apr 21 16:05:18.910159 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:18.910058 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a2cb07-76ea-41df-a7e6-f42b090d7d77" path="/var/lib/kubelet/pods/29a2cb07-76ea-41df-a7e6-f42b090d7d77/volumes" Apr 21 16:05:21.112917 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:21.112887 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:05:21.116757 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:21.116735 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5bfd5c857b-fhtvj" Apr 21 16:05:22.614150 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:22.614116 2562 generic.go:358] "Generic (PLEG): container finished" podID="be01fbb6-f686-41d2-aaa3-1abd80d94c27" containerID="ad0843344a58ad3a93df68e4e7c1623a9a7ac0db80e5a83ba97251d68f5d9091" exitCode=0 Apr 21 16:05:22.614547 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:22.614147 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r7jtg" event={"ID":"be01fbb6-f686-41d2-aaa3-1abd80d94c27","Type":"ContainerDied","Data":"ad0843344a58ad3a93df68e4e7c1623a9a7ac0db80e5a83ba97251d68f5d9091"} Apr 21 16:05:22.614547 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:22.614479 2562 scope.go:117] "RemoveContainer" containerID="ad0843344a58ad3a93df68e4e7c1623a9a7ac0db80e5a83ba97251d68f5d9091" Apr 21 16:05:23.618577 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:23.618545 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-r7jtg" event={"ID":"be01fbb6-f686-41d2-aaa3-1abd80d94c27","Type":"ContainerStarted","Data":"9ba23033b377f5ae66e729e763d7c374fb37cf7ec16ae2f06ea00220ccf4f2da"} Apr 21 16:05:42.427684 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:42.427628 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:05:42.447363 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:42.447338 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:05:42.690195 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:42.690121 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:05:56.214802 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.214755 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 16:05:56.215268 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.215167 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="alertmanager" containerID="cri-o://ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd" gracePeriod=120 Apr 21 16:05:56.215347 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.215247 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy-web" containerID="cri-o://053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321" gracePeriod=120 Apr 21 16:05:56.215401 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.215316 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="config-reloader" containerID="cri-o://8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45" gracePeriod=120 Apr 21 16:05:56.215401 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.215251 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy-metric" containerID="cri-o://c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034" gracePeriod=120 Apr 21 16:05:56.215401 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.215336 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy" containerID="cri-o://79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578" gracePeriod=120 Apr 21 16:05:56.216110 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.215556 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="prom-label-proxy" containerID="cri-o://2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c" gracePeriod=120 Apr 21 16:05:56.717721 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.717690 2562 generic.go:358] "Generic (PLEG): container finished" podID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerID="2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c" exitCode=0 Apr 21 16:05:56.717721 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.717714 2562 generic.go:358] "Generic (PLEG): container finished" podID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerID="79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578" exitCode=0 Apr 21 16:05:56.717721 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.717722 2562 generic.go:358] 
"Generic (PLEG): container finished" podID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerID="8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45" exitCode=0 Apr 21 16:05:56.717966 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.717729 2562 generic.go:358] "Generic (PLEG): container finished" podID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerID="ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd" exitCode=0 Apr 21 16:05:56.717966 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.717770 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerDied","Data":"2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c"} Apr 21 16:05:56.717966 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.717813 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerDied","Data":"79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578"} Apr 21 16:05:56.717966 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.717823 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerDied","Data":"8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45"} Apr 21 16:05:56.717966 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:56.717831 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerDied","Data":"ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd"} Apr 21 16:05:57.456810 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.456769 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.560539 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560471 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-main-db\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.560539 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560514 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-volume\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.560738 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560551 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-cluster-tls-config\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.560738 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560583 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.560738 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560621 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-web\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: 
\"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.560738 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560725 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.560985 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560774 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-web-config\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.560985 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560827 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-trusted-ca-bundle\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.560985 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560832 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:57.560985 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560909 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-metrics-client-ca\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.560985 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560937 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29hzz\" (UniqueName: \"kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-kube-api-access-29hzz\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.560985 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.560972 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-out\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.561282 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.561003 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-tls-assets\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.561282 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.561028 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-main-tls\") pod \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\" (UID: \"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f\") " Apr 21 16:05:57.561390 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:05:57.561324 2562 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-main-db\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.562226 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.562195 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:05:57.562350 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.562223 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:05:57.563550 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.563524 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-volume" (OuterVolumeSpecName: "config-volume") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:57.564109 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.564071 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:57.564211 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.564188 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:57.564298 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.564280 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:57.564359 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.564330 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-out" (OuterVolumeSpecName: "config-out") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:57.564628 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.564593 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:57.564721 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.564636 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-kube-api-access-29hzz" (OuterVolumeSpecName: "kube-api-access-29hzz") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "kube-api-access-29hzz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:05:57.565086 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.565057 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:05:57.568723 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.568696 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:57.574999 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.574978 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-web-config" (OuterVolumeSpecName: "web-config") pod "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" (UID: "8066f9f3-f9e7-4d74-9413-bf9a14e14d7f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:57.661814 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661764 2562 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-metrics-client-ca\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.661814 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661815 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-29hzz\" (UniqueName: \"kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-kube-api-access-29hzz\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.661928 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661825 2562 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-out\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.661928 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661834 2562 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-tls-assets\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.661928 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661845 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-main-tls\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.661928 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661854 2562 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-config-volume\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.661928 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661862 2562 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-cluster-tls-config\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.661928 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661871 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.661928 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661880 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.661928 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661890 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.661928 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661900 2562 reconciler_common.go:299] "Volume detached for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-web-config\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.661928 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.661909 2562 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:05:57.723746 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.723716 2562 generic.go:358] "Generic (PLEG): container finished" podID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerID="c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034" exitCode=0 Apr 21 16:05:57.723746 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.723741 2562 generic.go:358] "Generic (PLEG): container finished" podID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerID="053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321" exitCode=0 Apr 21 16:05:57.723909 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.723762 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerDied","Data":"c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034"} Apr 21 16:05:57.723909 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.723800 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerDied","Data":"053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321"} Apr 21 16:05:57.723909 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.723811 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"8066f9f3-f9e7-4d74-9413-bf9a14e14d7f","Type":"ContainerDied","Data":"f5226762ce68b9d3a1f1fcac6170e93cac0b9a17b0fb79be390b7c718e2b38c1"} Apr 21 16:05:57.723909 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.723826 2562 scope.go:117] "RemoveContainer" containerID="2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c" Apr 21 16:05:57.723909 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.723835 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.731381 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.731361 2562 scope.go:117] "RemoveContainer" containerID="c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034" Apr 21 16:05:57.737998 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.737982 2562 scope.go:117] "RemoveContainer" containerID="79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578" Apr 21 16:05:57.745511 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.745494 2562 scope.go:117] "RemoveContainer" containerID="053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321" Apr 21 16:05:57.751410 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.751393 2562 scope.go:117] "RemoveContainer" containerID="8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45" Apr 21 16:05:57.754577 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.754554 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 16:05:57.758078 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.758062 2562 scope.go:117] "RemoveContainer" containerID="ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd" Apr 21 16:05:57.766521 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.766500 2562 scope.go:117] "RemoveContainer" containerID="ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c" Apr 21 16:05:57.769602 ip-10-0-142-158 kubenswrapper[2562]: I0421 
16:05:57.769584 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 16:05:57.773114 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.773099 2562 scope.go:117] "RemoveContainer" containerID="2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c" Apr 21 16:05:57.773368 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:05:57.773343 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c\": container with ID starting with 2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c not found: ID does not exist" containerID="2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c" Apr 21 16:05:57.773422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.773374 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c"} err="failed to get container status \"2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c\": rpc error: code = NotFound desc = could not find container \"2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c\": container with ID starting with 2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c not found: ID does not exist" Apr 21 16:05:57.773422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.773392 2562 scope.go:117] "RemoveContainer" containerID="c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034" Apr 21 16:05:57.773597 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:05:57.773583 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034\": container with ID starting with c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034 not found: ID 
does not exist" containerID="c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034" Apr 21 16:05:57.773639 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.773601 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034"} err="failed to get container status \"c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034\": rpc error: code = NotFound desc = could not find container \"c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034\": container with ID starting with c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034 not found: ID does not exist" Apr 21 16:05:57.773639 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.773623 2562 scope.go:117] "RemoveContainer" containerID="79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578" Apr 21 16:05:57.773821 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:05:57.773805 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578\": container with ID starting with 79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578 not found: ID does not exist" containerID="79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578" Apr 21 16:05:57.773880 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.773824 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578"} err="failed to get container status \"79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578\": rpc error: code = NotFound desc = could not find container \"79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578\": container with ID starting with 79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578 not found: ID does not 
exist" Apr 21 16:05:57.773880 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.773835 2562 scope.go:117] "RemoveContainer" containerID="053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321" Apr 21 16:05:57.774075 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:05:57.774058 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321\": container with ID starting with 053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321 not found: ID does not exist" containerID="053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321" Apr 21 16:05:57.774120 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.774082 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321"} err="failed to get container status \"053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321\": rpc error: code = NotFound desc = could not find container \"053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321\": container with ID starting with 053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321 not found: ID does not exist" Apr 21 16:05:57.774120 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.774104 2562 scope.go:117] "RemoveContainer" containerID="8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45" Apr 21 16:05:57.774337 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:05:57.774320 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45\": container with ID starting with 8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45 not found: ID does not exist" containerID="8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45" Apr 21 
16:05:57.774385 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.774341 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45"} err="failed to get container status \"8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45\": rpc error: code = NotFound desc = could not find container \"8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45\": container with ID starting with 8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45 not found: ID does not exist" Apr 21 16:05:57.774385 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.774355 2562 scope.go:117] "RemoveContainer" containerID="ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd" Apr 21 16:05:57.774570 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:05:57.774553 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd\": container with ID starting with ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd not found: ID does not exist" containerID="ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd" Apr 21 16:05:57.774626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.774587 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd"} err="failed to get container status \"ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd\": rpc error: code = NotFound desc = could not find container \"ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd\": container with ID starting with ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd not found: ID does not exist" Apr 21 16:05:57.774626 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.774608 2562 scope.go:117] 
"RemoveContainer" containerID="ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c" Apr 21 16:05:57.775055 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:05:57.775035 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c\": container with ID starting with ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c not found: ID does not exist" containerID="ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c" Apr 21 16:05:57.775133 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.775060 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c"} err="failed to get container status \"ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c\": rpc error: code = NotFound desc = could not find container \"ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c\": container with ID starting with ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c not found: ID does not exist" Apr 21 16:05:57.775133 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.775080 2562 scope.go:117] "RemoveContainer" containerID="2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c" Apr 21 16:05:57.775297 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.775279 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c"} err="failed to get container status \"2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c\": rpc error: code = NotFound desc = could not find container \"2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c\": container with ID starting with 2bbad2d5fddd40e01b842c4d32aad39e76adb4e55614e67bea39f8207abcea5c not found: ID does not 
exist" Apr 21 16:05:57.775348 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.775298 2562 scope.go:117] "RemoveContainer" containerID="c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034" Apr 21 16:05:57.775504 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.775486 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034"} err="failed to get container status \"c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034\": rpc error: code = NotFound desc = could not find container \"c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034\": container with ID starting with c9661a358557cf1c87e59ff31da83e55f21e739bee4bb1d7f24897b56d11f034 not found: ID does not exist" Apr 21 16:05:57.775544 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.775505 2562 scope.go:117] "RemoveContainer" containerID="79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578" Apr 21 16:05:57.775730 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.775712 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578"} err="failed to get container status \"79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578\": rpc error: code = NotFound desc = could not find container \"79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578\": container with ID starting with 79b3f42ed0c4816cddff516f1f13fced3c3bd0fa9b2269ab693387207cc8f578 not found: ID does not exist" Apr 21 16:05:57.775814 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.775732 2562 scope.go:117] "RemoveContainer" containerID="053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321" Apr 21 16:05:57.775936 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.775917 2562 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321"} err="failed to get container status \"053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321\": rpc error: code = NotFound desc = could not find container \"053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321\": container with ID starting with 053c56b30f55538de9ce031daebd9a6a83116615d7c96ca3a1eadec806056321 not found: ID does not exist" Apr 21 16:05:57.775986 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.775936 2562 scope.go:117] "RemoveContainer" containerID="8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45" Apr 21 16:05:57.776116 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.776099 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45"} err="failed to get container status \"8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45\": rpc error: code = NotFound desc = could not find container \"8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45\": container with ID starting with 8f4547799688b2883ca54a64f45135402a53531fbee2471b40b45819cf446d45 not found: ID does not exist" Apr 21 16:05:57.776159 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.776116 2562 scope.go:117] "RemoveContainer" containerID="ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd" Apr 21 16:05:57.776328 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.776310 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd"} err="failed to get container status \"ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd\": rpc error: code = NotFound desc = could not find container \"ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd\": container with ID starting with 
ae29a028859930675c9c62b8a72e96b61d0902475649f357e19760c2c6a127fd not found: ID does not exist" Apr 21 16:05:57.776377 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.776329 2562 scope.go:117] "RemoveContainer" containerID="ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c" Apr 21 16:05:57.776549 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.776532 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c"} err="failed to get container status \"ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c\": rpc error: code = NotFound desc = could not find container \"ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c\": container with ID starting with ae97bf6c319556a2a6318f4a56461812d275b9f92e6230312a6cadd52bd55a8c not found: ID does not exist" Apr 21 16:05:57.803038 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803012 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 16:05:57.803289 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803278 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy-web" Apr 21 16:05:57.803330 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803291 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy-web" Apr 21 16:05:57.803330 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803301 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29a2cb07-76ea-41df-a7e6-f42b090d7d77" containerName="console" Apr 21 16:05:57.803330 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803308 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a2cb07-76ea-41df-a7e6-f42b090d7d77" containerName="console" Apr 21 16:05:57.803330 
ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803315 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy-metric" Apr 21 16:05:57.803330 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803320 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy-metric" Apr 21 16:05:57.803330 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803330 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="config-reloader" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803336 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="config-reloader" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803344 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803349 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803356 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="prom-label-proxy" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803362 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="prom-label-proxy" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803367 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" 
containerName="alertmanager" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803371 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="alertmanager" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803380 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="init-config-reloader" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803385 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="init-config-reloader" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803426 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803435 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy-metric" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803441 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="kube-rbac-proxy-web" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803448 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="config-reloader" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803454 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="29a2cb07-76ea-41df-a7e6-f42b090d7d77" containerName="console" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803460 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" 
containerName="prom-label-proxy" Apr 21 16:05:57.803503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.803466 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" containerName="alertmanager" Apr 21 16:05:57.808057 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.808041 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.812102 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.812052 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 16:05:57.813800 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.813575 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 16:05:57.813800 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.813610 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 16:05:57.813953 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.813901 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 16:05:57.813953 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.813929 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6hzrx\"" Apr 21 16:05:57.813953 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.813935 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 16:05:57.814111 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.813999 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 16:05:57.814222 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.814206 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 16:05:57.814303 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.814285 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 16:05:57.819507 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.819490 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 16:05:57.825032 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.825012 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 16:05:57.864669 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.864649 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e59172a8-fd83-4930-9415-2cc933f5953b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.864766 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.864676 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.864766 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.864693 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.864766 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.864716 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e59172a8-fd83-4930-9415-2cc933f5953b-config-out\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.864913 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.864761 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e59172a8-fd83-4930-9415-2cc933f5953b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.864913 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.864831 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw8n9\" (UniqueName: \"kubernetes.io/projected/e59172a8-fd83-4930-9415-2cc933f5953b-kube-api-access-jw8n9\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.864913 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.864895 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
21 16:05:57.865004 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.864913 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e59172a8-fd83-4930-9415-2cc933f5953b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.865004 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.864937 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-config-volume\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.865004 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.864968 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.865004 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.864987 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.865131 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.865003 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e59172a8-fd83-4930-9415-2cc933f5953b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.865131 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.865036 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-web-config\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966006 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.965981 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966117 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966010 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e59172a8-fd83-4930-9415-2cc933f5953b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966117 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966036 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-config-volume\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966193 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966161 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966193 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966189 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966266 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966208 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e59172a8-fd83-4930-9415-2cc933f5953b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966266 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966241 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-web-config\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966367 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966306 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e59172a8-fd83-4930-9415-2cc933f5953b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966367 
ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966334 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966367 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966357 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966516 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966395 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e59172a8-fd83-4930-9415-2cc933f5953b-config-out\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966516 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966425 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e59172a8-fd83-4930-9415-2cc933f5953b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.966516 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.966449 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw8n9\" (UniqueName: \"kubernetes.io/projected/e59172a8-fd83-4930-9415-2cc933f5953b-kube-api-access-jw8n9\") pod \"alertmanager-main-0\" (UID: 
\"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.967042 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.967020 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e59172a8-fd83-4930-9415-2cc933f5953b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.967505 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.967056 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e59172a8-fd83-4930-9415-2cc933f5953b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.967624 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.967499 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e59172a8-fd83-4930-9415-2cc933f5953b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.969059 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.969017 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-config-volume\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.969160 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.969096 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.969326 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.969299 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.969548 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.969526 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.969653 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.969634 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e59172a8-fd83-4930-9415-2cc933f5953b-config-out\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.969752 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.969728 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.970108 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.970079 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e59172a8-fd83-4930-9415-2cc933f5953b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.970417 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.970398 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.970532 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.970514 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e59172a8-fd83-4930-9415-2cc933f5953b-web-config\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:57.976595 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:57.976576 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw8n9\" (UniqueName: \"kubernetes.io/projected/e59172a8-fd83-4930-9415-2cc933f5953b-kube-api-access-jw8n9\") pod \"alertmanager-main-0\" (UID: \"e59172a8-fd83-4930-9415-2cc933f5953b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:58.117485 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:58.117423 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 16:05:58.240959 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:58.240930 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 16:05:58.243903 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:05:58.243869 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode59172a8_fd83_4930_9415_2cc933f5953b.slice/crio-8ff16ac42abaf8828b2cf44e8ba4c22477912b2dc24d00e9b70dddf9c7bcda37 WatchSource:0}: Error finding container 8ff16ac42abaf8828b2cf44e8ba4c22477912b2dc24d00e9b70dddf9c7bcda37: Status 404 returned error can't find the container with id 8ff16ac42abaf8828b2cf44e8ba4c22477912b2dc24d00e9b70dddf9c7bcda37 Apr 21 16:05:58.728654 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:58.728624 2562 generic.go:358] "Generic (PLEG): container finished" podID="e59172a8-fd83-4930-9415-2cc933f5953b" containerID="8a92fee2c583c7e7b9569cb38a3f23317e71924f0558fe8bbc66473bf0da20be" exitCode=0 Apr 21 16:05:58.729119 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:58.728712 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e59172a8-fd83-4930-9415-2cc933f5953b","Type":"ContainerDied","Data":"8a92fee2c583c7e7b9569cb38a3f23317e71924f0558fe8bbc66473bf0da20be"} Apr 21 16:05:58.729119 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:58.728744 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e59172a8-fd83-4930-9415-2cc933f5953b","Type":"ContainerStarted","Data":"8ff16ac42abaf8828b2cf44e8ba4c22477912b2dc24d00e9b70dddf9c7bcda37"} Apr 21 16:05:58.909818 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:58.909769 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8066f9f3-f9e7-4d74-9413-bf9a14e14d7f" 
path="/var/lib/kubelet/pods/8066f9f3-f9e7-4d74-9413-bf9a14e14d7f/volumes" Apr 21 16:05:59.735667 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:59.735637 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e59172a8-fd83-4930-9415-2cc933f5953b","Type":"ContainerStarted","Data":"a92c0833c48c1ccdd27ed2104e27392c232ef142027ea828f02be44dfeca20cc"} Apr 21 16:05:59.735667 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:59.735670 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e59172a8-fd83-4930-9415-2cc933f5953b","Type":"ContainerStarted","Data":"14eeec919485cd8445834e5e778034eb85c8fbb5cefc6cfd343c33b17da2ba39"} Apr 21 16:05:59.736082 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:59.735679 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e59172a8-fd83-4930-9415-2cc933f5953b","Type":"ContainerStarted","Data":"38c323bc844101e831633df58e8c7dcf230683012f5a3ac95fd089a6a8dbdecf"} Apr 21 16:05:59.736082 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:59.735690 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e59172a8-fd83-4930-9415-2cc933f5953b","Type":"ContainerStarted","Data":"172a9ebecd056402d7e92a14c6ea8f839f23fc05888913a9d951be7ee30c7b5b"} Apr 21 16:05:59.736082 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:59.735698 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e59172a8-fd83-4930-9415-2cc933f5953b","Type":"ContainerStarted","Data":"9c7b6200eb4b3009095b0e9e3e18b8b8bf9e816106abe5895f1f1b555868c1dc"} Apr 21 16:05:59.736082 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:59.735705 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e59172a8-fd83-4930-9415-2cc933f5953b","Type":"ContainerStarted","Data":"ec7af7d23d3542ec08e3a059e4ee17babfbfa55b013ba72e56cabd2d5fab43c8"} Apr 21 16:05:59.769456 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:05:59.769411 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.7693973339999998 podStartE2EDuration="2.769397334s" podCreationTimestamp="2026-04-21 16:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:05:59.766876647 +0000 UTC m=+251.511814451" watchObservedRunningTime="2026-04-21 16:05:59.769397334 +0000 UTC m=+251.514335188" Apr 21 16:06:00.545711 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.545680 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 16:06:00.546202 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.546120 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="prometheus" containerID="cri-o://ff6d3335c288e65953e62a25e761818fbbcab1e9a5a721517080a7943255e266" gracePeriod=600 Apr 21 16:06:00.546202 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.546162 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy-thanos" containerID="cri-o://24c1eac3769c78bb99984761a98087b4221c6d076b373a0b9c44912c3bcb2c27" gracePeriod=600 Apr 21 16:06:00.546346 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.546185 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy-web" 
containerID="cri-o://7fbf74cfa42fa3552749fc15387d46a7d8543d4c04bc446b24271d5dc58dc0c7" gracePeriod=600 Apr 21 16:06:00.546346 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.546146 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy" containerID="cri-o://e50fa9f11edb7346b20780424d6c44a3288391773856f7734a158ed09a8c8218" gracePeriod=600 Apr 21 16:06:00.546346 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.546155 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="thanos-sidecar" containerID="cri-o://72ace214cd111c73b01914be1650b2f2c9a047b8eb24f5532e0fbb7ab783d291" gracePeriod=600 Apr 21 16:06:00.546346 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.546167 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="config-reloader" containerID="cri-o://af8c2e47412bd1c3ecb45ed728d1c5a52f53f87f76d50cc410a6aef42574ad14" gracePeriod=600 Apr 21 16:06:00.688151 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.688120 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" (UID: \"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:06:00.690201 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.690173 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e022d7cd-e433-4f58-8b33-7c830d23f95c-metrics-certs\") pod \"network-metrics-daemon-rg8v9\" (UID: 
\"e022d7cd-e433-4f58-8b33-7c830d23f95c\") " pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:06:00.748850 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748819 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerID="24c1eac3769c78bb99984761a98087b4221c6d076b373a0b9c44912c3bcb2c27" exitCode=0 Apr 21 16:06:00.748850 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748847 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerID="e50fa9f11edb7346b20780424d6c44a3288391773856f7734a158ed09a8c8218" exitCode=0 Apr 21 16:06:00.748850 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748854 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerID="7fbf74cfa42fa3552749fc15387d46a7d8543d4c04bc446b24271d5dc58dc0c7" exitCode=0 Apr 21 16:06:00.749323 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748863 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerID="72ace214cd111c73b01914be1650b2f2c9a047b8eb24f5532e0fbb7ab783d291" exitCode=0 Apr 21 16:06:00.749323 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748885 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerID="af8c2e47412bd1c3ecb45ed728d1c5a52f53f87f76d50cc410a6aef42574ad14" exitCode=0 Apr 21 16:06:00.749323 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748894 2562 generic.go:358] "Generic (PLEG): container finished" podID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerID="ff6d3335c288e65953e62a25e761818fbbcab1e9a5a721517080a7943255e266" exitCode=0 Apr 21 16:06:00.749323 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748889 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerDied","Data":"24c1eac3769c78bb99984761a98087b4221c6d076b373a0b9c44912c3bcb2c27"} Apr 21 16:06:00.749323 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748920 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerDied","Data":"e50fa9f11edb7346b20780424d6c44a3288391773856f7734a158ed09a8c8218"} Apr 21 16:06:00.749323 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748933 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerDied","Data":"7fbf74cfa42fa3552749fc15387d46a7d8543d4c04bc446b24271d5dc58dc0c7"} Apr 21 16:06:00.749323 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748944 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerDied","Data":"72ace214cd111c73b01914be1650b2f2c9a047b8eb24f5532e0fbb7ab783d291"} Apr 21 16:06:00.749323 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748955 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerDied","Data":"af8c2e47412bd1c3ecb45ed728d1c5a52f53f87f76d50cc410a6aef42574ad14"} Apr 21 16:06:00.749323 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.748979 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerDied","Data":"ff6d3335c288e65953e62a25e761818fbbcab1e9a5a721517080a7943255e266"} Apr 21 16:06:00.782740 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.782720 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:00.889924 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.889845 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-rulefiles-0\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.889924 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.889909 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890133 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.889943 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-thanos-prometheus-http-client-file\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890133 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.889972 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-db\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890133 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890010 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-trusted-ca-bundle\") pod 
\"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890133 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890045 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-config-out\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890133 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890067 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-kubelet-serving-ca-bundle\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890133 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890105 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-tls-assets\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890148 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890175 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-config\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: 
\"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890198 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-web-config\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890229 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-kube-rbac-proxy\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890259 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-metrics-client-ca\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890289 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8245\" (UniqueName: \"kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-kube-api-access-q8245\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890318 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-serving-certs-ca-bundle\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 
16:06:00.890422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890361 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-tls\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890387 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-metrics-client-certs\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890425 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-grpc-tls\") pod \"8f383819-ded1-46ea-b736-11b2ae87bba7\" (UID: \"8f383819-ded1-46ea-b736-11b2ae87bba7\") " Apr 21 16:06:00.890901 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890844 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:06:00.890954 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.890935 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-metrics-client-ca\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.892940 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.891128 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:06:00.892940 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.891202 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:06:00.892940 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.891287 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:06:00.892940 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.892220 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:06:00.892940 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.892680 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:06:00.893697 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.893668 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-config-out" (OuterVolumeSpecName: "config-out") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:06:00.893697 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.893689 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:06:00.893913 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.893887 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-kube-api-access-q8245" (OuterVolumeSpecName: "kube-api-access-q8245") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "kube-api-access-q8245". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:06:00.894210 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.894167 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:06:00.894385 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.894332 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:06:00.894385 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.894344 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:06:00.894705 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.894669 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:06:00.895185 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.895161 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:06:00.895368 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.895345 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-config" (OuterVolumeSpecName: "config") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:06:00.895586 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.895564 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:06:00.895686 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.895666 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:06:00.905187 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.905166 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-web-config" (OuterVolumeSpecName: "web-config") pod "8f383819-ded1-46ea-b736-11b2ae87bba7" (UID: "8f383819-ded1-46ea-b736-11b2ae87bba7"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:06:00.910258 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.910238 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cz78x\"" Apr 21 16:06:00.917747 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.917731 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg8v9" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992529 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992565 2562 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-config\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992582 2562 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-web-config\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992596 2562 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-kube-rbac-proxy\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992609 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8245\" (UniqueName: \"kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-kube-api-access-q8245\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992623 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-serving-certs-ca-bundle\") on node 
\"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992639 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-tls\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992654 2562 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-metrics-client-certs\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992668 2562 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-grpc-tls\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992681 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992696 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992711 2562 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/8f383819-ded1-46ea-b736-11b2ae87bba7-thanos-prometheus-http-client-file\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992725 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-k8s-db\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992741 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-prometheus-trusted-ca-bundle\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992756 2562 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8f383819-ded1-46ea-b736-11b2ae87bba7-config-out\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992773 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f383819-ded1-46ea-b736-11b2ae87bba7-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:00.992838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:00.992806 2562 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8f383819-ded1-46ea-b736-11b2ae87bba7-tls-assets\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:01.047129 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.047103 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rg8v9"] Apr 21 16:06:01.049678 
ip-10-0-142-158 kubenswrapper[2562]: W0421 16:06:01.049655 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode022d7cd_e433_4f58_8b33_7c830d23f95c.slice/crio-e6b107a49e3d6b665f213ecd288ef4d1d333cb1fbf063820be5a84fe8bda9bf6 WatchSource:0}: Error finding container e6b107a49e3d6b665f213ecd288ef4d1d333cb1fbf063820be5a84fe8bda9bf6: Status 404 returned error can't find the container with id e6b107a49e3d6b665f213ecd288ef4d1d333cb1fbf063820be5a84fe8bda9bf6 Apr 21 16:06:01.755949 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.755910 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8f383819-ded1-46ea-b736-11b2ae87bba7","Type":"ContainerDied","Data":"2d5c7d67423adef1e9e25c01e95b5c8b11b61b1cdf764ea5c0dd99252f47f670"} Apr 21 16:06:01.756378 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.755955 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.756378 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.755972 2562 scope.go:117] "RemoveContainer" containerID="24c1eac3769c78bb99984761a98087b4221c6d076b373a0b9c44912c3bcb2c27" Apr 21 16:06:01.757356 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.757267 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rg8v9" event={"ID":"e022d7cd-e433-4f58-8b33-7c830d23f95c","Type":"ContainerStarted","Data":"e6b107a49e3d6b665f213ecd288ef4d1d333cb1fbf063820be5a84fe8bda9bf6"} Apr 21 16:06:01.779815 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.779755 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 16:06:01.784893 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.784873 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 16:06:01.820494 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820434 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 16:06:01.820730 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820716 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="config-reloader" Apr 21 16:06:01.820811 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820731 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="config-reloader" Apr 21 16:06:01.820811 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820747 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="prometheus" Apr 21 16:06:01.820811 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820755 2562 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="prometheus" Apr 21 16:06:01.820811 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820769 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy-thanos" Apr 21 16:06:01.820811 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820792 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy-thanos" Apr 21 16:06:01.820811 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820806 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="init-config-reloader" Apr 21 16:06:01.820811 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820812 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="init-config-reloader" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820818 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="thanos-sidecar" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820823 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="thanos-sidecar" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820829 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820834 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820845 2562 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy-web" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820850 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy-web" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820913 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="thanos-sidecar" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820923 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy-thanos" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820933 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy-web" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820945 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="kube-rbac-proxy" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820957 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="prometheus" Apr 21 16:06:01.821064 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.820967 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" containerName="config-reloader" Apr 21 16:06:01.825597 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.825580 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.829286 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.829266 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 16:06:01.830115 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.830003 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 16:06:01.830493 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.830471 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 16:06:01.830597 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.830520 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 16:06:01.830597 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.830581 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 16:06:01.830699 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.830685 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 16:06:01.830754 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.830707 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-z42vl\"" Apr 21 16:06:01.830960 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.830924 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6q6tav1q5175i\"" Apr 21 16:06:01.831061 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.831042 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 16:06:01.831709 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.831689 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 16:06:01.831823 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.831750 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 16:06:01.831905 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.831885 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 16:06:01.833814 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.833761 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 16:06:01.835998 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.835976 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 16:06:01.840760 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.840659 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 16:06:01.902242 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902210 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e491dc9e-c581-4b95-a000-22e7c59757a7-config-out\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902388 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902248 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902388 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902274 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902388 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902341 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e491dc9e-c581-4b95-a000-22e7c59757a7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902388 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902364 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902388 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902387 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902642 ip-10-0-142-158 kubenswrapper[2562]: I0421 
16:06:01.902412 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902642 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902447 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902642 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902471 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e491dc9e-c581-4b95-a000-22e7c59757a7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902642 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902495 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902642 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902520 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902642 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902550 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w474c\" (UniqueName: \"kubernetes.io/projected/e491dc9e-c581-4b95-a000-22e7c59757a7-kube-api-access-w474c\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902642 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902570 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902642 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902595 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902642 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902619 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-web-config\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.902642 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902642 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.903082 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902669 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-config\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.903082 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.902711 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:01.936496 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.936479 2562 scope.go:117] "RemoveContainer" containerID="e50fa9f11edb7346b20780424d6c44a3288391773856f7734a158ed09a8c8218" Apr 21 16:06:01.943509 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.943479 2562 scope.go:117] "RemoveContainer" containerID="7fbf74cfa42fa3552749fc15387d46a7d8543d4c04bc446b24271d5dc58dc0c7" Apr 21 16:06:01.950124 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.950106 2562 scope.go:117] "RemoveContainer" containerID="72ace214cd111c73b01914be1650b2f2c9a047b8eb24f5532e0fbb7ab783d291" Apr 21 16:06:01.956287 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.956271 2562 scope.go:117] "RemoveContainer" 
containerID="af8c2e47412bd1c3ecb45ed728d1c5a52f53f87f76d50cc410a6aef42574ad14" Apr 21 16:06:01.981701 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.981687 2562 scope.go:117] "RemoveContainer" containerID="ff6d3335c288e65953e62a25e761818fbbcab1e9a5a721517080a7943255e266" Apr 21 16:06:01.991424 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:01.991408 2562 scope.go:117] "RemoveContainer" containerID="a17df501fc0af1c21c4f6bff33b0dac7b269e8d65249a4c9e814c982564f3a94" Apr 21 16:06:02.003510 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003491 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e491dc9e-c581-4b95-a000-22e7c59757a7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.003579 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003516 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.003579 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003535 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.003579 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003554 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.003700 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003680 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.003753 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003720 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e491dc9e-c581-4b95-a000-22e7c59757a7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.003825 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003753 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.003825 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003801 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.003922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003831 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w474c\" (UniqueName: \"kubernetes.io/projected/e491dc9e-c581-4b95-a000-22e7c59757a7-kube-api-access-w474c\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.003922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003861 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.003922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003888 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.003922 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003916 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-web-config\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.004110 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.003949 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.004110 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:06:02.003979 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-config\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.004110 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.004022 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.004110 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.004056 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e491dc9e-c581-4b95-a000-22e7c59757a7-config-out\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.004110 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.004079 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.004110 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.004103 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.004413 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.004348 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.004749 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.004725 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.004867 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.004766 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.005615 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.005590 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.005843 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.005820 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e491dc9e-c581-4b95-a000-22e7c59757a7-prometheus-k8s-db\") 
pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.006422 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.006401 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.006517 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.006497 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e491dc9e-c581-4b95-a000-22e7c59757a7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.006988 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.006965 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-config\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.007879 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.007548 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.008876 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.008559 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-grpc-tls\") pod 
\"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.008876 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.008773 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.009275 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.009226 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-web-config\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.009360 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.009312 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.009494 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.009473 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.009615 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.009597 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/e491dc9e-c581-4b95-a000-22e7c59757a7-config-out\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.010163 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.010142 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e491dc9e-c581-4b95-a000-22e7c59757a7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.010284 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.010267 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e491dc9e-c581-4b95-a000-22e7c59757a7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.019035 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.019015 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w474c\" (UniqueName: \"kubernetes.io/projected/e491dc9e-c581-4b95-a000-22e7c59757a7-kube-api-access-w474c\") pod \"prometheus-k8s-0\" (UID: \"e491dc9e-c581-4b95-a000-22e7c59757a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.137971 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.137763 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:02.287336 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.287296 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 16:06:02.289345 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:06:02.289309 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode491dc9e_c581_4b95_a000_22e7c59757a7.slice/crio-38c88175c2db25c2edb709c1e65d7a680d626c59cb7bcc1a5bab06f52e32a0e3 WatchSource:0}: Error finding container 38c88175c2db25c2edb709c1e65d7a680d626c59cb7bcc1a5bab06f52e32a0e3: Status 404 returned error can't find the container with id 38c88175c2db25c2edb709c1e65d7a680d626c59cb7bcc1a5bab06f52e32a0e3 Apr 21 16:06:02.762452 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.762416 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rg8v9" event={"ID":"e022d7cd-e433-4f58-8b33-7c830d23f95c","Type":"ContainerStarted","Data":"b3f87c82edc08aa58400b1c5ee90873c2a196afe24c6db322c672433af147f8f"} Apr 21 16:06:02.762452 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.762452 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rg8v9" event={"ID":"e022d7cd-e433-4f58-8b33-7c830d23f95c","Type":"ContainerStarted","Data":"59756828792df73a49c5a8f377ccc5ad77f01390de4512305cc0a923c0d09b90"} Apr 21 16:06:02.763636 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.763614 2562 generic.go:358] "Generic (PLEG): container finished" podID="e491dc9e-c581-4b95-a000-22e7c59757a7" containerID="f4698150b8d449e79128557547dc00a8b40842cf17d52a891f2d3ad1bd81d804" exitCode=0 Apr 21 16:06:02.763702 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.763679 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e491dc9e-c581-4b95-a000-22e7c59757a7","Type":"ContainerDied","Data":"f4698150b8d449e79128557547dc00a8b40842cf17d52a891f2d3ad1bd81d804"} Apr 21 16:06:02.763702 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.763696 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491dc9e-c581-4b95-a000-22e7c59757a7","Type":"ContainerStarted","Data":"38c88175c2db25c2edb709c1e65d7a680d626c59cb7bcc1a5bab06f52e32a0e3"} Apr 21 16:06:02.789440 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.789394 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rg8v9" podStartSLOduration=253.847876366 podStartE2EDuration="4m14.789381134s" podCreationTimestamp="2026-04-21 16:01:48 +0000 UTC" firstStartedPulling="2026-04-21 16:06:01.051475175 +0000 UTC m=+252.796412958" lastFinishedPulling="2026-04-21 16:06:01.992979939 +0000 UTC m=+253.737917726" observedRunningTime="2026-04-21 16:06:02.787488261 +0000 UTC m=+254.532426067" watchObservedRunningTime="2026-04-21 16:06:02.789381134 +0000 UTC m=+254.534318939" Apr 21 16:06:02.911028 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:02.911002 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f383819-ded1-46ea-b736-11b2ae87bba7" path="/var/lib/kubelet/pods/8f383819-ded1-46ea-b736-11b2ae87bba7/volumes" Apr 21 16:06:03.769670 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:03.769631 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491dc9e-c581-4b95-a000-22e7c59757a7","Type":"ContainerStarted","Data":"89db061f4d9a440b8596a8a29357901a7af2b90562b43b7337d1c0ccd8b84824"} Apr 21 16:06:03.769670 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:03.769671 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e491dc9e-c581-4b95-a000-22e7c59757a7","Type":"ContainerStarted","Data":"0afefbf14774dfd76028c63cf0d31367fa375be09d73a175760b2de60133a477"} Apr 21 16:06:03.770157 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:03.769681 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491dc9e-c581-4b95-a000-22e7c59757a7","Type":"ContainerStarted","Data":"3025ab5586f94df267166bf9c752a37a9020b2e941c1e8b1a98ebdbce2511b57"} Apr 21 16:06:03.770157 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:03.769690 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491dc9e-c581-4b95-a000-22e7c59757a7","Type":"ContainerStarted","Data":"48d9dc90b4aff7724d3f2fcb5af76d849531fb8f1ef5897619dc4a711738c8cd"} Apr 21 16:06:03.770157 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:03.769702 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491dc9e-c581-4b95-a000-22e7c59757a7","Type":"ContainerStarted","Data":"5fc3709ad0850423394b5c6edd922ce552f98d12381c354aae1e78ee9d172d90"} Apr 21 16:06:03.770157 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:03.769710 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e491dc9e-c581-4b95-a000-22e7c59757a7","Type":"ContainerStarted","Data":"0f8d76384c8eb54f618e4bf10d899481e27270bc413456c9af659fd8d7587629"} Apr 21 16:06:03.806155 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:03.806107 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.806084071 podStartE2EDuration="2.806084071s" podCreationTimestamp="2026-04-21 16:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:06:03.802943138 +0000 UTC m=+255.547880956" 
watchObservedRunningTime="2026-04-21 16:06:03.806084071 +0000 UTC m=+255.551021876" Apr 21 16:06:07.138723 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:07.138684 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:06:17.244601 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:17.244563 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dd6c6bb55-p2wvg"] Apr 21 16:06:42.264085 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.264022 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6dd6c6bb55-p2wvg" podUID="4048a878-2e43-4099-8919-7a59e04e0dc0" containerName="console" containerID="cri-o://fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17" gracePeriod=15 Apr 21 16:06:42.501460 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.501438 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dd6c6bb55-p2wvg_4048a878-2e43-4099-8919-7a59e04e0dc0/console/0.log" Apr 21 16:06:42.501568 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.501495 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:06:42.533633 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.533561 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-service-ca\") pod \"4048a878-2e43-4099-8919-7a59e04e0dc0\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " Apr 21 16:06:42.533633 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.533607 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-console-config\") pod \"4048a878-2e43-4099-8919-7a59e04e0dc0\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " Apr 21 16:06:42.533838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.533641 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8zsq\" (UniqueName: \"kubernetes.io/projected/4048a878-2e43-4099-8919-7a59e04e0dc0-kube-api-access-s8zsq\") pod \"4048a878-2e43-4099-8919-7a59e04e0dc0\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " Apr 21 16:06:42.533838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.533673 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-trusted-ca-bundle\") pod \"4048a878-2e43-4099-8919-7a59e04e0dc0\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " Apr 21 16:06:42.533838 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.533756 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-oauth-serving-cert\") pod \"4048a878-2e43-4099-8919-7a59e04e0dc0\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " Apr 21 16:06:42.533992 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:06:42.533903 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-serving-cert\") pod \"4048a878-2e43-4099-8919-7a59e04e0dc0\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " Apr 21 16:06:42.533992 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.533959 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-oauth-config\") pod \"4048a878-2e43-4099-8919-7a59e04e0dc0\" (UID: \"4048a878-2e43-4099-8919-7a59e04e0dc0\") " Apr 21 16:06:42.534123 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.533964 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-service-ca" (OuterVolumeSpecName: "service-ca") pod "4048a878-2e43-4099-8919-7a59e04e0dc0" (UID: "4048a878-2e43-4099-8919-7a59e04e0dc0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:06:42.534123 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.534076 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-console-config" (OuterVolumeSpecName: "console-config") pod "4048a878-2e43-4099-8919-7a59e04e0dc0" (UID: "4048a878-2e43-4099-8919-7a59e04e0dc0"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:06:42.534231 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.534151 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4048a878-2e43-4099-8919-7a59e04e0dc0" (UID: "4048a878-2e43-4099-8919-7a59e04e0dc0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:06:42.534361 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.534345 2562 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-service-ca\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:42.534430 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.534363 2562 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-console-config\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:42.534430 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.534373 2562 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-oauth-serving-cert\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:42.534430 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.534368 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4048a878-2e43-4099-8919-7a59e04e0dc0" (UID: "4048a878-2e43-4099-8919-7a59e04e0dc0"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:06:42.536094 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.536070 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4048a878-2e43-4099-8919-7a59e04e0dc0-kube-api-access-s8zsq" (OuterVolumeSpecName: "kube-api-access-s8zsq") pod "4048a878-2e43-4099-8919-7a59e04e0dc0" (UID: "4048a878-2e43-4099-8919-7a59e04e0dc0"). InnerVolumeSpecName "kube-api-access-s8zsq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:06:42.536210 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.536186 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4048a878-2e43-4099-8919-7a59e04e0dc0" (UID: "4048a878-2e43-4099-8919-7a59e04e0dc0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:06:42.536284 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.536227 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4048a878-2e43-4099-8919-7a59e04e0dc0" (UID: "4048a878-2e43-4099-8919-7a59e04e0dc0"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:06:42.635421 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.635399 2562 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-serving-cert\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:42.635421 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.635420 2562 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4048a878-2e43-4099-8919-7a59e04e0dc0-console-oauth-config\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:42.635558 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.635431 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s8zsq\" (UniqueName: \"kubernetes.io/projected/4048a878-2e43-4099-8919-7a59e04e0dc0-kube-api-access-s8zsq\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:42.635558 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.635440 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4048a878-2e43-4099-8919-7a59e04e0dc0-trusted-ca-bundle\") on node \"ip-10-0-142-158.ec2.internal\" DevicePath \"\"" Apr 21 16:06:42.889663 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.889598 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dd6c6bb55-p2wvg_4048a878-2e43-4099-8919-7a59e04e0dc0/console/0.log" Apr 21 16:06:42.889663 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.889639 2562 generic.go:358] "Generic (PLEG): container finished" podID="4048a878-2e43-4099-8919-7a59e04e0dc0" containerID="fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17" exitCode=2 Apr 21 16:06:42.889887 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.889699 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-6dd6c6bb55-p2wvg" event={"ID":"4048a878-2e43-4099-8919-7a59e04e0dc0","Type":"ContainerDied","Data":"fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17"} Apr 21 16:06:42.889887 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.889711 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dd6c6bb55-p2wvg" Apr 21 16:06:42.889887 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.889730 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dd6c6bb55-p2wvg" event={"ID":"4048a878-2e43-4099-8919-7a59e04e0dc0","Type":"ContainerDied","Data":"fef6796b4a880440a2fe6bc3b9b4c6279a14fc0698239f0a67707076f3d94760"} Apr 21 16:06:42.889887 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.889750 2562 scope.go:117] "RemoveContainer" containerID="fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17" Apr 21 16:06:42.897967 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.897951 2562 scope.go:117] "RemoveContainer" containerID="fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17" Apr 21 16:06:42.898229 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:06:42.898206 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17\": container with ID starting with fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17 not found: ID does not exist" containerID="fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17" Apr 21 16:06:42.898273 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.898240 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17"} err="failed to get container status \"fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17\": rpc error: code = 
NotFound desc = could not find container \"fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17\": container with ID starting with fabac85b32904a2a4460611c2a9fff353bd6aa05877cb19e9299f4b78f714b17 not found: ID does not exist" Apr 21 16:06:42.913086 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.913060 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dd6c6bb55-p2wvg"] Apr 21 16:06:42.916483 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:42.916462 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6dd6c6bb55-p2wvg"] Apr 21 16:06:44.910126 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:44.910082 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4048a878-2e43-4099-8919-7a59e04e0dc0" path="/var/lib/kubelet/pods/4048a878-2e43-4099-8919-7a59e04e0dc0/volumes" Apr 21 16:06:48.782229 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:48.782184 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:06:48.782698 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:48.782679 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:06:48.800027 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:06:48.800011 2562 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 16:07:02.138568 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:07:02.138535 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:07:02.153762 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:07:02.153739 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:07:02.959868 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:07:02.959840 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 16:09:25.277149 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.277107 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc"] Apr 21 16:09:25.277560 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.277447 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4048a878-2e43-4099-8919-7a59e04e0dc0" containerName="console" Apr 21 16:09:25.277560 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.277461 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="4048a878-2e43-4099-8919-7a59e04e0dc0" containerName="console" Apr 21 16:09:25.277560 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.277532 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="4048a878-2e43-4099-8919-7a59e04e0dc0" containerName="console" Apr 21 16:09:25.280581 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.280562 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:25.284420 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.284401 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 16:09:25.284528 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.284475 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 16:09:25.284619 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.284605 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-5mxtt\"" Apr 21 16:09:25.284670 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.284627 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 16:09:25.285108 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.285094 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 16:09:25.296335 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.296315 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc"] Apr 21 16:09:25.421590 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.421514 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/799c2b2f-4eab-4074-8c0f-e8e634e28877-webhook-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-59rrc\" (UID: \"799c2b2f-4eab-4074-8c0f-e8e634e28877\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:25.421590 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.421555 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/799c2b2f-4eab-4074-8c0f-e8e634e28877-apiservice-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-59rrc\" (UID: \"799c2b2f-4eab-4074-8c0f-e8e634e28877\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:25.421755 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.421612 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kgv9\" (UniqueName: \"kubernetes.io/projected/799c2b2f-4eab-4074-8c0f-e8e634e28877-kube-api-access-8kgv9\") pod \"opendatahub-operator-controller-manager-774f54dc87-59rrc\" (UID: \"799c2b2f-4eab-4074-8c0f-e8e634e28877\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:25.523010 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.522980 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kgv9\" (UniqueName: \"kubernetes.io/projected/799c2b2f-4eab-4074-8c0f-e8e634e28877-kube-api-access-8kgv9\") pod \"opendatahub-operator-controller-manager-774f54dc87-59rrc\" (UID: \"799c2b2f-4eab-4074-8c0f-e8e634e28877\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:25.523160 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.523042 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/799c2b2f-4eab-4074-8c0f-e8e634e28877-webhook-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-59rrc\" (UID: \"799c2b2f-4eab-4074-8c0f-e8e634e28877\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:25.523160 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.523095 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/799c2b2f-4eab-4074-8c0f-e8e634e28877-apiservice-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-59rrc\" (UID: \"799c2b2f-4eab-4074-8c0f-e8e634e28877\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:25.525415 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.525386 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/799c2b2f-4eab-4074-8c0f-e8e634e28877-apiservice-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-59rrc\" (UID: \"799c2b2f-4eab-4074-8c0f-e8e634e28877\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:25.525525 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.525426 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/799c2b2f-4eab-4074-8c0f-e8e634e28877-webhook-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-59rrc\" (UID: \"799c2b2f-4eab-4074-8c0f-e8e634e28877\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:25.540096 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.540075 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kgv9\" (UniqueName: \"kubernetes.io/projected/799c2b2f-4eab-4074-8c0f-e8e634e28877-kube-api-access-8kgv9\") pod \"opendatahub-operator-controller-manager-774f54dc87-59rrc\" (UID: \"799c2b2f-4eab-4074-8c0f-e8e634e28877\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:25.590957 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.590930 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:25.678651 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.678583 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w"] Apr 21 16:09:25.683355 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.683331 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.687024 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.687003 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 16:09:25.687648 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.687480 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 16:09:25.687648 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.687503 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-mmz8p\"" Apr 21 16:09:25.687648 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.687519 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 16:09:25.687648 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.687519 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 16:09:25.687648 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.687487 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 16:09:25.692438 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.692413 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w"] Apr 21 16:09:25.738224 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.738203 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc"] Apr 21 16:09:25.739592 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:09:25.739565 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod799c2b2f_4eab_4074_8c0f_e8e634e28877.slice/crio-8bf11d34869f9a0113c13d766f5369be96b765d9d0bc3ed5d60e7cde2b19626d WatchSource:0}: Error finding container 8bf11d34869f9a0113c13d766f5369be96b765d9d0bc3ed5d60e7cde2b19626d: Status 404 returned error can't find the container with id 8bf11d34869f9a0113c13d766f5369be96b765d9d0bc3ed5d60e7cde2b19626d Apr 21 16:09:25.741265 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.741250 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:09:25.826326 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.826297 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6ae7e5a-34a0-497a-b449-df1e47f2575b-cert\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.826474 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.826347 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b6ae7e5a-34a0-497a-b449-df1e47f2575b-manager-config\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.826474 ip-10-0-142-158 kubenswrapper[2562]: I0421 
16:09:25.826369 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6ae7e5a-34a0-497a-b449-df1e47f2575b-metrics-cert\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.826562 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.826493 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sslvf\" (UniqueName: \"kubernetes.io/projected/b6ae7e5a-34a0-497a-b449-df1e47f2575b-kube-api-access-sslvf\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.927091 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.927060 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sslvf\" (UniqueName: \"kubernetes.io/projected/b6ae7e5a-34a0-497a-b449-df1e47f2575b-kube-api-access-sslvf\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.927228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.927107 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6ae7e5a-34a0-497a-b449-df1e47f2575b-cert\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.927228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.927138 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: 
\"kubernetes.io/configmap/b6ae7e5a-34a0-497a-b449-df1e47f2575b-manager-config\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.927228 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.927156 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6ae7e5a-34a0-497a-b449-df1e47f2575b-metrics-cert\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.927754 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.927735 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b6ae7e5a-34a0-497a-b449-df1e47f2575b-manager-config\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.929546 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.929494 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6ae7e5a-34a0-497a-b449-df1e47f2575b-cert\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.929627 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.929545 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6ae7e5a-34a0-497a-b449-df1e47f2575b-metrics-cert\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.937571 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:09:25.937551 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sslvf\" (UniqueName: \"kubernetes.io/projected/b6ae7e5a-34a0-497a-b449-df1e47f2575b-kube-api-access-sslvf\") pod \"lws-controller-manager-d98d6987c-jlt6w\" (UID: \"b6ae7e5a-34a0-497a-b449-df1e47f2575b\") " pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:25.995503 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:25.995480 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:26.187900 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:26.187818 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w"] Apr 21 16:09:26.187900 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:09:26.187818 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6ae7e5a_34a0_497a_b449_df1e47f2575b.slice/crio-355117487ec9f753f9ff3cbc2b6446d90e7970dcc7f2e46de6236943244eecd7 WatchSource:0}: Error finding container 355117487ec9f753f9ff3cbc2b6446d90e7970dcc7f2e46de6236943244eecd7: Status 404 returned error can't find the container with id 355117487ec9f753f9ff3cbc2b6446d90e7970dcc7f2e46de6236943244eecd7 Apr 21 16:09:26.353689 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:26.353651 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" event={"ID":"b6ae7e5a-34a0-497a-b449-df1e47f2575b","Type":"ContainerStarted","Data":"355117487ec9f753f9ff3cbc2b6446d90e7970dcc7f2e46de6236943244eecd7"} Apr 21 16:09:26.354824 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:26.354776 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" 
event={"ID":"799c2b2f-4eab-4074-8c0f-e8e634e28877","Type":"ContainerStarted","Data":"8bf11d34869f9a0113c13d766f5369be96b765d9d0bc3ed5d60e7cde2b19626d"} Apr 21 16:09:29.369510 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:29.369476 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" event={"ID":"799c2b2f-4eab-4074-8c0f-e8e634e28877","Type":"ContainerStarted","Data":"21ad76314c68007f14a2fab6dd7fabf2d9dc6976fc1ca5a6378bb4c9a0959147"} Apr 21 16:09:29.369929 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:29.369646 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:29.394802 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:29.394698 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" podStartSLOduration=1.832427743 podStartE2EDuration="4.394680188s" podCreationTimestamp="2026-04-21 16:09:25 +0000 UTC" firstStartedPulling="2026-04-21 16:09:25.741363698 +0000 UTC m=+457.486301481" lastFinishedPulling="2026-04-21 16:09:28.303616139 +0000 UTC m=+460.048553926" observedRunningTime="2026-04-21 16:09:29.39293763 +0000 UTC m=+461.137875439" watchObservedRunningTime="2026-04-21 16:09:29.394680188 +0000 UTC m=+461.139617993" Apr 21 16:09:30.375154 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:30.375068 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" event={"ID":"b6ae7e5a-34a0-497a-b449-df1e47f2575b","Type":"ContainerStarted","Data":"865ff456077ca9f61652bffcbb2e65aab26d37194f0545b5e662b29c14fa9c56"} Apr 21 16:09:30.375608 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:30.375193 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:30.394980 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:30.394939 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" podStartSLOduration=1.5360036620000002 podStartE2EDuration="5.394928297s" podCreationTimestamp="2026-04-21 16:09:25 +0000 UTC" firstStartedPulling="2026-04-21 16:09:26.189881038 +0000 UTC m=+457.934818834" lastFinishedPulling="2026-04-21 16:09:30.048805682 +0000 UTC m=+461.793743469" observedRunningTime="2026-04-21 16:09:30.394710819 +0000 UTC m=+462.139648625" watchObservedRunningTime="2026-04-21 16:09:30.394928297 +0000 UTC m=+462.139866115" Apr 21 16:09:40.377510 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:40.377481 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-59rrc" Apr 21 16:09:41.381162 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:41.381133 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-d98d6987c-jlt6w" Apr 21 16:09:44.885183 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:44.885154 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm"] Apr 21 16:09:44.888285 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:44.888270 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" Apr 21 16:09:44.890908 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:44.890885 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 16:09:44.890908 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:44.890887 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 16:09:44.892276 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:44.892259 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 16:09:44.892370 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:44.892294 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-97p9h\"" Apr 21 16:09:44.892370 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:44.892358 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 16:09:44.905335 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:44.905316 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm"] Apr 21 16:09:44.970638 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:44.970613 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64d99728-f189-4ea9-9c45-de36d549fbae-tmp\") pod \"kube-auth-proxy-5598cc66fd-wwcdm\" (UID: \"64d99728-f189-4ea9-9c45-de36d549fbae\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" Apr 21 16:09:44.970800 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:44.970651 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7md2k\" (UniqueName: 
\"kubernetes.io/projected/64d99728-f189-4ea9-9c45-de36d549fbae-kube-api-access-7md2k\") pod \"kube-auth-proxy-5598cc66fd-wwcdm\" (UID: \"64d99728-f189-4ea9-9c45-de36d549fbae\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" Apr 21 16:09:44.970800 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:44.970681 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64d99728-f189-4ea9-9c45-de36d549fbae-tls-certs\") pod \"kube-auth-proxy-5598cc66fd-wwcdm\" (UID: \"64d99728-f189-4ea9-9c45-de36d549fbae\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" Apr 21 16:09:45.071058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:45.071035 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64d99728-f189-4ea9-9c45-de36d549fbae-tmp\") pod \"kube-auth-proxy-5598cc66fd-wwcdm\" (UID: \"64d99728-f189-4ea9-9c45-de36d549fbae\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" Apr 21 16:09:45.071173 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:45.071145 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7md2k\" (UniqueName: \"kubernetes.io/projected/64d99728-f189-4ea9-9c45-de36d549fbae-kube-api-access-7md2k\") pod \"kube-auth-proxy-5598cc66fd-wwcdm\" (UID: \"64d99728-f189-4ea9-9c45-de36d549fbae\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" Apr 21 16:09:45.071214 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:45.071174 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64d99728-f189-4ea9-9c45-de36d549fbae-tls-certs\") pod \"kube-auth-proxy-5598cc66fd-wwcdm\" (UID: \"64d99728-f189-4ea9-9c45-de36d549fbae\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" Apr 21 16:09:45.073161 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:45.073140 
2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/64d99728-f189-4ea9-9c45-de36d549fbae-tmp\") pod \"kube-auth-proxy-5598cc66fd-wwcdm\" (UID: \"64d99728-f189-4ea9-9c45-de36d549fbae\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" Apr 21 16:09:45.073416 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:45.073400 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/64d99728-f189-4ea9-9c45-de36d549fbae-tls-certs\") pod \"kube-auth-proxy-5598cc66fd-wwcdm\" (UID: \"64d99728-f189-4ea9-9c45-de36d549fbae\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" Apr 21 16:09:45.081058 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:45.081029 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7md2k\" (UniqueName: \"kubernetes.io/projected/64d99728-f189-4ea9-9c45-de36d549fbae-kube-api-access-7md2k\") pod \"kube-auth-proxy-5598cc66fd-wwcdm\" (UID: \"64d99728-f189-4ea9-9c45-de36d549fbae\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" Apr 21 16:09:45.198004 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:45.197945 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" Apr 21 16:09:45.319750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:45.319725 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm"] Apr 21 16:09:45.321604 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:09:45.321573 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64d99728_f189_4ea9_9c45_de36d549fbae.slice/crio-54df2b70a012fe557d3fdd3849bb3425dd7e199eb299097de235a36086934c47 WatchSource:0}: Error finding container 54df2b70a012fe557d3fdd3849bb3425dd7e199eb299097de235a36086934c47: Status 404 returned error can't find the container with id 54df2b70a012fe557d3fdd3849bb3425dd7e199eb299097de235a36086934c47 Apr 21 16:09:45.425162 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:45.425131 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" event={"ID":"64d99728-f189-4ea9-9c45-de36d549fbae","Type":"ContainerStarted","Data":"54df2b70a012fe557d3fdd3849bb3425dd7e199eb299097de235a36086934c47"} Apr 21 16:09:49.162075 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:49.162037 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 16:09:49.440711 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:49.440633 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" event={"ID":"64d99728-f189-4ea9-9c45-de36d549fbae","Type":"ContainerStarted","Data":"1e953e2c821118e9813d2016a86932555415b15991f4c097cbb9267457ac8f4a"} Apr 21 16:09:49.459458 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:09:49.459414 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5598cc66fd-wwcdm" podStartSLOduration=1.624345977 
podStartE2EDuration="5.459399743s" podCreationTimestamp="2026-04-21 16:09:44 +0000 UTC" firstStartedPulling="2026-04-21 16:09:45.323334141 +0000 UTC m=+477.068271925" lastFinishedPulling="2026-04-21 16:09:49.158387905 +0000 UTC m=+480.903325691" observedRunningTime="2026-04-21 16:09:49.4581947 +0000 UTC m=+481.203132526" watchObservedRunningTime="2026-04-21 16:09:49.459399743 +0000 UTC m=+481.204337636" Apr 21 16:11:37.899206 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:37.899176 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt"] Apr 21 16:11:37.901371 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:37.901355 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:37.904370 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:37.904349 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 21 16:11:37.904872 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:37.904855 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-hpb5g\"" Apr 21 16:11:37.905806 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:37.905773 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 21 16:11:37.905900 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:37.905887 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 16:11:37.905965 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:37.905946 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 16:11:37.926828 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:37.926807 2562 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt"] Apr 21 16:11:38.012875 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.012835 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/375b55eb-6f78-449f-a34a-d4fe300b233a-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-xm5tt\" (UID: \"375b55eb-6f78-449f-a34a-d4fe300b233a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:38.013071 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.012943 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hbcl\" (UniqueName: \"kubernetes.io/projected/375b55eb-6f78-449f-a34a-d4fe300b233a-kube-api-access-6hbcl\") pod \"kuadrant-console-plugin-6cb54b5c86-xm5tt\" (UID: \"375b55eb-6f78-449f-a34a-d4fe300b233a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:38.013071 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.013010 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/375b55eb-6f78-449f-a34a-d4fe300b233a-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-xm5tt\" (UID: \"375b55eb-6f78-449f-a34a-d4fe300b233a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:38.114200 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.114163 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/375b55eb-6f78-449f-a34a-d4fe300b233a-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-xm5tt\" (UID: \"375b55eb-6f78-449f-a34a-d4fe300b233a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:38.114384 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.114229 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/375b55eb-6f78-449f-a34a-d4fe300b233a-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-xm5tt\" (UID: \"375b55eb-6f78-449f-a34a-d4fe300b233a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:38.114384 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.114265 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hbcl\" (UniqueName: \"kubernetes.io/projected/375b55eb-6f78-449f-a34a-d4fe300b233a-kube-api-access-6hbcl\") pod \"kuadrant-console-plugin-6cb54b5c86-xm5tt\" (UID: \"375b55eb-6f78-449f-a34a-d4fe300b233a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:38.114384 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:11:38.114327 2562 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 21 16:11:38.114573 ip-10-0-142-158 kubenswrapper[2562]: E0421 16:11:38.114421 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/375b55eb-6f78-449f-a34a-d4fe300b233a-plugin-serving-cert podName:375b55eb-6f78-449f-a34a-d4fe300b233a nodeName:}" failed. No retries permitted until 2026-04-21 16:11:38.614397978 +0000 UTC m=+590.359335760 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/375b55eb-6f78-449f-a34a-d4fe300b233a-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-xm5tt" (UID: "375b55eb-6f78-449f-a34a-d4fe300b233a") : secret "plugin-serving-cert" not found Apr 21 16:11:38.115049 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.115028 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/375b55eb-6f78-449f-a34a-d4fe300b233a-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-xm5tt\" (UID: \"375b55eb-6f78-449f-a34a-d4fe300b233a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:38.130404 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.130378 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hbcl\" (UniqueName: \"kubernetes.io/projected/375b55eb-6f78-449f-a34a-d4fe300b233a-kube-api-access-6hbcl\") pod \"kuadrant-console-plugin-6cb54b5c86-xm5tt\" (UID: \"375b55eb-6f78-449f-a34a-d4fe300b233a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:38.618414 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.618374 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/375b55eb-6f78-449f-a34a-d4fe300b233a-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-xm5tt\" (UID: \"375b55eb-6f78-449f-a34a-d4fe300b233a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:38.620827 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.620778 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/375b55eb-6f78-449f-a34a-d4fe300b233a-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-xm5tt\" (UID: \"375b55eb-6f78-449f-a34a-d4fe300b233a\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:38.820050 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.820020 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" Apr 21 16:11:38.946662 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:38.946638 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt"] Apr 21 16:11:38.949029 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:11:38.949001 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod375b55eb_6f78_449f_a34a_d4fe300b233a.slice/crio-b85ec9565266238e8ab70df73b2041bb6ef7d060549b347e95914b1e73ba6efc WatchSource:0}: Error finding container b85ec9565266238e8ab70df73b2041bb6ef7d060549b347e95914b1e73ba6efc: Status 404 returned error can't find the container with id b85ec9565266238e8ab70df73b2041bb6ef7d060549b347e95914b1e73ba6efc Apr 21 16:11:39.803409 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:39.803368 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" event={"ID":"375b55eb-6f78-449f-a34a-d4fe300b233a","Type":"ContainerStarted","Data":"b85ec9565266238e8ab70df73b2041bb6ef7d060549b347e95914b1e73ba6efc"} Apr 21 16:11:48.815399 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:48.815352 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:11:48.816676 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:11:48.816653 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:12:05.899860 ip-10-0-142-158 kubenswrapper[2562]: I0421 
16:12:05.899824 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" event={"ID":"375b55eb-6f78-449f-a34a-d4fe300b233a","Type":"ContainerStarted","Data":"c536c0c9e088a1ca17dfd240aee2e15c2035693408f3abb13809a1490d6593dc"} Apr 21 16:12:05.919756 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:05.919701 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xm5tt" podStartSLOduration=2.6250354270000003 podStartE2EDuration="28.919688991s" podCreationTimestamp="2026-04-21 16:11:37 +0000 UTC" firstStartedPulling="2026-04-21 16:11:38.950287319 +0000 UTC m=+590.695225102" lastFinishedPulling="2026-04-21 16:12:05.244940868 +0000 UTC m=+616.989878666" observedRunningTime="2026-04-21 16:12:05.919170256 +0000 UTC m=+617.664108064" watchObservedRunningTime="2026-04-21 16:12:05.919688991 +0000 UTC m=+617.664626795" Apr 21 16:12:26.303588 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.303551 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:12:26.353887 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.353859 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:12:26.353887 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.353885 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:12:26.354081 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.354002 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" Apr 21 16:12:26.356972 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.356947 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 16:12:26.438311 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.438281 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cc2acb96-33c6-4635-b84c-e3800e4f8460-config-file\") pod \"limitador-limitador-78c99df468-wsqc4\" (UID: \"cc2acb96-33c6-4635-b84c-e3800e4f8460\") " pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" Apr 21 16:12:26.438442 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.438317 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbn2w\" (UniqueName: \"kubernetes.io/projected/cc2acb96-33c6-4635-b84c-e3800e4f8460-kube-api-access-nbn2w\") pod \"limitador-limitador-78c99df468-wsqc4\" (UID: \"cc2acb96-33c6-4635-b84c-e3800e4f8460\") " pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" Apr 21 16:12:26.540010 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.539982 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cc2acb96-33c6-4635-b84c-e3800e4f8460-config-file\") pod \"limitador-limitador-78c99df468-wsqc4\" (UID: \"cc2acb96-33c6-4635-b84c-e3800e4f8460\") " pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" Apr 21 16:12:26.540149 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.540018 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbn2w\" (UniqueName: \"kubernetes.io/projected/cc2acb96-33c6-4635-b84c-e3800e4f8460-kube-api-access-nbn2w\") pod \"limitador-limitador-78c99df468-wsqc4\" (UID: 
\"cc2acb96-33c6-4635-b84c-e3800e4f8460\") " pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" Apr 21 16:12:26.540541 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.540522 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cc2acb96-33c6-4635-b84c-e3800e4f8460-config-file\") pod \"limitador-limitador-78c99df468-wsqc4\" (UID: \"cc2acb96-33c6-4635-b84c-e3800e4f8460\") " pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" Apr 21 16:12:26.548730 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.548703 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbn2w\" (UniqueName: \"kubernetes.io/projected/cc2acb96-33c6-4635-b84c-e3800e4f8460-kube-api-access-nbn2w\") pod \"limitador-limitador-78c99df468-wsqc4\" (UID: \"cc2acb96-33c6-4635-b84c-e3800e4f8460\") " pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" Apr 21 16:12:26.663697 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.663624 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" Apr 21 16:12:26.786674 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.786642 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:12:26.790947 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:12:26.790920 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc2acb96_33c6_4635_b84c_e3800e4f8460.slice/crio-dd42cd3bb933c18695c1340f4b3dc8389b1ea66dad0794baa9df4d868010aed7 WatchSource:0}: Error finding container dd42cd3bb933c18695c1340f4b3dc8389b1ea66dad0794baa9df4d868010aed7: Status 404 returned error can't find the container with id dd42cd3bb933c18695c1340f4b3dc8389b1ea66dad0794baa9df4d868010aed7 Apr 21 16:12:26.968063 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:26.967979 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" event={"ID":"cc2acb96-33c6-4635-b84c-e3800e4f8460","Type":"ContainerStarted","Data":"dd42cd3bb933c18695c1340f4b3dc8389b1ea66dad0794baa9df4d868010aed7"} Apr 21 16:12:30.983954 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:30.983877 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" event={"ID":"cc2acb96-33c6-4635-b84c-e3800e4f8460","Type":"ContainerStarted","Data":"5267ae11920106c444a4c8b87c00fb3b929b597dc0f921341f5e5965824e6c45"} Apr 21 16:12:30.983954 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:30.983947 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" Apr 21 16:12:31.002940 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:31.002895 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" podStartSLOduration=1.307401021 
podStartE2EDuration="5.002883794s" podCreationTimestamp="2026-04-21 16:12:26 +0000 UTC" firstStartedPulling="2026-04-21 16:12:26.792605853 +0000 UTC m=+638.537543637" lastFinishedPulling="2026-04-21 16:12:30.48808862 +0000 UTC m=+642.233026410" observedRunningTime="2026-04-21 16:12:31.001067544 +0000 UTC m=+642.746005349" watchObservedRunningTime="2026-04-21 16:12:31.002883794 +0000 UTC m=+642.747821600" Apr 21 16:12:41.988519 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:12:41.988487 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-wsqc4" Apr 21 16:13:07.939852 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:07.939818 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:13:37.420423 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:37.420345 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:13:46.790257 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:46.790226 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7fc6cd87c-8c8v4"] Apr 21 16:13:46.792428 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:46.792410 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7fc6cd87c-8c8v4" Apr 21 16:13:46.795911 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:46.795879 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-hhkxk\"" Apr 21 16:13:46.796031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:46.795959 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 16:13:46.796031 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:46.795970 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 16:13:46.802160 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:46.802138 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7fc6cd87c-8c8v4"] Apr 21 16:13:46.921529 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:46.921500 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/60cc832e-6b90-4416-baea-170c655b093c-maas-api-tls\") pod \"maas-api-7fc6cd87c-8c8v4\" (UID: \"60cc832e-6b90-4416-baea-170c655b093c\") " pod="opendatahub/maas-api-7fc6cd87c-8c8v4" Apr 21 16:13:46.921673 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:46.921560 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpk89\" (UniqueName: \"kubernetes.io/projected/60cc832e-6b90-4416-baea-170c655b093c-kube-api-access-dpk89\") pod \"maas-api-7fc6cd87c-8c8v4\" (UID: \"60cc832e-6b90-4416-baea-170c655b093c\") " pod="opendatahub/maas-api-7fc6cd87c-8c8v4" Apr 21 16:13:46.959462 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:46.959426 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:13:47.022298 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:47.022271 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/60cc832e-6b90-4416-baea-170c655b093c-maas-api-tls\") pod \"maas-api-7fc6cd87c-8c8v4\" (UID: \"60cc832e-6b90-4416-baea-170c655b093c\") " pod="opendatahub/maas-api-7fc6cd87c-8c8v4" Apr 21 16:13:47.022673 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:47.022647 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpk89\" (UniqueName: \"kubernetes.io/projected/60cc832e-6b90-4416-baea-170c655b093c-kube-api-access-dpk89\") pod \"maas-api-7fc6cd87c-8c8v4\" (UID: \"60cc832e-6b90-4416-baea-170c655b093c\") " pod="opendatahub/maas-api-7fc6cd87c-8c8v4" Apr 21 16:13:47.024693 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:47.024674 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/60cc832e-6b90-4416-baea-170c655b093c-maas-api-tls\") pod \"maas-api-7fc6cd87c-8c8v4\" (UID: \"60cc832e-6b90-4416-baea-170c655b093c\") " pod="opendatahub/maas-api-7fc6cd87c-8c8v4" Apr 21 16:13:47.040996 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:47.040938 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpk89\" (UniqueName: \"kubernetes.io/projected/60cc832e-6b90-4416-baea-170c655b093c-kube-api-access-dpk89\") pod \"maas-api-7fc6cd87c-8c8v4\" (UID: \"60cc832e-6b90-4416-baea-170c655b093c\") " pod="opendatahub/maas-api-7fc6cd87c-8c8v4" Apr 21 16:13:47.102811 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:47.102773 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7fc6cd87c-8c8v4" Apr 21 16:13:47.445884 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:47.445843 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7fc6cd87c-8c8v4"] Apr 21 16:13:47.448451 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:13:47.448423 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60cc832e_6b90_4416_baea_170c655b093c.slice/crio-688f37ab8bd6d2ef5225ce0ec31a8bdc05a97ef7f03499973efeaee8f29f04ee WatchSource:0}: Error finding container 688f37ab8bd6d2ef5225ce0ec31a8bdc05a97ef7f03499973efeaee8f29f04ee: Status 404 returned error can't find the container with id 688f37ab8bd6d2ef5225ce0ec31a8bdc05a97ef7f03499973efeaee8f29f04ee Apr 21 16:13:48.238223 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:48.238171 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7fc6cd87c-8c8v4" event={"ID":"60cc832e-6b90-4416-baea-170c655b093c","Type":"ContainerStarted","Data":"688f37ab8bd6d2ef5225ce0ec31a8bdc05a97ef7f03499973efeaee8f29f04ee"} Apr 21 16:13:48.771543 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:48.771507 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:13:49.577341 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:49.577318 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 16:13:50.246055 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:50.246015 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7fc6cd87c-8c8v4" event={"ID":"60cc832e-6b90-4416-baea-170c655b093c","Type":"ContainerStarted","Data":"7ccc08bc6e09128623ae9e0fd27de308828f2af77643c4bf85127b19f5bd4bbf"} Apr 21 16:13:50.246237 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:50.246125 2562 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="opendatahub/maas-api-7fc6cd87c-8c8v4" Apr 21 16:13:50.269304 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:50.269259 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7fc6cd87c-8c8v4" podStartSLOduration=2.144152258 podStartE2EDuration="4.269246781s" podCreationTimestamp="2026-04-21 16:13:46 +0000 UTC" firstStartedPulling="2026-04-21 16:13:47.449807602 +0000 UTC m=+719.194745385" lastFinishedPulling="2026-04-21 16:13:49.574902112 +0000 UTC m=+721.319839908" observedRunningTime="2026-04-21 16:13:50.266723063 +0000 UTC m=+722.011660867" watchObservedRunningTime="2026-04-21 16:13:50.269246781 +0000 UTC m=+722.014184639" Apr 21 16:13:56.254937 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:56.254908 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7fc6cd87c-8c8v4" Apr 21 16:13:57.774546 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:13:57.774508 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:14:03.186227 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:14:03.186191 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:14:23.572635 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:14:23.572600 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:14:35.764630 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:14:35.764595 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:15:45.671556 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:15:45.671520 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:15:56.271758 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:15:56.271724 2562 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:16:05.664699 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:16:05.664665 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:16:16.428950 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:16:16.428915 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:16:24.563707 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:16:24.563665 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:16:34.866812 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:16:34.866761 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:16:48.840301 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:16:48.840220 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:16:48.842276 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:16:48.842250 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:17:37.464908 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:17:37.464871 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:17:53.072556 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:17:53.072522 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:18:32.066345 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:18:32.066260 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:18:47.860126 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:18:47.860087 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:19:03.172597 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:19:03.172559 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:19:19.470445 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:19:19.470408 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:19:46.391767 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:19:46.391694 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:19:50.575744 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:19:50.575707 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:20:12.872963 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:20:12.872926 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:20:22.577405 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:20:22.577368 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:20:38.979404 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:20:38.979369 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:20:47.092446 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:20:47.092403 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:21:04.474667 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:21:04.474636 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:21:12.479498 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:21:12.479418 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:21:45.767814 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:21:45.767757 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:21:48.863590 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:21:48.863553 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:21:48.868411 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:21:48.868388 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:21:53.273817 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:21:53.273136 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:22:01.878234 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:22:01.878196 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:22:10.280395 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:22:10.280359 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:22:18.965587 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:22:18.965552 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:22:36.271037 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:22:36.271002 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:22:49.178426 
ip-10-0-142-158 kubenswrapper[2562]: I0421 16:22:49.178349 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:23:35.971032 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:23:35.970993 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:23:44.024935 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:23:44.024902 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:23:53.472287 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:23:53.472253 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:24:02.686325 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:24:02.686291 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:24:11.274522 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:24:11.274443 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:24:19.479828 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:24:19.479796 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:24:28.979891 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:24:28.979853 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:24:36.881813 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:24:36.881765 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:24:45.878946 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:24:45.878914 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:24:54.167667 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:24:54.167632 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:25:04.177298 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:25:04.177265 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:25:12.662504 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:25:12.662467 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:25:21.816174 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:25:21.816134 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:25:29.091068 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:25:29.091037 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:25:39.088500 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:25:39.088420 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:25:46.249172 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:25:46.249141 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:25:56.974443 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:25:56.974409 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:26:04.786363 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:26:04.786329 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:26:48.887284 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:26:48.887244 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:26:48.892529 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:26:48.892507 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:28:20.774457 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:28:20.774418 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:28:28.066839 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:28:28.066806 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:28:53.077980 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:28:53.077890 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:29:00.265515 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:29:00.265479 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:29:09.577298 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:29:09.577265 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:29:19.669001 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:29:19.668963 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:29:27.769630 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:29:27.769597 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:29:38.863428 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:29:38.863394 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 
16:29:47.682399 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:29:47.682364 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:29:58.586759 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:29:58.586725 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:30:07.477357 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:30:07.477277 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:30:16.880706 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:30:16.880670 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:30:27.487224 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:30:27.487193 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:31:00.271735 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:31:00.271702 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:31:43.384958 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:31:43.384924 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:31:48.917333 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:31:48.917292 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:31:48.924688 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:31:48.924664 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:31:51.587267 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:31:51.587235 2562 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:31:59.701335 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:31:59.701301 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:32:08.474449 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:32:08.474411 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:32:18.087204 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:32:18.085103 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:32:28.585532 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:32:28.585489 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:32:36.814708 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:32:36.814665 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:32:44.681845 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:32:44.681810 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:32:53.588251 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:32:53.588212 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:33:01.800618 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:33:01.800586 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:33:09.681340 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:33:09.681261 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:33:22.673991 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:33:22.673953 2562 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:33:39.377595 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:33:39.377560 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:33:47.751630 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:33:47.751595 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:33:56.892462 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:33:56.892425 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:34:04.781312 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:34:04.781280 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:34:23.182376 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:34:23.182345 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:34:30.681211 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:34:30.681173 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:34:40.182054 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:34:40.181967 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:34:48.085071 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:34:48.085032 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:34:57.164329 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:34:57.164292 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:35:05.576074 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:35:05.576035 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:35:14.298522 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:35:14.298485 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:35:28.067117 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:35:28.067081 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:35:37.966984 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:35:37.966950 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:35:47.865827 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:35:47.865793 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:35:56.680134 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:35:56.680098 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:36:03.379478 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:36:03.379443 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:36:14.412102 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:36:14.412013 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:36:21.671878 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:36:21.671841 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:36:37.369074 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:36:37.369038 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:36:45.770640 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:36:45.770605 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:36:48.941183 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:36:48.941150 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:36:48.949585 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:36:48.949559 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log" Apr 21 16:36:54.662208 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:36:54.662174 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:36:59.986185 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:36:59.986149 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:37:25.261889 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:25.261855 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:37:36.470328 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:36.470290 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wsqc4"] Apr 21 16:37:42.598799 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:42.598712 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7fc6cd87c-8c8v4_60cc832e-6b90-4416-baea-170c655b093c/maas-api/0.log" Apr 21 16:37:42.936235 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:42.936155 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-774f54dc87-59rrc_799c2b2f-4eab-4074-8c0f-e8e634e28877/manager/0.log" Apr 21 16:37:44.884563 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:44.884535 2562 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-xm5tt_375b55eb-6f78-449f-a34a-d4fe300b233a/kuadrant-console-plugin/0.log" Apr 21 16:37:45.225710 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:45.225634 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-wsqc4_cc2acb96-33c6-4635-b84c-e3800e4f8460/limitador/0.log" Apr 21 16:37:46.027156 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:46.027127 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5598cc66fd-wwcdm_64d99728-f189-4ea9-9c45-de36d549fbae/kube-auth-proxy/0.log" Apr 21 16:37:50.728851 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:50.728813 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zttst/must-gather-zg6sr"] Apr 21 16:37:50.732126 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:50.732111 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zttst/must-gather-zg6sr" Apr 21 16:37:50.734950 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:50.734923 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zttst\"/\"openshift-service-ca.crt\"" Apr 21 16:37:50.735816 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:50.735777 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zttst\"/\"kube-root-ca.crt\"" Apr 21 16:37:50.736878 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:50.736860 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zttst\"/\"default-dockercfg-568sq\"" Apr 21 16:37:50.760858 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:50.760836 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zttst/must-gather-zg6sr"] Apr 21 16:37:50.816801 ip-10-0-142-158 kubenswrapper[2562]: I0421 
16:37:50.816763 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/afcdb09b-6b1d-4947-9449-6a0c5ced6d65-must-gather-output\") pod \"must-gather-zg6sr\" (UID: \"afcdb09b-6b1d-4947-9449-6a0c5ced6d65\") " pod="openshift-must-gather-zttst/must-gather-zg6sr" Apr 21 16:37:50.816899 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:50.816807 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhrng\" (UniqueName: \"kubernetes.io/projected/afcdb09b-6b1d-4947-9449-6a0c5ced6d65-kube-api-access-qhrng\") pod \"must-gather-zg6sr\" (UID: \"afcdb09b-6b1d-4947-9449-6a0c5ced6d65\") " pod="openshift-must-gather-zttst/must-gather-zg6sr" Apr 21 16:37:50.917682 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:50.917655 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/afcdb09b-6b1d-4947-9449-6a0c5ced6d65-must-gather-output\") pod \"must-gather-zg6sr\" (UID: \"afcdb09b-6b1d-4947-9449-6a0c5ced6d65\") " pod="openshift-must-gather-zttst/must-gather-zg6sr" Apr 21 16:37:50.917682 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:50.917684 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhrng\" (UniqueName: \"kubernetes.io/projected/afcdb09b-6b1d-4947-9449-6a0c5ced6d65-kube-api-access-qhrng\") pod \"must-gather-zg6sr\" (UID: \"afcdb09b-6b1d-4947-9449-6a0c5ced6d65\") " pod="openshift-must-gather-zttst/must-gather-zg6sr" Apr 21 16:37:50.918101 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:50.918080 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/afcdb09b-6b1d-4947-9449-6a0c5ced6d65-must-gather-output\") pod \"must-gather-zg6sr\" (UID: \"afcdb09b-6b1d-4947-9449-6a0c5ced6d65\") " 
pod="openshift-must-gather-zttst/must-gather-zg6sr" Apr 21 16:37:50.929179 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:50.929154 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhrng\" (UniqueName: \"kubernetes.io/projected/afcdb09b-6b1d-4947-9449-6a0c5ced6d65-kube-api-access-qhrng\") pod \"must-gather-zg6sr\" (UID: \"afcdb09b-6b1d-4947-9449-6a0c5ced6d65\") " pod="openshift-must-gather-zttst/must-gather-zg6sr" Apr 21 16:37:51.040746 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:51.040724 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zttst/must-gather-zg6sr" Apr 21 16:37:51.366234 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:51.366212 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zttst/must-gather-zg6sr"] Apr 21 16:37:51.369108 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:37:51.369077 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafcdb09b_6b1d_4947_9449_6a0c5ced6d65.slice/crio-fd5164871e38e2c4d2f561cfb660d19f1384ea2cb3a74e1cfe49fd2bfd56be53 WatchSource:0}: Error finding container fd5164871e38e2c4d2f561cfb660d19f1384ea2cb3a74e1cfe49fd2bfd56be53: Status 404 returned error can't find the container with id fd5164871e38e2c4d2f561cfb660d19f1384ea2cb3a74e1cfe49fd2bfd56be53 Apr 21 16:37:51.370828 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:51.370808 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:37:52.005185 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:52.005144 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zttst/must-gather-zg6sr" event={"ID":"afcdb09b-6b1d-4947-9449-6a0c5ced6d65","Type":"ContainerStarted","Data":"fd5164871e38e2c4d2f561cfb660d19f1384ea2cb3a74e1cfe49fd2bfd56be53"} Apr 21 16:37:53.010341 ip-10-0-142-158 
kubenswrapper[2562]: I0421 16:37:53.010299 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zttst/must-gather-zg6sr" event={"ID":"afcdb09b-6b1d-4947-9449-6a0c5ced6d65","Type":"ContainerStarted","Data":"6c1f10a49c8e64988a0aae140ae45d68290100c48c54e73dc9b11e251f1dd3c6"}
Apr 21 16:37:53.010341 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:53.010346 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zttst/must-gather-zg6sr" event={"ID":"afcdb09b-6b1d-4947-9449-6a0c5ced6d65","Type":"ContainerStarted","Data":"5c1b2c6ce66f563a911920a835fe415a93120dafc5376510c03aa011413f86ec"}
Apr 21 16:37:53.031246 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:53.031198 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zttst/must-gather-zg6sr" podStartSLOduration=2.313723752 podStartE2EDuration="3.031186417s" podCreationTimestamp="2026-04-21 16:37:50 +0000 UTC" firstStartedPulling="2026-04-21 16:37:51.370940939 +0000 UTC m=+2163.115878724" lastFinishedPulling="2026-04-21 16:37:52.088403604 +0000 UTC m=+2163.833341389" observedRunningTime="2026-04-21 16:37:53.029516823 +0000 UTC m=+2164.774454628" watchObservedRunningTime="2026-04-21 16:37:53.031186417 +0000 UTC m=+2164.776124217"
Apr 21 16:37:53.997603 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:53.997572 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wpnfj_427cd153-57f3-494e-8f29-f4e3e984756d/global-pull-secret-syncer/0.log"
Apr 21 16:37:54.146373 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:54.146330 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7n47b_a844703d-9a8a-4877-a840-e850e06f82b0/konnectivity-agent/0.log"
Apr 21 16:37:54.242835 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:54.242804 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-158.ec2.internal_ee115be6bbf3231206ae6c74733c2779/haproxy/0.log"
Apr 21 16:37:58.646405 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:58.646372 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-xm5tt_375b55eb-6f78-449f-a34a-d4fe300b233a/kuadrant-console-plugin/0.log"
Apr 21 16:37:58.847965 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:37:58.847839 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-wsqc4_cc2acb96-33c6-4635-b84c-e3800e4f8460/limitador/0.log"
Apr 21 16:38:00.533070 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:00.533035 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e59172a8-fd83-4930-9415-2cc933f5953b/alertmanager/0.log"
Apr 21 16:38:00.568584 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:00.568516 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e59172a8-fd83-4930-9415-2cc933f5953b/config-reloader/0.log"
Apr 21 16:38:00.598501 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:00.598435 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e59172a8-fd83-4930-9415-2cc933f5953b/kube-rbac-proxy-web/0.log"
Apr 21 16:38:00.626101 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:00.626074 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e59172a8-fd83-4930-9415-2cc933f5953b/kube-rbac-proxy/0.log"
Apr 21 16:38:00.650556 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:00.650485 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e59172a8-fd83-4930-9415-2cc933f5953b/kube-rbac-proxy-metric/0.log"
Apr 21 16:38:00.694764 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:00.694687 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e59172a8-fd83-4930-9415-2cc933f5953b/prom-label-proxy/0.log"
Apr 21 16:38:00.733844 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:00.733814 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e59172a8-fd83-4930-9415-2cc933f5953b/init-config-reloader/0.log"
Apr 21 16:38:00.952559 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:00.952484 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5bfd5c857b-fhtvj_f90b4ba6-859d-43df-8063-4b7311a0faaa/metrics-server/0.log"
Apr 21 16:38:01.022014 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:01.021978 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jbm6m_72dfee7f-8b16-43e2-860b-0ef8b4a63261/node-exporter/0.log"
Apr 21 16:38:01.048562 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:01.048535 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jbm6m_72dfee7f-8b16-43e2-860b-0ef8b4a63261/kube-rbac-proxy/0.log"
Apr 21 16:38:01.074744 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:01.074706 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jbm6m_72dfee7f-8b16-43e2-860b-0ef8b4a63261/init-textfile/0.log"
Apr 21 16:38:01.340338 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:01.340280 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491dc9e-c581-4b95-a000-22e7c59757a7/prometheus/0.log"
Apr 21 16:38:01.371099 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:01.371073 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491dc9e-c581-4b95-a000-22e7c59757a7/config-reloader/0.log"
Apr 21 16:38:01.394713 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:01.394680 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491dc9e-c581-4b95-a000-22e7c59757a7/thanos-sidecar/0.log"
Apr 21 16:38:01.417651 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:01.417624 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491dc9e-c581-4b95-a000-22e7c59757a7/kube-rbac-proxy-web/0.log"
Apr 21 16:38:01.437491 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:01.437455 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491dc9e-c581-4b95-a000-22e7c59757a7/kube-rbac-proxy/0.log"
Apr 21 16:38:01.458257 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:01.458232 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491dc9e-c581-4b95-a000-22e7c59757a7/kube-rbac-proxy-thanos/0.log"
Apr 21 16:38:01.480744 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:01.480716 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e491dc9e-c581-4b95-a000-22e7c59757a7/init-config-reloader/0.log"
Apr 21 16:38:01.556462 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:01.556433 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-8b82d_a2c7f21d-b221-4b60-8736-1cf4fb90d7eb/prometheus-operator-admission-webhook/0.log"
Apr 21 16:38:02.148804 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.148357 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"]
Apr 21 16:38:02.155462 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.155433 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.161342 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.161311 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"]
Apr 21 16:38:02.233280 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.233235 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-podres\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.233454 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.233297 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-lib-modules\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.233454 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.233369 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5pwj\" (UniqueName: \"kubernetes.io/projected/4b6c45dd-2934-4aed-b300-17a583abee68-kube-api-access-f5pwj\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.233454 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.233391 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-sys\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.233454 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.233406 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-proc\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.333931 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.333890 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5pwj\" (UniqueName: \"kubernetes.io/projected/4b6c45dd-2934-4aed-b300-17a583abee68-kube-api-access-f5pwj\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.333931 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.333933 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-sys\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.334182 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.333953 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-proc\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.334182 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.333983 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-podres\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.334182 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.334029 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-lib-modules\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.334182 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.334048 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-sys\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.334182 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.334058 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-proc\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.334182 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.334170 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-lib-modules\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.334485 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.334185 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b6c45dd-2934-4aed-b300-17a583abee68-podres\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.345364 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.345335 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5pwj\" (UniqueName: \"kubernetes.io/projected/4b6c45dd-2934-4aed-b300-17a583abee68-kube-api-access-f5pwj\") pod \"perf-node-gather-daemonset-cv4nd\" (UID: \"4b6c45dd-2934-4aed-b300-17a583abee68\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.469170 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.469090 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:02.627181 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:02.627148 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"]
Apr 21 16:38:02.630318 ip-10-0-142-158 kubenswrapper[2562]: W0421 16:38:02.630261 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4b6c45dd_2934_4aed_b300_17a583abee68.slice/crio-edead11c3142eedbe14ddc864e007e8db809a662ed6390c23495bcab878af7c4 WatchSource:0}: Error finding container edead11c3142eedbe14ddc864e007e8db809a662ed6390c23495bcab878af7c4: Status 404 returned error can't find the container with id edead11c3142eedbe14ddc864e007e8db809a662ed6390c23495bcab878af7c4
Apr 21 16:38:03.053437 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:03.053342 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd" event={"ID":"4b6c45dd-2934-4aed-b300-17a583abee68","Type":"ContainerStarted","Data":"30e366ee2ff8778089624e06d5de5d8dd5b09a9fc96f7a93e5903b5817c3ec66"}
Apr 21 16:38:03.053437 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:03.053389 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd" event={"ID":"4b6c45dd-2934-4aed-b300-17a583abee68","Type":"ContainerStarted","Data":"edead11c3142eedbe14ddc864e007e8db809a662ed6390c23495bcab878af7c4"}
Apr 21 16:38:03.053666 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:03.053546 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:03.088497 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:03.088453 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd" podStartSLOduration=1.088437958 podStartE2EDuration="1.088437958s" podCreationTimestamp="2026-04-21 16:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:38:03.086473053 +0000 UTC m=+2174.831410859" watchObservedRunningTime="2026-04-21 16:38:03.088437958 +0000 UTC m=+2174.833375763"
Apr 21 16:38:03.514174 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:03.514144 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/1.log"
Apr 21 16:38:03.523352 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:03.523321 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hg85z_d142be23-d04a-4d93-a53c-ca2d3e8cd743/console-operator/2.log"
Apr 21 16:38:04.035795 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:04.035758 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-sstpk_05d9b5c2-543c-4bc0-a92a-c8433467bc7a/download-server/0.log"
Apr 21 16:38:04.570744 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:04.570713 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-qwrxl_f1504ffc-d02c-419c-92a1-e0f7dbab1932/volume-data-source-validator/0.log"
Apr 21 16:38:05.393972 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:05.393935 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p8j5k_50caee65-e2ab-4233-a2b5-e5ea4a951bed/dns/0.log"
Apr 21 16:38:05.414954 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:05.414926 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p8j5k_50caee65-e2ab-4233-a2b5-e5ea4a951bed/kube-rbac-proxy/0.log"
Apr 21 16:38:05.435364 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:05.435342 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4dmnb_1a05e8cf-847c-48cc-802b-171bcb5dea76/dns-node-resolver/0.log"
Apr 21 16:38:05.947576 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:05.947546 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-d5497b59c-rnrbh_a7cca86f-2296-47d4-9cc4-403c90fded3d/registry/0.log"
Apr 21 16:38:05.985316 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:05.985290 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-chrww_511124f1-f198-4d6c-9713-d6f1375957e5/node-ca/0.log"
Apr 21 16:38:07.059956 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:07.059924 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5598cc66fd-wwcdm_64d99728-f189-4ea9-9c45-de36d549fbae/kube-auth-proxy/0.log"
Apr 21 16:38:07.665537 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:07.665505 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-lxgkn_8648a8db-b8ad-409e-ae80-85c058398baf/serve-healthcheck-canary/0.log"
Apr 21 16:38:08.121622 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:08.121598 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-r7jtg_be01fbb6-f686-41d2-aaa3-1abd80d94c27/insights-operator/0.log"
Apr 21 16:38:08.122010 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:08.121675 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-r7jtg_be01fbb6-f686-41d2-aaa3-1abd80d94c27/insights-operator/1.log"
Apr 21 16:38:08.138879 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:08.138860 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h4qvj_d84cf002-59f6-43e4-991d-f3cae3707de3/kube-rbac-proxy/0.log"
Apr 21 16:38:08.157406 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:08.157386 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h4qvj_d84cf002-59f6-43e4-991d-f3cae3707de3/exporter/0.log"
Apr 21 16:38:08.175793 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:08.175763 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-h4qvj_d84cf002-59f6-43e4-991d-f3cae3707de3/extractor/0.log"
Apr 21 16:38:09.069946 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:09.069918 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-cv4nd"
Apr 21 16:38:10.298706 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:10.298671 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7fc6cd87c-8c8v4_60cc832e-6b90-4416-baea-170c655b093c/maas-api/0.log"
Apr 21 16:38:10.434535 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:10.434508 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-774f54dc87-59rrc_799c2b2f-4eab-4074-8c0f-e8e634e28877/manager/0.log"
Apr 21 16:38:12.030514 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:12.030484 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-d98d6987c-jlt6w_b6ae7e5a-34a0-497a-b449-df1e47f2575b/manager/0.log"
Apr 21 16:38:16.494835 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:16.494805 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bqg6h_72e648ce-5738-4b50-b6fe-add8dfdcb823/migrator/0.log"
Apr 21 16:38:16.521738 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:16.521714 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-bqg6h_72e648ce-5738-4b50-b6fe-add8dfdcb823/graceful-termination/0.log"
Apr 21 16:38:16.878630 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:16.878554 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-994j7_c5d3a65e-5e28-4860-a01a-277b576a947b/kube-storage-version-migrator-operator/1.log"
Apr 21 16:38:16.879412 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:16.879391 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-994j7_c5d3a65e-5e28-4860-a01a-277b576a947b/kube-storage-version-migrator-operator/0.log"
Apr 21 16:38:18.084650 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:18.084624 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bskpq_265548b5-1968-424e-850b-1b95c8e7798f/kube-multus-additional-cni-plugins/0.log"
Apr 21 16:38:18.106864 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:18.106839 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bskpq_265548b5-1968-424e-850b-1b95c8e7798f/egress-router-binary-copy/0.log"
Apr 21 16:38:18.129750 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:18.129729 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bskpq_265548b5-1968-424e-850b-1b95c8e7798f/cni-plugins/0.log"
Apr 21 16:38:18.154042 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:18.154015 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bskpq_265548b5-1968-424e-850b-1b95c8e7798f/bond-cni-plugin/0.log"
Apr 21 16:38:18.178360 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:18.178341 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bskpq_265548b5-1968-424e-850b-1b95c8e7798f/routeoverride-cni/0.log"
Apr 21 16:38:18.204414 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:18.204389 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bskpq_265548b5-1968-424e-850b-1b95c8e7798f/whereabouts-cni-bincopy/0.log"
Apr 21 16:38:18.235755 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:18.235731 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bskpq_265548b5-1968-424e-850b-1b95c8e7798f/whereabouts-cni/0.log"
Apr 21 16:38:18.446987 ip-10-0-142-158 kubenswrapper[2562]: I0421 16:38:18.446912 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fm4kz_c8f9fcd0-5378-4da1-a89a-2ffad35fe389/kube-multus/0.log"