intapiuser
Community Team Member
If you have complex models in the system, you may sometimes experience widgets not loading because of an error:

What is the throttling queue?

The throttling queue is a queue in the query service that prevents bombarding EC/Live with too many queries; queries wait in the queue for their turn to execute.

What is the size of this queue?

The default size of the queue is 400 queries, and the queue is managed per data source. The default query execution concurrency is 8 queries per EC and 20 queries per Live.

Why do widgets fail with this message?

If the throttling queue grows, it means the database (EC or Live) is taking a long time to execute queries. When the number of queued queries reaches the threshold set in:
get /Sisense/S1/configuration/production/query/ThrottlingQueueSize.value
the error message shows up, the widget fails to load, and no more queries are added to the queue.
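The behavior described above can be sketched in a few lines of Python. This is a hypothetical illustration only, not the actual query-service implementation: a bounded waiting queue (default 400) plus an execution-concurrency semaphore (default 8 for an EC), where a query arriving at a full queue is rejected immediately, which is what the widget error corresponds to.

```python
import threading

class ThrottlingManager:
    """Illustrative sketch of the throttling queue: a bounded waiting
    queue plus an execution-concurrency limit. Defaults mirror the
    article: queue size 400, concurrency 8 (EC)."""

    def __init__(self, queue_size=400, concurrency=8):
        self._waiting = threading.BoundedSemaphore(queue_size)  # room in the queue
        self._slots = threading.BoundedSemaphore(concurrency)   # execution slots

    def execute(self, query_fn):
        # A query first needs a place in the queue; if the queue is
        # already full it is rejected right away (the widget error).
        if not self._waiting.acquire(blocking=False):
            raise RuntimeError("throttling queue is full")
        try:
            self._slots.acquire()         # wait for an execution slot
            try:
                return query_fn()         # run against EC/Live
            finally:
                self._slots.release()     # execution finished
        finally:
            self._waiting.release()       # leave the queue
```

With `queue_size=1`, a second query arriving while the single queue slot is held would raise `RuntimeError` instead of waiting, mirroring the widget failure.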

How do we see how many messages are in the queue?

  1. We can enable debug messages for the specific class (com.sisense.query.throttling):
si loggers set -service query -level DEBUG -logger com.sisense.query.throttling
We will then see messages such as:
BE#479095 Entering ThrottlingManager::acquire, id [{}], throttlingConcurrency [{}].
BE#846418 acquiring semaphore for execution
BE#735442 acquired semaphore
BE#250246 releasing semaphore
The semaphore in this case acts as the queue. Any arriving query enters the queue ("acquiring semaphore"), is sent for execution once a slot is free ("acquired semaphore"), and stays in the queue until it finishes executing and is released from the queue ("releasing semaphore").
2. We can also see it in the query log (the concurrentQuery and throttlingTimeWaiting parameters):
{"Log_Type":"Structured","Log_Version":"2.0","Log_Message":"FinishQuery","Log_Level":"INFO","Log_Component":"query","Log_DateTime":"2020-11-04T14:39:18.156203","Log_Thread":"async41","appTypeName":"query","Log_TransactionID":"f186705c-c8a8-424b-8ca9-440ef7fbba75","LogMessage":"FinishQuery","JAQL Code":"AA962-648F-E3C1-F9D9-04B4-13C8-10AD-D8E6-9","JAQL Text":"{\"metadata\":[{\"jaql\":{\"agg\":\"sum\",\"datatype\":\"numeric\",\"column\":\"Accepted Payment Tolerance\",\"dim\":\"[New Table Query.Accepted Payment Tolerance]\",\"title\":\"Total Accepted Payment Tolerance\",\"table\":\"New Table Query\"},\"format\":{\"color\":{\"color\":\"#00cee6\",\"type\":\"color\"},\"mask\":{\"decimals\":\"auto\",\"separated\":true,\"isdefault\":true,\"type\":\"number\",\"abbreviations\":{\"b\":true,\"t\":true,\"k\":false,\"m\":true}}},\"source\":\"value\"}],\"widget\":\"5f8eb463b6d4f3002f874ef4;\",\"offset\":0,\"datasource\":{\"fullname\":\"live:Mysql2\",\"id\":\"live:Mysql2\",\"title\":\"Mysql2\",\"live\":true},\"m2mThresholdFlag\":0,\"queryGuid\":\"AA962-648F-E3C1-F9D9-04B4-13C8-10AD-D8E69\",\"isMaskedResult\":true,\"format\":\"json\",\"count\":50000,\"dashboard\":\"5f8eb454b6d4f3002f874ef1;Mysql2\"}","status":"success","widget":"5f8eb463b6d4f3002f874ef4;","dashboard":"5f8eb454b6d4f3002f874ef1;Mysql2","queryGuid":"AA962-648F-E3C1-F9D9-04B4-13C8-10AD-D8E6-9","startQueryTimeStamp":"2020-11-04T14:39:16+0000","endQueryTimeStamp":"2020-11-04T14:39:18+0000","queryResponseLengthBytes":565,"numberOfColumns":1,"numberOfRows":1,"duration":1.872,"queryType":"jaql","cubeName":"Mysql2","cubeID":"Mysql2","action":"finishQueryJaql","userName":"5f687c9e13cc6b001a90ee2e","querySource":"Live","querySourceLiveProvider":"sql","throttlingTimeWaiting":0.044,"concurrentQuery":1,"fromCache":false,"translationServiceProvider":"TranslationService","connectorArch":"new","translationAttempts":1}

How can we resolve the issue?

A few possibilities:
  • Review the queries in the query logs, paying attention to very large queries and queries that take a long time to respond.
  • Check whether the Data Groups CPU and RAM limits are too low for the ElastiCube. If they are, the ElastiCube cannot process all requests, and queries pile up in the throttling queue until they reach the configured limit.
  • Review the data model. Do you have a complex model? Are you joining tables with many columns?
Last update: 03-02-2023 08:53 AM