Artur Hefczyc opened 1 decade ago
There is a new ad-hoc command "Load Test" in PubSub.
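For anyone who wants to try it: ad-hoc commands are triggered with a XEP-0050 execute request sent to the component. A minimal sketch, assuming the component runs at pubsub.example.com and the command node is named "load-test" (both illustrative; check service discovery for the real values):

    <iq type='set' to='pubsub.example.com' id='lt1'>
      <command xmlns='http://jabber.org/protocol/commands'
               node='load-test'
               action='execute'/>
    </iq>

The component then returns a XEP-0004 data form in which the test parameters can be filled in.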
Thank you, Bartek. Eric, please check this out and let us know if there are any problems.
Sending to node "generated-10-0" using default values produces the following:
When I change the frequency to 10000, the following happens:
Sending to node "generated-2000-0" using default values produces the following:
Fixed in tigase-utils.
Bartek: does this look right to you? I've just run 4 tests with increasing publishing frequency (msg/min). Not sure if this is causing a problem:
All these tests were run with time: 60. The frequencies were 60, 120, 240, and 100. The packet count seems to be getting smaller instead of growing as expected. Attached a screenshot of the packets-sent window.
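For reference, if the frequency is interpreted as messages per minute, the publisher presumably derives the gap between publications along these lines (an illustrative fragment, not the actual repository code):

    // illustrative only: how a per-minute frequency maps to a publish delay
    long freqPerMinute = 240;
    long delayMs = 60_000L / freqPerMinute; // 250 ms at 240 msg/min
    // pitfall: integer division truncates, so for very high frequencies
    // the delay collapses to 0 ms and the requested rate is ignored

If the tests really scale with frequency, doubling it should roughly double the packet count over a fixed 60-second run.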
Fixed and committed to the pubsub repository.
Bartek: Changing the frequency doesn't seem to have any effect on the tests. I ran 4 tests with frequencies 60, 600, 6000, and 6000000. The data looked similar on each run; I've attached a screenshot. I also ran two separate tests at 60 and 6000000 with truncated logfiles, and there isn't a noticeable difference in what was logged.
Attaching the final two statistics logs for these runs (frequency 60 and 6000000).
Hi Bartek. Not sure if you saw my XMPP messages. The new code seems to be ignoring the rate set in the variables. I set time 60, frequency 2200, size 1024, sending to 2000 offline subscribers. I've tried tests with and without ticking the non-blocking box. In both tests I was delivered far more than the requested 2200 items; the last test sent me over 20,000 items before Psi locked up. The packet count went to 170,000/sec and remained there for 20 minutes.

pubsub/Last second packets 0
pubsub/Last minute packets 0
pubsub/Last hour packets 0
pubsub/Processing threads 48
pubsub/Packets received 204046
pubsub/Packets sent 203305958
pubsub/In queue wait: SYSTEM 0
pubsub/In queue wait: CLUSTER 0
pubsub/In queue wait: HIGH 0
pubsub/In queue wait: NORMAL 0
pubsub/In queue wait: LOW 0
pubsub/In queue wait: PRESENCE 0
pubsub/In queue wait: LOWEST 0
pubsub/Out queue wait: SYSTEM 0
pubsub/Out queue wait: CLUSTER 0
pubsub/Out queue wait: HIGH 0
pubsub/Out queue wait: NORMAL 0
pubsub/Out queue wait: LOW 0
pubsub/Out queue wait: PRESENCE 0
pubsub/Out queue wait: LOWEST 0
pubsub/Total In queues wait 0
pubsub/Total Out queues wait 0
pubsub/Total queues wait 0
pubsub/Max queue size 104352
pubsub/IN Queue overflow 0
pubsub/OUT Queue overflow 0
pubsub/Total queues overflow 0
pubsub/Average processing time on last 100 runs [ms] 0
pubsub/IN_QUEUE processed messages 0
pubsub/IN_QUEUE processed presences 2
pubsub/IN_QUEUE processed cluster 0
pubsub/IN_QUEUE processed other 204039
pubsub/IN_QUEUE processed IQ no XMLNS 0
pubsub/IN_QUEUE processed IQ http://jabber.org/protocol/disco#items 1
pubsub/IN_QUEUE processed IQ command 2
pubsub/IN_QUEUE processed IQ pubsub 204042
pubsub/IN_QUEUE processed total IQ 204045
pubsub/OUT_QUEUE processed messages 202966367
pubsub/OUT_QUEUE processed presences 1
pubsub/OUT_QUEUE processed cluster 0
pubsub/OUT_QUEUE processed other 204005
pubsub/OUT_QUEUE processed IQ no XMLNS 0
pubsub/OUT_QUEUE processed IQ http://jabber.org/protocol/disco#items 1
pubsub/OUT_QUEUE processed IQ command 2
pubsub/OUT_QUEUE processed IQ pubsub 204042
pubsub/OUT_QUEUE processed total IQ 204045
pubsub/Processed packets thread: in_0-pubsub 4275
pubsub/Processed packets thread: in_1-pubsub 4278
pubsub/Processed packets thread: in_2-pubsub 4173
pubsub/Processed packets thread: in_3-pubsub 4199
pubsub/Processed packets thread: in_4-pubsub 4383
pubsub/Processed packets thread: in_5-pubsub 4243
pubsub/Processed packets thread: in_6-pubsub 4285
pubsub/Processed packets thread: in_7-pubsub 4139
pubsub/Processed packets thread: in_8-pubsub 4227
pubsub/Processed packets thread: in_9-pubsub 4333
pubsub/Processed packets thread: in_10-pubsub 4367
pubsub/Processed packets thread: in_11-pubsub 4293
pubsub/Processed packets thread: in_12-pubsub 4254
pubsub/Processed packets thread: in_13-pubsub 4305
pubsub/Processed packets thread: in_14-pubsub 4306
pubsub/Processed packets thread: in_15-pubsub 4202
pubsub/Processed packets thread: in_16-pubsub 4334
pubsub/Processed packets thread: in_17-pubsub 4238
pubsub/Processed packets thread: in_18-pubsub 4271
pubsub/Processed packets thread: in_19-pubsub 4330
pubsub/Processed packets thread: in_20-pubsub 4231
pubsub/Processed packets thread: in_21-pubsub 4110
pubsub/Processed packets thread: in_22-pubsub 4222
pubsub/Processed packets thread: in_23-pubsub 4236
pubsub/Processed packets thread: in_24-pubsub 4145
pubsub/Processed packets thread: in_25-pubsub 4272
pubsub/Processed packets thread: in_26-pubsub 4217
pubsub/Processed packets thread: in_27-pubsub 4373
pubsub/Processed packets thread: in_28-pubsub 4178
pubsub/Processed packets thread: in_29-pubsub 4294
pubsub/Processed packets thread: in_30-pubsub 4270
pubsub/Processed packets thread: in_31-pubsub 4259
pubsub/Processed packets thread: in_32-pubsub 4207
pubsub/Processed packets thread: in_33-pubsub 4174
pubsub/Processed packets thread: in_34-pubsub 4161
pubsub/Processed packets thread: in_35-pubsub 4318
pubsub/Processed packets thread: in_36-pubsub 4192
pubsub/Processed packets thread: in_37-pubsub 4242
pubsub/Processed packets thread: in_38-pubsub 4237
pubsub/Processed packets thread: in_39-pubsub 4211
pubsub/Processed packets thread: in_40-pubsub 4347
pubsub/Processed packets thread: in_41-pubsub 4337
pubsub/Processed packets thread: in_42-pubsub 4221
pubsub/Processed packets thread: in_43-pubsub 4308
pubsub/Processed packets thread: in_44-pubsub 4253
pubsub/Processed packets thread: in_45-pubsub 4216
pubsub/Processed packets thread: in_46-pubsub 4156
pubsub/Processed packets thread: in_47-pubsub 4225
pubsub/Processed packets thread: out_0-pubsub 4256776
pubsub/Processed packets thread: out_1-pubsub 4256739
pubsub/Processed packets thread: out_2-pubsub 4256764
pubsub/Processed packets thread: out_3-pubsub 4255522
pubsub/Processed packets thread: out_4-pubsub 4258403
pubsub/Processed packets thread: out_5-pubsub 4257707
pubsub/Processed packets thread: out_6-pubsub 4254324
pubsub/Processed packets thread: out_7-pubsub 4258250
pubsub/Processed packets thread: out_8-pubsub 4257112
pubsub/Processed packets thread: out_9-pubsub 4257199
pubsub/Processed packets thread: out_10-pubsub 4258413
pubsub/Processed packets thread: out_11-pubsub 4257751
pubsub/Processed packets thread: out_12-pubsub 4256171
pubsub/Processed packets thread: out_13-pubsub 4259389
pubsub/Processed packets thread: out_14-pubsub 4253505
pubsub/Processed packets thread: out_15-pubsub 4255228
pubsub/Processed packets thread: out_16-pubsub 4261113
pubsub/Processed packets thread: out_17-pubsub 4256117
pubsub/Processed packets thread: out_18-pubsub 4254698
pubsub/Processed packets thread: out_19-pubsub 4257750
pubsub/Processed packets thread: out_20-pubsub 4259076
pubsub/Processed packets thread: out_21-pubsub 4259163
pubsub/Processed packets thread: out_22-pubsub 4258605
pubsub/Processed packets thread: out_23-pubsub 4256402
pubsub/Processed packets thread: out_24-pubsub 4257894
pubsub/Processed packets thread: out_25-pubsub 4258782
pubsub/Processed packets thread: out_26-pubsub 4253540
pubsub/Processed packets thread: out_27-pubsub 4256721
pubsub/Processed packets thread: out_28-pubsub 4258271
pubsub/Processed packets thread: out_29-pubsub 4256795
pubsub/Processed packets thread: out_30-pubsub 4254666
pubsub/Processed packets thread: out_31-pubsub 4261489
pubsub/Processed packets thread: out_32-pubsub 4260834
pubsub/Processed packets thread: out_33-pubsub 4259946
pubsub/Processed packets thread: out_34-pubsub 4255380
pubsub/Processed packets thread: out_35-pubsub 4255495
pubsub/Processed packets thread: out_36-pubsub 4257179
pubsub/Processed packets thread: out_37-pubsub 4257184
pubsub/Processed packets thread: out_38-pubsub 4256508
pubsub/Processed packets thread: out_39-pubsub 4258539
pubsub/Processed packets thread: out_40-pubsub 4256667
pubsub/Processed packets thread: out_41-pubsub 4255784
pubsub/Processed packets thread: out_42-pubsub 4260476
pubsub/Processed packets thread: out_43-pubsub 4260423
pubsub/Processed packets thread: out_44-pubsub 4257343
pubsub/Processed packets thread: out_45-pubsub 4252923
pubsub/Processed packets thread: out_46-pubsub 4256271
pubsub/Processed packets thread: out_47-pubsub 4256780
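For scale, assuming the frequency setting was meant to cap this run at the requested 2200 items, each item fans out to all 2000 subscribers, so:

    expected messages = 2200 items x 2000 subscribers = 4,400,000
    observed          = 202,966,367 OUT_QUEUE messages, roughly 46x that

which matches the impression that the configured rate is being ignored.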
I just checked the code and it looks good. I added some logging to check which parameters LoadTest is started with and how many items it publishes.
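Something along these lines (an illustrative sketch with java.util.logging; the class and variable names are hypothetical, not the exact statements committed):

    import java.util.logging.Level;
    import java.util.logging.Logger;

    class LoadTestLogDemo { // hypothetical name, for illustration only
        private static final Logger log = Logger.getLogger(LoadTestLogDemo.class.getName());

        // log the parameters the test actually starts with
        static void logStart(long time, long frequency, int size, String nodeName) {
            log.log(Level.INFO, "LoadTest started: time={0}s, frequency={1}, size={2}B, node={3}",
                    new Object[] { time, frequency, size, nodeName });
        }

        // log how many items were really published
        static void logEnd(long publishedCount) {
            log.log(Level.INFO, "LoadTest finished: published {0} items", publishedCount);
        }
    }

Comparing the logged parameters with the requested ones should show whether the form values actually reach the publisher.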
Here I ran the tests using 50, 100, 150, and 200 per second. Next I wanted to find the maximum, so I used 500, 1000, and 1500 per second. Here are the final stats from these last 3 tests.
Here's the test again with tweaked init.properties: 500, 1000, and 1500 per second.
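The exact tweaks aren't quoted here; the throughput-related knob usually touched for this kind of test in an init.properties of that era is the queue cap, e.g. (illustrative value, not necessarily the one used):

    --max-queue-size = 100000

A larger queue trades memory for fewer dropped packets while the out queues drain.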
From the comments I gather this is completed now. I am closing the ticket and moving it to the PubSub project.
Type: Task
Priority: Blocker
Assignee:
RedmineID: 2289
Spent time: 6h
We need PubSub publisher code inside Tigase:
- The code is activated using an ad-hoc command which allows you to set the publishing frequency, the test duration, the published message size, and the PubSub node ID.
- The code generating new publications cannot run within the ad-hoc command itself, as Groovy is too slow.
- The publishing code can be implemented either inside PubSub or in some other (Test?) component; a minimal sketch of the pacing loop follows this list.
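A stand-alone sketch of that pacing loop, assuming a per-second frequency and a plain ScheduledExecutorService (class and method names are illustrative; the real code would hand each item to the PubSub component instead of printing):

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class LoadTestPublisherSketch {
        public static void main(String[] args) {
            long frequencyPerSecond = 100; // test parameter: publications per second
            long testTimeSeconds = 60;     // test parameter: duration of the run
            int messageSize = 1024;        // test parameter: payload size in bytes
            String payload = "X".repeat(messageSize);

            ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
            // use microseconds so high frequencies do not truncate to a 0 ms period
            long periodMicros = 1_000_000L / frequencyPerSecond;

            scheduler.scheduleAtFixedRate(
                    () -> publishItem(payload), 0, periodMicros, TimeUnit.MICROSECONDS);
            // stop the whole test after the configured time
            scheduler.schedule(scheduler::shutdownNow, testTimeSeconds, TimeUnit.SECONDS);
        }

        private static void publishItem(String payload) {
            // in the real component: build a pubsub item and publish it to the node
            System.out.println("publish " + payload.length() + " bytes");
        }
    }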
Attachments:
2014-09-24_161241_1_PubSub Packets Received-Sent, Total_ 0, Max_ 107.png
2014-09-28_203123_1_PubSub Packets Received-Sent, Total_ 0, Max_ 769.png
frequency60.log
frequency6000000.log
2014-09-30_111816_1_PubSub Packets Received-Sent, Total_ 0, Max_ 176906.png
pubsub-a.png
pubsub-b.png
pubsub-c.png