PubSub publisher code (#32)
Artur Hefczyc opened 1 decade ago
Due Date
2016-04-03

We need pubsub publisher code inside Tigase:

  1. The code is activated using an ad-hoc command which allows you to set the publishing frequency, test duration, published message size, and PubSub node ID

  2. The code generating new publications cannot run within the ad-hoc command itself, as Groovy is too slow

  3. The publishing code can be implemented either inside PubSub or some other (Test?) component
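The requirements above can be sketched as a plain Java worker thread: the ad-hoc command collects the frequency (msg/min), test duration, payload size, and node ID, then hands them to compiled Java code (per point 2, the loop itself must not run in Groovy). This is a minimal sketch under those assumptions; all names are illustrative, not the actual Tigase API.

```java
import java.util.Arrays;

public class LoadTestSketch {

    // Inter-publish delay in ms, derived from the frequency; clamped so a
    // very large frequency never yields a zero or negative sleep value.
    static long delayMs(long frequencyPerMin) {
        return Math.max(1, 60_000L / Math.max(1, frequencyPerMin));
    }

    static void runLoadTest(long frequencyPerMin, long testTimeSec,
                            int payloadSize, String nodeId) throws InterruptedException {
        long deadline = System.currentTimeMillis() + testTimeSec * 1000L;
        char[] payload = new char[payloadSize];
        Arrays.fill(payload, 'x');

        long published = 0;
        while (System.currentTimeMillis() < deadline) {
            publish(nodeId, new String(payload)); // stand-in for the real publish call
            published++;
            Thread.sleep(delayMs(frequencyPerMin));
        }
        System.out.println("published " + published + " items to " + nodeId);
    }

    static void publish(String nodeId, String payload) {
        // placeholder: real code would build a pubsub publish packet and route it
    }

    public static void main(String[] args) throws InterruptedException {
        runLoadTest(600, 1, 64, "generated-10-0"); // ~10 items over a 1 s run
    }
}
```

The key design point is that the delay is computed once from the frequency and clamped to at least 1 ms, so the generator degrades to "as fast as possible" instead of failing on extreme inputs.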

Attachments:
  2014-09-24_161241_1_PubSub Packets Received-Sent, Total_ 0, Max_ 107.png
  2014-09-28_203123_1_PubSub Packets Received-Sent, Total_ 0, Max_ 769.png
  frequency60.log
  frequency6000000.log
  2014-09-30_111816_1_PubSub Packets Received-Sent, Total_ 0, Max_ 176906.png
  pubsub-a.png
  pubsub-b.png
  pubsub-c.png

  • Bartosz Małkowski commented 1 decade ago

    There is a new ad-hoc command "Load Test" in PubSub.

  • Artur Hefczyc commented 1 decade ago

    Thank you Bartek.

    Eric, please check this out and let us know if there are any problems.

  • Eric Dziewa commented 1 decade ago

    Sending to node "generated-10-0" using default values produces the following:

    java.lang.NullPointerException: Neither attribute key or value can be set to null. Attribute: type, value: null.
    
            at tigase.xml.Element$XMLIdentityHashMap.put(Element.java:1309)
            at tigase.xml.Element.setAttribute(Element.java:1026)
            at tigase.form.Form.getElement(Form.java:309)
            at tigase.pubsub.modules.commands.LoadTestCommand.execute(LoadTestCommand.java:73)
            at tigase.adhoc.AdHocCommandManager.process(AdHocCommandManager.java:91)
            at tigase.pubsub.modules.AdHocConfigCommandModule.process(AdHocConfigCommandModule.java:88)
            at tigase.component2.modules.ModulesManager.process(ModulesManager.java:87)
            at tigase.component2.AbstractComponent.processStanzaPacket(AbstractComponent.java:209)
            at tigase.component2.AbstractComponent.processPacket(AbstractComponent.java:197)
            at tigase.pubsub.PubSubComponent.processPacket(PubSubComponent.java:637)
            at tigase.server.AbstractMessageReceiver$QueueListener.run(AbstractMessageReceiver.java:1490)
    

    When I change the frequency to value 10000 the following happens:

    java.lang.NullPointerException: Neither attribute key or value can be set to null. Attribute: type, value: null.
    
            at tigase.xml.Element$XMLIdentityHashMap.put(Element.java:1309)
            at tigase.xml.Element.setAttribute(Element.java:1026)
            at tigase.form.Form.getElement(Form.java:309)
            at tigase.pubsub.modules.commands.LoadTestCommand.execute(LoadTestCommand.java:73)
            at tigase.adhoc.AdHocCommandManager.process(AdHocCommandManager.java:91)
            at tigase.pubsub.modules.AdHocConfigCommandModule.process(AdHocConfigCommandModule.java:88)
            at tigase.component2.modules.ModulesManager.process(ModulesManager.java:87)
            at tigase.component2.AbstractComponent.processStanzaPacket(AbstractComponent.java:209)
            at tigase.component2.AbstractComponent.processPacket(AbstractComponent.java:197)
            at tigase.pubsub.PubSubComponent.processPacket(PubSubComponent.java:637)
            at tigase.server.AbstractMessageReceiver$QueueListener.run(AbstractMessageReceiver.java:1490)
    2014-09-24 11:17:36.226 [Thread-1710]      LoadTestGenerator.run()            WARNING:  LoadTest generator stopped
    java.lang.IllegalArgumentException: timeout value is negative
            at java.lang.Thread.sleep(Native Method)
            at tigase.pubsub.modules.commands.LoadTestGenerator.run(LoadTestGenerator.java:70)
            at java.lang.Thread.run(Thread.java:745)
    

    Sending to node "generated-2000-0" using default values produces the following:

    java.lang.NullPointerException: Neither attribute key or value can be set to null. Attribute: type, value: null.
    
            at tigase.xml.Element$XMLIdentityHashMap.put(Element.java:1309)
            at tigase.xml.Element.setAttribute(Element.java:1026)
            at tigase.form.Form.getElement(Form.java:309)
            at tigase.pubsub.modules.commands.LoadTestCommand.execute(LoadTestCommand.java:73)
            at tigase.adhoc.AdHocCommandManager.process(AdHocCommandManager.java:91)
            at tigase.pubsub.modules.AdHocConfigCommandModule.process(AdHocConfigCommandModule.java:88)
            at tigase.component2.modules.ModulesManager.process(ModulesManager.java:87)
            at tigase.component2.AbstractComponent.processStanzaPacket(AbstractComponent.java:209)
            at tigase.component2.AbstractComponent.processPacket(AbstractComponent.java:197)
            at tigase.pubsub.PubSubComponent.processPacket(PubSubComponent.java:637)
            at tigase.server.AbstractMessageReceiver$QueueListener.run(AbstractMessageReceiver.java:1490)
    2014-09-24 11:09:12.671 [Thread-1690]      LoadTestGenerator.run()            WARNING:  LoadTest generator stopped
    java.lang.IllegalArgumentException: timeout value is negative
            at java.lang.Thread.sleep(Native Method)
            at tigase.pubsub.modules.commands.LoadTestGenerator.run(LoadTestGenerator.java:70)
            at java.lang.Thread.run(Thread.java:745)
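The "timeout value is negative" failures above come from a documented property of `Thread.sleep()`: it throws `IllegalArgumentException` the moment its argument drops below zero, which kills the generator thread. A pause computed as interval minus elapsed time, without clamping, does exactly that whenever one iteration runs longer than the interval. The following is a minimal reproduction of the failure mode, not the actual LoadTestGenerator code.

```java
public class NegativeSleepDemo {
    public static void main(String[] args) throws InterruptedException {
        long intervalMs = 10; // e.g. implied by a high publishing frequency
        long elapsedMs = 25;  // one slow iteration
        long pause = intervalMs - elapsedMs; // -15

        try {
            Thread.sleep(pause); // throws: negative timeout
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage()); // "timeout value is negative"
        }

        // Defensive version: clamp first, so a slow iteration just skips
        // the pause instead of stopping the whole load test.
        Thread.sleep(Math.max(0, pause));
        System.out.println("generator keeps running");
    }
}
```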
    
  • Bartosz Małkowski commented 1 decade ago

    Fixed in tigase-utils.

  • Eric Dziewa commented 1 decade ago

    Bartek: does this look right to you?

    I've just run 4 tests with increasing publishing frequency (msg/min).

    Not sure if this is causing the problem:

    2014-09-24 15:55:52.029 [Thread-1464]      LoadTestGenerator.run()            WARNING:  LoadTest generator stopped
    java.lang.IllegalArgumentException: timeout value is negative
            at java.lang.Thread.sleep(Native Method)
            at tigase.pubsub.modules.commands.LoadTestGenerator.run(LoadTestGenerator.java:70)
            at java.lang.Thread.run(Thread.java:745)
    .
    2014-09-24 16:01:37.098 [Thread-1477]      LoadTestGenerator.run()            WARNING:  LoadTest generator stopped
    java.lang.IllegalArgumentException: timeout value is negative
            at java.lang.Thread.sleep(Native Method)
            at tigase.pubsub.modules.commands.LoadTestGenerator.run(LoadTestGenerator.java:70)
            at java.lang.Thread.run(Thread.java:745)
    .
    2014-09-24 16:04:53.947 [Thread-1485]      LoadTestGenerator.run()            WARNING:  LoadTest generator stopped
    java.lang.IllegalArgumentException: timeout value is negative
            at java.lang.Thread.sleep(Native Method)
            at tigase.pubsub.modules.commands.LoadTestGenerator.run(LoadTestGenerator.java:70)
            at java.lang.Thread.run(Thread.java:745)
    .
    2014-09-24 16:08:13.988 [Thread-1492]      LoadTestGenerator.run()            WARNING:  LoadTest generator stopped
    java.lang.IllegalArgumentException: timeout value is negative
            at java.lang.Thread.sleep(Native Method)
            at tigase.pubsub.modules.commands.LoadTestGenerator.run(LoadTestGenerator.java:70)
            at java.lang.Thread.run(Thread.java:745)
    

    All these tests were run with time: 60. The frequencies were 60, 120, 240, and 100. The packet count seems to be decreasing instead of growing as expected.

    Attached a screenshot of the packets-sent window.

  • Bartosz Małkowski commented 1 decade ago

    Fixed and committed to the pubsub repository.

  • Eric Dziewa commented 1 decade ago

    Bartek: Changing the frequency doesn't seem to have any effect on the tests. I ran 4 tests with frequencies 60, 600, 6000, and 6000000. The data looked similar on each run. I've attached a screenshot. I also ran two separate tests of 60 and 6000000 with truncated logfiles, and there isn't a noticeable difference in what was logged.

    60:
    total 48804
    -rw-rw-r-- 1 tigase tigase   101004 Sep 28 20:46 tigase-console.log
    -rw-rw-r-- 1 tigase tigase  9862068 Sep 28 20:42 tigase.log.0
    -rw-rw-r-- 1 tigase tigase 10000408 Sep 28 20:42 tigase.log.1
    -rw-rw-r-- 1 tigase tigase 10000113 Sep 28 20:42 tigase.log.2
    -rw-rw-r-- 1 tigase tigase 10000449 Sep 28 20:42 tigase.log.3
    -rw-rw-r-- 1 tigase tigase 10000983 Sep 28 20:42 tigase.log.4
    
    6000000:
    total 39824
    -rw-rw-r-- 1 tigase tigase   100978 Sep 28 20:55 tigase-console.log
    -rw-rw-r-- 1 tigase tigase   665969 Sep 28 20:51 tigase.log.0
    -rw-rw-r-- 1 tigase tigase 10000128 Sep 28 20:51 tigase.log.1
    -rw-rw-r-- 1 tigase tigase 10000315 Sep 28 20:51 tigase.log.2
    -rw-rw-r-- 1 tigase tigase 10000005 Sep 28 20:51 tigase.log.3
    -rw-rw-r-- 1 tigase tigase 10000141 Sep 28 20:51 tigase.log.4
    

    Attaching the final two statistics logs for these runs 60, and 6000000 frequency.

  • Eric Dziewa commented 1 decade ago

    Hi Bartek. Not sure if you saw my xmpp messages. The new code seems to be ignoring the rate set in the variables. I set time 60, Frequency 2200, Size 1024 sending to 2000 offline subscribers. I've tried tests with and without ticking the non-blocking box. In both tests I was delivered way more than the requested 2200 items. The last test sent me over 20,000 items before Psi locked up. Packet count went to 170,000/sec and remained there for 20 minutes.

    pubsub/Last second packets 0

    pubsub/Last minute packets 0

    pubsub/Last hour packets 0

    pubsub/Processing threads 48

    pubsub/Packets received 204046

    pubsub/Packets sent 203305958

    pubsub/In queue wait: SYSTEM 0

    pubsub/In queue wait: CLUSTER 0

    pubsub/In queue wait: HIGH 0

    pubsub/In queue wait: NORMAL 0

    pubsub/In queue wait: LOW 0

    pubsub/In queue wait: PRESENCE 0

    pubsub/In queue wait: LOWEST 0

    pubsub/Out queue wait: SYSTEM 0

    pubsub/Out queue wait: CLUSTER 0

    pubsub/Out queue wait: HIGH 0

    pubsub/Out queue wait: NORMAL 0

    pubsub/Out queue wait: LOW 0

    pubsub/Out queue wait: PRESENCE 0

    pubsub/Out queue wait: LOWEST 0

    pubsub/Total In queues wait 0

    pubsub/Total Out queues wait 0

    pubsub/Total queues wait 0

    pubsub/Max queue size 104352

    pubsub/IN Queue overflow 0

    pubsub/OUT Queue overflow 0

    pubsub/Total queues overflow 0

    pubsub/Average processing time on last 100 runs [ms] 0

    pubsub/IN_QUEUE processed messages 0

    pubsub/IN_QUEUE processed presences 2

    pubsub/IN_QUEUE processed cluster 0

    pubsub/IN_QUEUE processed other 204039

    pubsub/IN_QUEUE processed IQ no XMLNS 0

    pubsub/IN_QUEUE processed IQ http://jabber.org/protocol/disco#items 1

    pubsub/IN_QUEUE processed IQ command 2

    pubsub/IN_QUEUE processed IQ pubsub 204042

    pubsub/IN_QUEUE processed total IQ 204045

    pubsub/OUT_QUEUE processed messages 202966367

    pubsub/OUT_QUEUE processed presences 1

    pubsub/OUT_QUEUE processed cluster 0

    pubsub/OUT_QUEUE processed other 204005

    pubsub/OUT_QUEUE processed IQ no XMLNS 0

    pubsub/OUT_QUEUE processed IQ http://jabber.org/protocol/disco#items 1

    pubsub/OUT_QUEUE processed IQ command 2

    pubsub/OUT_QUEUE processed IQ pubsub 204042

    pubsub/OUT_QUEUE processed total IQ 204045

    pubsub/Processed packets thread: in_0-pubsub 4275

    pubsub/Processed packets thread: in_1-pubsub 4278

    pubsub/Processed packets thread: in_2-pubsub 4173

    pubsub/Processed packets thread: in_3-pubsub 4199

    pubsub/Processed packets thread: in_4-pubsub 4383

    pubsub/Processed packets thread: in_5-pubsub 4243

    pubsub/Processed packets thread: in_6-pubsub 4285

    pubsub/Processed packets thread: in_7-pubsub 4139

    pubsub/Processed packets thread: in_8-pubsub 4227

    pubsub/Processed packets thread: in_9-pubsub 4333

    pubsub/Processed packets thread: in_10-pubsub 4367

    pubsub/Processed packets thread: in_11-pubsub 4293

    pubsub/Processed packets thread: in_12-pubsub 4254

    pubsub/Processed packets thread: in_13-pubsub 4305

    pubsub/Processed packets thread: in_14-pubsub 4306

    pubsub/Processed packets thread: in_15-pubsub 4202

    pubsub/Processed packets thread: in_16-pubsub 4334

    pubsub/Processed packets thread: in_17-pubsub 4238

    pubsub/Processed packets thread: in_18-pubsub 4271

    pubsub/Processed packets thread: in_19-pubsub 4330

    pubsub/Processed packets thread: in_20-pubsub 4231

    pubsub/Processed packets thread: in_21-pubsub 4110

    pubsub/Processed packets thread: in_22-pubsub 4222

    pubsub/Processed packets thread: in_23-pubsub 4236

    pubsub/Processed packets thread: in_24-pubsub 4145

    pubsub/Processed packets thread: in_25-pubsub 4272

    pubsub/Processed packets thread: in_26-pubsub 4217

    pubsub/Processed packets thread: in_27-pubsub 4373

    pubsub/Processed packets thread: in_28-pubsub 4178

    pubsub/Processed packets thread: in_29-pubsub 4294

    pubsub/Processed packets thread: in_30-pubsub 4270

    pubsub/Processed packets thread: in_31-pubsub 4259

    pubsub/Processed packets thread: in_32-pubsub 4207

    pubsub/Processed packets thread: in_33-pubsub 4174

    pubsub/Processed packets thread: in_34-pubsub 4161

    pubsub/Processed packets thread: in_35-pubsub 4318

    pubsub/Processed packets thread: in_36-pubsub 4192

    pubsub/Processed packets thread: in_37-pubsub 4242

    pubsub/Processed packets thread: in_38-pubsub 4237

    pubsub/Processed packets thread: in_39-pubsub 4211

    pubsub/Processed packets thread: in_40-pubsub 4347

    pubsub/Processed packets thread: in_41-pubsub 4337

    pubsub/Processed packets thread: in_42-pubsub 4221

    pubsub/Processed packets thread: in_43-pubsub 4308

    pubsub/Processed packets thread: in_44-pubsub 4253

    pubsub/Processed packets thread: in_45-pubsub 4216

    pubsub/Processed packets thread: in_46-pubsub 4156

    pubsub/Processed packets thread: in_47-pubsub 4225

    pubsub/Processed packets thread: out_0-pubsub 4256776

    pubsub/Processed packets thread: out_1-pubsub 4256739

    pubsub/Processed packets thread: out_2-pubsub 4256764

    pubsub/Processed packets thread: out_3-pubsub 4255522

    pubsub/Processed packets thread: out_4-pubsub 4258403

    pubsub/Processed packets thread: out_5-pubsub 4257707

    pubsub/Processed packets thread: out_6-pubsub 4254324

    pubsub/Processed packets thread: out_7-pubsub 4258250

    pubsub/Processed packets thread: out_8-pubsub 4257112

    pubsub/Processed packets thread: out_9-pubsub 4257199

    pubsub/Processed packets thread: out_10-pubsub 4258413

    pubsub/Processed packets thread: out_11-pubsub 4257751

    pubsub/Processed packets thread: out_12-pubsub 4256171

    pubsub/Processed packets thread: out_13-pubsub 4259389

    pubsub/Processed packets thread: out_14-pubsub 4253505

    pubsub/Processed packets thread: out_15-pubsub 4255228

    pubsub/Processed packets thread: out_16-pubsub 4261113

    pubsub/Processed packets thread: out_17-pubsub 4256117

    pubsub/Processed packets thread: out_18-pubsub 4254698

    pubsub/Processed packets thread: out_19-pubsub 4257750

    pubsub/Processed packets thread: out_20-pubsub 4259076

    pubsub/Processed packets thread: out_21-pubsub 4259163

    pubsub/Processed packets thread: out_22-pubsub 4258605

    pubsub/Processed packets thread: out_23-pubsub 4256402

    pubsub/Processed packets thread: out_24-pubsub 4257894

    pubsub/Processed packets thread: out_25-pubsub 4258782

    pubsub/Processed packets thread: out_26-pubsub 4253540

    pubsub/Processed packets thread: out_27-pubsub 4256721

    pubsub/Processed packets thread: out_28-pubsub 4258271

    pubsub/Processed packets thread: out_29-pubsub 4256795

    pubsub/Processed packets thread: out_30-pubsub 4254666

    pubsub/Processed packets thread: out_31-pubsub 4261489

    pubsub/Processed packets thread: out_32-pubsub 4260834

    pubsub/Processed packets thread: out_33-pubsub 4259946

    pubsub/Processed packets thread: out_34-pubsub 4255380

    pubsub/Processed packets thread: out_35-pubsub 4255495

    pubsub/Processed packets thread: out_36-pubsub 4257179

    pubsub/Processed packets thread: out_37-pubsub 4257184

    pubsub/Processed packets thread: out_38-pubsub 4256508

    pubsub/Processed packets thread: out_39-pubsub 4258539

    pubsub/Processed packets thread: out_40-pubsub 4256667

    pubsub/Processed packets thread: out_41-pubsub 4255784

    pubsub/Processed packets thread: out_42-pubsub 4260476

    pubsub/Processed packets thread: out_43-pubsub 4260423

    pubsub/Processed packets thread: out_44-pubsub 4257343

    pubsub/Processed packets thread: out_45-pubsub 4252923

    pubsub/Processed packets thread: out_46-pubsub 4256271

    pubsub/Processed packets thread: out_47-pubsub 4256780
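    A back-of-the-envelope check on the stats above supports the report that the rate setting is being ignored. Assuming each published item fans out to one message per subscriber (the 2000 offline subscribers described in the test setup), dividing the messages sent by the subscriber count estimates the items actually published:

```java
public class FanOutCheck {
    public static void main(String[] args) {
        long sentMessages = 202_966_367L; // pubsub/OUT_QUEUE processed messages
        long subscribers = 2_000L;        // offline subscribers in the test setup
        long itemsPublished = sentMessages / subscribers;
        // ~101,483 items actually published vs. the 2200 the form requested
        System.out.println(itemsPublished);
    }
}
```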

  • Bartosz Małkowski commented 9 years ago

    I just checked the code and it looks good.

    I added some logging to record what parameters the LoadTest is started with and how many items it publishes.

  • Eric Dziewa commented 9 years ago

    Here I ran the tests using 50, 100, 150, and 200 per second.

    Next I wanted to find the maximum rate. I used 500, 1000, and 1500 per second.

    Here are the final stats from these last 3 tests.

    pubsub/Last second packets     0
    pubsub/Last minute packets     0
    pubsub/Last hour packets     0
    pubsub/Processing threads     32
    pubsub/Packets received     1280537
    pubsub/Packets sent     855172
    pubsub/In queue wait: SYSTEM     0
    pubsub/In queue wait: CLUSTER     0
    pubsub/In queue wait: HIGH     0
    pubsub/In queue wait: NORMAL     0
    pubsub/In queue wait: LOW     0
    pubsub/In queue wait: PRESENCE     0
    pubsub/In queue wait: LOWEST     0
    pubsub/Out queue wait: SYSTEM     0
    pubsub/Out queue wait: CLUSTER     0
    pubsub/Out queue wait: HIGH     0
    pubsub/Out queue wait: NORMAL     0
    pubsub/Out queue wait: LOW     0
    pubsub/Out queue wait: PRESENCE     0
    pubsub/Out queue wait: LOWEST     0
    pubsub/Total In queues wait     0
    pubsub/Total Out queues wait     0
    pubsub/Total queues wait     0
    pubsub/Max queue size     25664
    pubsub/IN Queue overflow     1238
    pubsub/OUT Queue overflow     0
    pubsub/Total queues overflow     1238
    pubsub/Average processing time on last 100 runs [ms]     0
    pubsub/IN_QUEUE processed messages     0
    pubsub/IN_QUEUE processed presences     1
    pubsub/IN_QUEUE processed cluster     0
    pubsub/IN_QUEUE processed other     1281611
    pubsub/IN_QUEUE processed IQ no XMLNS     0
    pubsub/IN_QUEUE processed IQ http://jabber.org/protocol/disco#items     8
    pubsub/IN_QUEUE processed IQ command     6
    pubsub/IN_QUEUE processed IQ pubsub     1281703
    pubsub/IN_QUEUE processed IQ http://jabber.org/protocol/disco#info     4
    pubsub/IN_QUEUE processed total IQ     1281721
    pubsub/OUT_QUEUE processed messages     0
    pubsub/OUT_QUEUE processed presences     0
    pubsub/OUT_QUEUE processed cluster     0
    pubsub/OUT_QUEUE processed other     854973
    pubsub/OUT_QUEUE processed IQ no XMLNS     0
    pubsub/OUT_QUEUE processed IQ http://jabber.org/protocol/disco#items     8
    pubsub/OUT_QUEUE processed IQ command     6
    pubsub/OUT_QUEUE processed IQ pubsub     855230
    pubsub/OUT_QUEUE processed IQ http://jabber.org/protocol/disco#info     4
    pubsub/OUT_QUEUE processed total IQ     855248
    pubsub/Processed packets thread: in_0-pubsub     39738
    pubsub/Processed packets thread: in_1-pubsub     40089
    pubsub/Processed packets thread: in_2-pubsub     39332
    pubsub/Processed packets thread: in_3-pubsub     40018
    pubsub/Processed packets thread: in_4-pubsub     39952
    pubsub/Processed packets thread: in_5-pubsub     40363
    pubsub/Processed packets thread: in_6-pubsub     40034
    pubsub/Processed packets thread: in_7-pubsub     39824
    pubsub/Processed packets thread: in_8-pubsub     40105
    pubsub/Processed packets thread: in_9-pubsub     40370
    pubsub/Processed packets thread: in_10-pubsub     40153
    pubsub/Processed packets thread: in_11-pubsub     40025
    pubsub/Processed packets thread: in_12-pubsub     39947
    pubsub/Processed packets thread: in_13-pubsub     40348
    pubsub/Processed packets thread: in_14-pubsub     39908
    pubsub/Processed packets thread: in_15-pubsub     39874
    pubsub/Processed packets thread: in_16-pubsub     40478
    pubsub/Processed packets thread: in_17-pubsub     40192
    pubsub/Processed packets thread: in_18-pubsub     40236
    pubsub/Processed packets thread: in_19-pubsub     40018
    pubsub/Processed packets thread: in_20-pubsub     39916
    pubsub/Processed packets thread: in_21-pubsub     39909
    pubsub/Processed packets thread: in_22-pubsub     40058
    pubsub/Processed packets thread: in_23-pubsub     40080
    pubsub/Processed packets thread: in_24-pubsub     39592
    pubsub/Processed packets thread: in_25-pubsub     40357
    pubsub/Processed packets thread: in_26-pubsub     39676
    pubsub/Processed packets thread: in_27-pubsub     40336
    pubsub/Processed packets thread: in_28-pubsub     40179
    pubsub/Processed packets thread: in_29-pubsub     40146
    pubsub/Processed packets thread: in_30-pubsub     40155
    pubsub/Processed packets thread: in_31-pubsub     40314
    pubsub/Processed packets thread: out_0-pubsub     26415
    pubsub/Processed packets thread: out_1-pubsub     26927
    pubsub/Processed packets thread: out_2-pubsub     26242
    pubsub/Processed packets thread: out_3-pubsub     26729
    pubsub/Processed packets thread: out_4-pubsub     26533
    pubsub/Processed packets thread: out_5-pubsub     26901
    pubsub/Processed packets thread: out_6-pubsub     26684
    pubsub/Processed packets thread: out_7-pubsub     26772
    pubsub/Processed packets thread: out_8-pubsub     26818
    pubsub/Processed packets thread: out_9-pubsub     26904
    pubsub/Processed packets thread: out_10-pubsub     26775
    pubsub/Processed packets thread: out_11-pubsub     26670
    pubsub/Processed packets thread: out_12-pubsub     26561
    pubsub/Processed packets thread: out_13-pubsub     27004
    pubsub/Processed packets thread: out_14-pubsub     26412
    pubsub/Processed packets thread: out_15-pubsub     26694
    pubsub/Processed packets thread: out_16-pubsub     26987
    pubsub/Processed packets thread: out_17-pubsub     26772
    pubsub/Processed packets thread: out_18-pubsub     26673
    pubsub/Processed packets thread: out_19-pubsub     26667
    pubsub/Processed packets thread: out_20-pubsub     26592
    pubsub/Processed packets thread: out_21-pubsub     26593
    pubsub/Processed packets thread: out_22-pubsub     26768
    pubsub/Processed packets thread: out_23-pubsub     26736
    pubsub/Processed packets thread: out_24-pubsub     26774
    pubsub/Processed packets thread: out_25-pubsub     26938
    pubsub/Processed packets thread: out_26-pubsub     26538
    pubsub/Processed packets thread: out_27-pubsub     26898
    pubsub/Processed packets thread: out_28-pubsub     26726
    pubsub/Processed packets thread: out_29-pubsub     26984
    pubsub/Processed packets thread: out_30-pubsub     26696
    pubsub/Processed packets thread: out_31-pubsub     26865
    
  • Eric Dziewa commented 9 years ago

    Here's the test again with a tweaked init.properties: 500, 1000, and 1500 per second.

  • Artur Hefczyc commented 8 years ago

    From the comments I gather it is completed now. I am closing the ticket and moving it to the PubSub project.

Type: Task
Priority: Blocker
RedmineID: 2289
Spent time: 0
Reference: tigase/_server/tigase-pubsub#32