Klustron 1.3 Performance Test Report


Version: v1.3.1

Cluster Topology and Configuration:

[Topology diagram: compute nodes, storage nodes, management nodes, haproxy, sysbench, and benchmarksql distributed across 192.168.0.20, 192.168.0.21, and 192.168.0.22]

Cluster Description:

- Compute nodes: three machines, each running one compute node.
- Storage nodes: three shards, each with a single primary; the three primaries are spread across the three machines.
- Management nodes: three nodes on the three machines, one primary and two standbys.

Machine Configuration: CentOS 8.5, 32 cores, 128 GB RAM, 1.9 TB NVMe SSD, 10 GbE NIC.

Load Balancing: haproxy 2.5.0

sysbench: 1.0.20

benchmarksql: 5.0

Pre-load Test Preparation:

Create a cluster with 3 shards and 3 compute nodes.

Modifications to compute node system variables before load testing:

alter system set statement_timeout=6000000;
alter system set mysql_read_timeout=1200;
alter system set mysql_write_timeout=1200;
alter system set lock_timeout=1200000;
alter system set log_min_duration_statement=1200000;
alter system set effective_cache_size = '8GB';
alter system set work_mem  = '128MB';
alter system set wal_buffers='64MB';
alter system set autovacuum=false;

Note: these settings take effect only after each compute node is restarted.
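The report does not show how the settings were distributed. The dry-run sketch below only prints the commands that would push them to each compute node; the port, user, and settings-file name are placeholders, not values from the report.

```shell
#!/bin/sh
# Dry-run sketch: print (not execute) the commands that would apply the
# settings above to every compute node and then restart it. PORT, USER,
# and compute_settings.sql are placeholders, not values from the report.
emit_compute_cmds() {
  for host in 192.168.0.20 192.168.0.21 192.168.0.22; do
    echo "psql -h $host -p PORT -U USER -d postgres -f compute_settings.sql"
    echo "# restart the compute node on $host so the changes take effect"
  done
}
emit_compute_cmds
```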

Modifications to storage node system variables before load testing:

mysql -h xxx -P xxx -upgx -ppgx_pwd  # log into the primary of each shard and run:
set global innodb_buffer_pool_size=32*1024*1024*1024;
set global lock_wait_timeout=1200;
set global innodb_lock_wait_timeout=1200;
set global fullsync_timeout=1200000;
set global enable_fullsync=false;
set global innodb_flush_log_at_trx_commit=2;
set global sync_binlog=0;
set global max_binlog_size=1*1024*1024*1024;
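The same settings can be pushed to every shard primary in one pass. This dry-run sketch only prints the mysql invocations; HOST/PORT placeholders mirror the xxx placeholders above, and storage_settings.sql is assumed to hold the set-global statements listed above.

```shell
#!/bin/sh
# Dry-run sketch: print the mysql invocations that would apply the settings
# above on each shard primary. HOSTn/PORTn are placeholders matching the
# "mysql -h xxx -P xxx" line in the report.
emit_storage_cmds() {
  for primary in "HOST1:PORT1" "HOST2:PORT2" "HOST3:PORT3"; do
    host=${primary%%:*}
    port=${primary##*:}
    echo "mysql -h $host -P $port -upgx -ppgx_pwd < storage_settings.sql"
  done
}
emit_storage_cmds
```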

In XPanel, disable primary/standby switchover for each shard via [Cluster MGT] -> [Cluster Switch Free Settings].

Remove the standby machines from each shard.

Sysbench
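The report does not include the exact sysbench command lines. The sketch below prints (without executing) one plausible invocation per concurrency level; the host, port, credentials, table count, and table size are assumptions, not the authors' values. Only the 5-minute duration and the 100/300/600/900 thread counts come from the tables below.

```shell
#!/bin/sh
# Dry-run sketch: print one sysbench command per concurrency level for a
# given workload. Connection details, --tables, and --table-size are
# assumptions; each run lasts 300 s (5 min), matching the tables below.
run_cmd() {  # $1 = workload name, $2 = thread count
  echo "sysbench $1 \
--pgsql-host=192.168.0.20 --pgsql-port=PORT \
--pgsql-user=USER --pgsql-password=PASSWORD --pgsql-db=postgres \
--db-driver=pgsql --tables=10 --table-size=1000000 \
--threads=$2 --time=300 --report-interval=10 run"
}
for threads in 100 300 600 900; do
  run_cmd oltp_point_select "$threads"
done
```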

oltp_point_select

In the sysbench tables below, CPU, memory, and IO figures are listed per host as .20 / .21 / .22 (the last octet of each machine's IP).

| Metric | 100 users | 300 users | 600 users | 900 users |
| --- | --- | --- | --- | --- |
| Duration | 5 min | 5 min | 5 min | 5 min |
| 95% latency (ms) | 0.81 | 2.86 | 64.47 | 70.55 |
| TPS | 113007.37 | 95306.52 | 73943.31 | 66162.5 |
| QPS | 113007.37 | 95306.52 | 73943.31 | 66162.5 |
| CPU (32 vC), .20/.21/.22 | 29% / 27% / 27% | 28% / 26% / 27% | 27% / 26% / 26% | 27% / 25% / 26% |
| Memory (128 GB), .20/.21/.22 | 33% / 33% / 33% | 33% / 33% / 33% | 33% / 33% / 33% | 34% / 34% / 34% |
| IO utilization, .20/.21/.22 | 7% / 7% / 7% | 7% / 5% / 4% | 5% / 3% / 3% | 6% / 7% / 4% |

oltp_update_non_index

| Metric | 100 users | 300 users | 600 users | 900 users |
| --- | --- | --- | --- | --- |
| Duration | 5 min | 5 min | 5 min | 5 min |
| 95% latency (ms) | 1.44 | 12.3 | 51.02 | 58.92 |
| TPS | 66057.79 | 63286.77 | 54899.43 | 51132.19 |
| QPS | 66057.79 | 63286.77 | 54899.43 | 51132.19 |
| CPU (32 vC), .20/.21/.22 | 34% / 32% / 36% | 31% / 33% / 36% | 33% / 30% / 35% | 31% / 32% / 33% |
| Memory (128 GB), .20/.21/.22 | 34% / 34% / 34% | 34% / 34% / 34% | 34% / 34% / 34% | 35% / 35% / 35% |
| IO utilization, .20/.21/.22 | 27% / 18% / 39% | 99% / 43% / 95% | 95% / 99% / 95% | 94% / 91% / 96% |

oltp_update_index

| Metric | 100 users | 300 users | 600 users | 900 users |
| --- | --- | --- | --- | --- |
| Duration | 5 min | 5 min | 5 min | 5 min |
| 95% latency (ms) | 2.43 | 11.24 | 46.63 | 55.82 |
| TPS | 64748.63 | 54121.36 | 46875.16 | 46347.41 |
| QPS | 64748.63 | 54121.36 | 46875.16 | 46347.41 |
| CPU (32 vC), .20/.21/.22 | 40% / 42% / 40% | 33% / 32% / 29% | 33% / 28% / 28% | 32% / 26% / 34% |
| Memory (128 GB), .20/.21/.22 | 20% / 21% / 18% | 20% / 22% / 19% | 21% / 23% / 23% | 21% / 23% / 21% |
| IO utilization, .20/.21/.22 | 92% / 97% / 98% | 99% / 91% / 94% | 96% / 94% / 96% | 93% / 92% / 97% |

oltp_read_write

| Metric | 100 users | 300 users | 600 users | 900 users |
| --- | --- | --- | --- | --- |
| Duration | 5 min | 5 min | 5 min | 5 min |
| 95% latency (ms) | 186.54 | 411.96 | 612.21 | 427.07 |
| TPS | 642.61 | 1940.16 | 3095.1 | 3218.29 |
| QPS | 2570.43 | 7760.64 | 12380.38 | 12869.15 |
| CPU (32 vC), .20/.21/.22 | 11% / 10% / 12% | 20% / 16% / 21% | 23% / 22% / 25% | 25% / 24% / 26% |
| Memory (128 GB), .20/.21/.22 | 35% / 35% / 35% | 36% / 36% / 36% | 37% / 37% / 37% | 38% / 38% / 38% |
| IO utilization, .20/.21/.22 | 93% / 98% / 98% | 60% / 13% / 51% | 52% / 54% / 51% | 63% / 57% / 61% |

oltp_read_only

| Metric | 100 users | 300 users | 600 users | 900 users |
| --- | --- | --- | --- | --- |
| Duration | 5 min | 5 min | 5 min | 5 min |
| 95% latency (ms) | 183.21 | 502.2 | 383.33 | 427.07 |
| TPS | 644.99 | 865.42 | 3086 | 3450.44 |
| QPS | 2579.96 | 3461.67 | 12334.18 | 13783.17 |
| CPU (32 vC), .20/.21/.22 | 11% / 11% / 12% | 29% / 27% / 27% | 28% / 27% / 26% | 28% / 27% / 26% |
| Memory (128 GB), .20/.21/.22 | 34% / 34% / 34% | 33% / 33% / 33% | 33% / 33% / 33% | 33% / 33% / 33% |
| IO utilization, .20/.21/.22 | 100% / 100% / 100% | 55% / 60% / 58% | 65% / 70% / 68% | 75% / 71% / 68% |

oltp_write_only

| Metric | 100 users | 300 users | 600 users | 900 users |
| --- | --- | --- | --- | --- |
| Duration | 5 min | 5 min | 5 min | 5 min |
| 95% latency (ms) | 183.21 | 260.72 | 459.18 | 637.08 |
| TPS | 651.22 | 433.68 | 264.01 | 198.25 |
| QPS | 2604.9 | 1812.3 | 896.45 | 503.76 |
| CPU (32 vC), .20/.21/.22 | 4% / 4% / 14% | 5% / 3% / 10% | 6% / 8% / 9% | 6% / 7% / 8% |
| Memory (128 GB), .20/.21/.22 | 34% / 34% / 34% | 35% / 34% / 34% | 35% / 34% / 34% | 36% / 34% / 34% |
| IO utilization, .20/.21/.22 | 100% / 99% / 100% | 100% / 100% / 100% | 100% / 100% / 100% | 100% / 100% / 100% |

oltp_insert

| Metric | 100 users | 300 users | 600 users | 900 users |
| --- | --- | --- | --- | --- |
| Duration | 5 min | 5 min | 5 min | 5 min |
| 95% latency (ms) | 0.87 | 7.84 | 27.66 | 43.39 |
| TPS | 110055.32 | 98261.53 | 75309.96 | 77354.33 |
| QPS | 110055.32 | 98261.53 | 75309.96 | 77354.33 |
| CPU (32 vC), .20/.21/.22 | 34% / 26% / 27% | 33% / 22% / 29% | 29% / 27% / 24% | 25% / 23% / 38% |
| Memory (128 GB), .20/.21/.22 | 34% / 34% / 34% | 34% / 34% / 34% | 34% / 34% / 34% | 35% / 35% / 35% |
| IO utilization, .20/.21/.22 | 56% / 58% / 64% | 94% / 47% / 93% | 91% / 85% / 93% | 94% / 96% / 94% |

TPC-C

All TPC-C runs use 500 warehouses and last 10 minutes. Hardware figures are per node, identified by the last octet of each machine's IP; the first run used nodes .18/.19/.20, and all later runs used .20/.21/.22. tpmC counts New-Order transactions per minute; tpmTotal counts all transaction types.

| Concurrent users | Nodes | tpmC | tpmTotal | CPU (32 vC) | Memory (128 GB) | IO utilization |
| --- | --- | --- | --- | --- | --- | --- |
| 50 | .18/.19/.20 | 86851.53 | 193198.13 | 40% / 39% / 36% | 25% / 20% / 20% | 70% / 75% / 72% |
| 50 | .20/.21/.22 | 86653.63 | 192866.59 | 35% / 33% / 33% | 23% / 20% / 21% | 65% / 67% / 62% |
| 60 | .20/.21/.22 | 84991.98 | 188799.55 | 38% / 35% / 36% | 23% / 22% / 22% | 73% / 67% / 72% |
| 70 | .20/.21/.22 | 84124.6 | 186880.09 | 37% / 32% / 36% | 24% / 22% / 23% | 78% / 75% / 71% |
| 80 | .20/.21/.22 | 81586.18 | 181188.01 | 37% / 34% / 36% | 25% / 22% / 23% | 62% / 65% / 66% |
| 90 | .20/.21/.22 | 83623.86 | 185844.57 | 36% / 29% / 35% | 26% / 22% / 23% | 82% / 83% / 85% |
| 100 | .20/.21/.22 | 46545.82 | 103319.45 | 32% / 30% / 33% | 26% / 23% / 24% | 81% / 89% / 89% |
| 150 | .20/.21/.22 | 32384.51 | 71928.44 | 11% / 31% / 29% | 27% / 24% / 25% | 30% / 35% / 55% |
| 200 | .20/.21/.22 | 21039.78 | 46754.49 | 26% / 9% / 8% | 27% / 24% / 25% | 28% / 36% / 44% |
| 300 | .20/.21/.22 | 21356.62 | 47422.62 | 27% / 8% / 8% | 27% / 24% / 26% | 22% / 32% / 21% |
| 400 | .20/.21/.22 | 21970.13 | 48836.39 | 25% / 7% / 8% | 28% / 24% / 26% | 25% / 24% / 23% |
| 500 | .20/.21/.22 | 22568.62 | 50230.18 | 27% / 19% / 45% | 28% / 24% / 26% | 25% / 24% / 31% |
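BenchmarkSQL 5.0 is driven by a Java properties file. The sketch below writes one consistent with these runs; only warehouses=500 and runMins=10 come from the report, while the JDBC URL, credentials, terminal count for a single run, load workers, and transaction weights are assumptions or placeholders.

```shell
#!/bin/sh
# Sketch of a BenchmarkSQL 5.0 properties file for this cluster. Only
# warehouses=500 and runMins=10 come from the report; everything else
# (URL, credentials, weights, terminals, loadWorkers) is an assumption.
cat > klustron_tpcc.props <<'EOF'
db=postgres
driver=org.postgresql.Driver
conn=jdbc:postgresql://192.168.0.20:PORT/postgres
user=USER
password=PASSWORD
warehouses=500
terminals=50
runMins=10
runTxnsPerTerminal=0
loadWorkers=16
newOrderWeight=45
paymentWeight=43
orderStatusWeight=4
deliveryWeight=4
stockLevelWeight=4
EOF
cat klustron_tpcc.props
```

The terminals key would be raised from 50 to 500 to reproduce the successive runs in the table above.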

TPC-H

| Query | Cost (s) | Result |
| --- | --- | --- |
| Q1 | 15.8 | succ |
| Q2 | 1.36 | succ |
| Q3 | 1598.1 | succ |
| Q4 | 3.12 | succ |
| Q5 | 30.23 | succ |
| Q6 | 2.6 | succ |
| Q7 | 2262.64 | succ |
| Q8 | 5.3 | succ |
| Q9 | 14.33 | succ |
| Q10 | 5.15 | succ |
| Q11 | 0.88 | succ |
| Q12 | 3.77 | succ |
| Q13 | 2.54 | succ |
| Q14 | 2.79 | succ |
| Q15 | 5.36 | succ |
| Q16 | 0.88 | succ |
| Q17 | 10.97 | succ |
| Q18 | 13.9 | succ |
| Q19 | 3.14 | succ |
| Q20 | 4.28 | succ |
| Q21 | 9.64 | succ |
| Q22 | 0.71 | succ |

TPC-DS

Total cost: 2986.81 s

| Query | Cost (s) | Result |
| --- | --- | --- |
| Q1 | 0.24 | succ |
| Q2 | 4.84 | succ |
| Q3 | 1.51 | succ |
| Q4 | 30.63 | succ |
| Q5 | 5.18 | succ |
| Q6 | 141.67 | succ |
| Q7 | 5.73 | succ |
| Q8 | 2.1 | succ |
| Q9 | 11.49 | succ |
| Q10 | 6.03 | succ |
| Q11 | 20.22 | succ |
| Q12 | 0.52 | succ |
| Q13 | 2.21 | succ |
| Q14 | 9.95 | succ |
| Q15 | 1.08 | succ |
| Q16 | 0.75 | succ |
| Q17 | 6.73 | succ |
| Q18 | 4.26 | succ |
| Q19 | 1.77 | succ |
| Q20 | 1.03 | succ |
| Q21 | 5.88 | succ |
| Q22 | 13.06 | succ |
| Q23 | 20.25 | succ |
| Q24 | 4.44 | succ |
| Q25 | 1292.55 | succ |
| Q26 | 3.52 | succ |
| Q27 | 3.47 | succ |
| Q28 | 7.56 | succ |
| Q29 | 2.66 | succ |
| Q30 | 0.35 | succ |
| Q31 | 19.19 | succ |
| Q32 | 2.14 | succ |
| Q33 | 3.07 | succ |
| Q34 | 0.09 | succ |
| Q35 | 5.05 | succ |
| Q36 | 0.07 | succ |
| Q37 | 0.04 | succ |
| Q38 | 4.56 | succ |
| Q39 | 15.45 | succ |
| Q40 | 1.38 | succ |
| Q41 | 0.05 | succ |
| Q42 | 1.6 | succ |
| Q43 | 0.06 | succ |
| Q44 | 1.1 | succ |
| Q45 | 1029.89 | succ |
| Q46 | 0.07 | succ |
| Q47 | 6.47 | succ |
| Q48 | 2.05 | succ |
| Q49 | 3.12 | succ |
| Q50 | 4.58 | succ |
| Q51 | 4.21 | succ |
| Q52 | 1.52 | succ |
| Q53 | 1.61 | succ |
| Q54 | 0.85 | succ |
| Q55 | 1.59 | succ |
| Q56 | 3.08 | succ |
| Q57 | 2.86 | succ |
| Q58 | 9.27 | succ |
| Q59 | 6.34 | succ |
| Q60 | 3.1 | succ |
| Q61 | 0.14 | succ |
| Q62 | 1 | succ |
| Q63 | 1.63 | succ |
| Q64 | 11.11 | succ |
| Q65 | 3.68 | succ |
| Q66 | 1.37 | succ |
| Q67 | 10.05 | succ |
| Q68 | 0.09 | succ |
| Q69 | 5.24 | succ |
| Q70 | 5.04 | succ |
| Q71 | 1.62 | succ |
| Q72 | 28.57 | succ |
| Q73 | 0.09 | succ |
| Q74 | 7.49 | succ |
| Q75 | 5.74 | succ |
| Q76 | 1.54 | succ |
| Q77 | 4.75 | succ |
| Q78 | 25.25 | succ |
| Q79 | 2.49 | succ |
| Q80 | 6.69 | succ |
| Q81 | 0.33 | succ |
| Q82 | 5.95 | succ |
| Q83 | 1.2 | succ |
| Q84 | 19.2 | succ |
| Q85 | 2.63 | succ |
| Q86 | 0.73 | succ |
| Q87 | 4.54 | succ |
| Q88 | 10.27 | succ |
| Q89 | 1.85 | succ |
| Q90 | 0.79 | succ |
| Q91 | 1.12 | succ |
| Q92 | 1.1 | succ |
| Q93 | 3.59 | succ |
| Q94 | 0.52 | succ |
| Q95 | 32.88 | succ |
| Q96 | 1.25 | succ |
| Q97 | 3.23 | succ |
| Q98 | 1.83 | succ |
| Q99 | 2.03 | succ |
