fix: base node panic #7074
Conversation
**Walkthrough**

The changes update the calculation of estimated hash rates for three proof-of-work algorithms in the gRPC server, switching from division to saturating subtraction to prevent underflow. Additionally, the target block times for these algorithms are adjusted in the consensus constants for the mainnet configuration, and test assertions and comments related to pre-mine token maturity and spendable supply are updated.
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
Cache: Disabled due to data retention organization setting
Knowledge Base: Disabled due to data retention organization setting
📒 Files selected for processing (2)
- applications/minotari_node/src/grpc/base_node_grpc_server.rs (3 hunks)
- base_layer/core/src/consensus/consensus_constants.rs (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (1)
base_layer/core/src/consensus/consensus_constants.rs (1)
base_layer/core/src/proof_of_work/difficulty.rs (2)
- from_u64 (45-50)
- max (63-65)
⏰ Context from checks skipped due to timeout of 90000ms (7)
- GitHub Check: test (nextnet, nextnet)
- GitHub Check: test (testnet, esmeralda)
- GitHub Check: test (mainnet, stagenet)
- GitHub Check: cargo check with stable
- GitHub Check: ci
- GitHub Check: Cucumber tests / FFI
- GitHub Check: Cucumber tests / Base Layer
🔇 Additional comments (3)
base_layer/core/src/consensus/consensus_constants.rs (3)
759-762: Approve SHA3x target_time update for mainnet. The block interval for `PowAlgorithm::Sha3x` has been correctly adjusted from 360 s to 240 s to reflect the new consensus timing parameters.
764-767: Approve RandomXM target_time update for mainnet. The block interval for `PowAlgorithm::RandomXM` has been correctly adjusted from 360 s to 480 s as intended for the revised consensus constants.
769-772: Approve RandomXT target_time update for mainnet. The block interval for `PowAlgorithm::RandomXT` has been correctly adjusted from 360 s to 480 s, aligning with the updated consensus parameters.
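The three approved intervals can be summarized in a small sketch. This is a simplified stand-in, not the real `ConsensusConstants` API; it only encodes the values quoted in the review comments above. Note that the combined block rate across the three algorithms works out to one block roughly every 120 s.

```rust
// Simplified sketch (NOT the real ConsensusConstants API) of the
// per-algorithm target block intervals approved in this review.
#[derive(Clone, Copy)]
enum PowAlgorithm {
    Sha3x,
    RandomXM,
    RandomXT,
}

fn pow_target_block_interval(algo: PowAlgorithm) -> u64 {
    match algo {
        PowAlgorithm::Sha3x => 240,    // was 360 s
        PowAlgorithm::RandomXM => 480, // was 360 s
        PowAlgorithm::RandomXT => 480, // was 360 s
    }
}

fn main() {
    // Combined block rate: 1/240 + 1/480 + 1/480 = 1/120, i.e. one
    // block roughly every 120 s across all three algorithms.
    let algos = [
        PowAlgorithm::Sha3x,
        PowAlgorithm::RandomXM,
        PowAlgorithm::RandomXT,
    ];
    let rate: f64 = algos
        .iter()
        .map(|a| 1.0 / pow_target_block_interval(*a) as f64)
        .sum();
    println!("combined interval: {:.0} s", 1.0 / rate);
}
```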
```diff
     })?;
     let target_time = constants.pow_target_block_interval(PowAlgorithm::Sha3x);
-    let estimated_hash_rate = target_difficulty.as_u64() / target_time;
+    let estimated_hash_rate = target_difficulty.as_u64().checked_sub(target_time).unwrap_or(0);
```
💡 Verification agent
🧩 Analysis chain
Validate the new hash rate calculation algorithm
The code replaces a division operation with a checked subtraction to calculate estimated hash rates for three different PoW algorithms. While this change will prevent potential panic due to underflow (which aligns with the PR objective), it fundamentally changes how hash rates are calculated.
Traditional hash rate calculations typically divide difficulty by time. The new formula subtracts target time from target difficulty, which represents a different metric entirely.
Also applies to: 455-455, 482-482
🌐 Web query:
How is mining hash rate typically calculated in proof-of-work cryptocurrencies?
💡 Result:
How Mining Hash Rate Is Calculated in Proof-of-Work Cryptocurrencies
Hash rate in proof-of-work (PoW) cryptocurrencies represents the number of cryptographic calculations (hashes) that a miner or the entire network performs per second. This figure is central to understanding mining power, network security, and profitability.
Hash Rate Calculation for Individual Miners
For mining devices (ASICs, GPUs, or CPUs), hash rate is simply measured as the number of guesses or hashes the hardware can generate per second. For example, if a mining rig computes 1 trillion hashes per second, its hash rate is 1 terahash per second (TH/s)[5][6][7][10].
Calculation formula:
$$\text{Hash Rate (H/s)} = \frac{\text{Total Number of Hashes Computed}}{\text{Time (seconds)}}$$
So, if a miner processes 1 billion hashes in one second, its hash rate is 1 gigahash per second (GH/s)[7].
Network Hash Rate Estimation
The total network hash rate is more complex to determine, as not all miners publicize their hashing power. Instead, the network hash rate is estimated from publicly available blockchain data, particularly:
- The network difficulty: how hard it is to find a new valid block.
- The block time: the average interval between new blocks found (e.g., 10 minutes for Bitcoin).
- The expected number of hashes to solve a block: often approximately $2^{32} \times$ the current difficulty.
Estimation formula:
$$\text{Network Hash Rate} = \frac{\text{Difficulty} \times 2^{32}}{\text{Average Block Time (seconds)}}$$
For example, if the difficulty is 39.2 trillion and the average block time is 600 seconds (10 minutes), the estimated hash rate is:
$$\text{Hash Rate} = \frac{39{,}200{,}000{,}000{,}000 \times 4{,}294{,}967{,}296}{600}$$
This process provides only an estimate because it assumes average luck and uniform block spacing[8].
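The estimation formula above can be checked numerically. This sketch uses `f64` because a Bitcoin-scale difficulty multiplied by $2^{32}$ overflows `u64`; the difficulty and block time are the example values from the text, not values from this repository.

```rust
/// Estimate network hash rate (H/s) from difficulty and average block time,
/// per the formula: difficulty * 2^32 / block_time.
/// Uses f64 because difficulty * 2^32 overflows u64 at Bitcoin-scale values.
fn estimate_network_hash_rate(difficulty: f64, avg_block_time_s: f64) -> f64 {
    difficulty * 2f64.powi(32) / avg_block_time_s
}

fn main() {
    // The worked example from the text: difficulty 39.2 trillion, 600 s blocks.
    let hr = estimate_network_hash_rate(39.2e12, 600.0);
    println!("{:.1} EH/s", hr / 1e18); // ~280.6 EH/s
}
```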
Units of Measurement
Hash rates are measured in multiples, reflecting the immense scale of modern mining operations:
| Unit | Hashes per Second |
|---|---|
| kH/s | 1,000 (thousand) |
| MH/s | 1,000,000 (million) |
| GH/s | 1,000,000,000 (billion) |
| TH/s | 1,000,000,000,000 (trillion) |
| PH/s | 1,000,000,000,000,000 (quadrillion) |
| EH/s | 1,000,000,000,000,000,000 (quintillion) |
| ZH/s | 1,000,000,000,000,000,000,000 (sextillion) |
| YH/s | 1,000,000,000,000,000,000,000,000 (septillion)[5][6][7][10] |
Summary
- The hash rate for miners is the direct performance: hashes attempted per second.
- The network hash rate is an estimate, reverse-engineered using observed block times and network difficulty.
- Higher hash rates mean more computational power, greater network security, and, for miners, better chances of earning block rewards[6][7][8][10].
Citations:
- 1: https://www.nicehash.com/profitability-calculator
- 2: https://www.cryptocompare.com/mining/calculator/
- 3: https://hashrate.no/calculator
- 4: https://tools.bitbo.io/mining-calculator/
- 5: https://www.investopedia.com/hash-rate-6746261
- 6: https://www.coingecko.com/learn/hashrate
- 7: https://bsvblockchain.org/news/bitcoin-hashrate-explained-understanding-the-mining-power/
- 8: https://www.mara.com/posts/hash-price-explained
- 9: https://www.coinwarz.com/mining/bitcoin/calculator
- 10: https://www.cryptominerbros.com/blog/what-is-hashrate/
🏁 Script executed:

```shell
#!/bin/bash
# Show context around estimated_hash_rate occurrences
rg -n 'estimated_hash_rate' -A3 -B5 applications/minotari_node/src/grpc/base_node_grpc_server.rs
```

Length of output: 5359
Incorrect hash rate calculation using subtraction instead of division
The current implementation subtracts the block-time target from the difficulty, which doesn't yield a rate (hashes per second). You likely intended to divide difficulty by the target interval (and, for network rate, multiply by 2³² or use whatever scaling your difficulty already embeds). This mistake occurs in three spots:
• applications/minotari_node/src/grpc/base_node_grpc_server.rs:429
• applications/minotari_node/src/grpc/base_node_grpc_server.rs:455
• applications/minotari_node/src/grpc/base_node_grpc_server.rs:482
Suggested fix (example – adjust scaling as needed):

```diff
-let estimated_hash_rate = target_difficulty.as_u64().checked_sub(target_time).unwrap_or(0);
+let estimated_hash_rate = target_difficulty
+    .as_u64()
+    .checked_div(target_time)
+    .unwrap_or(0);
```

Please update all three occurrences to use division (and include any required constant factors) so that you're calculating hashes/second, not difficulty minus seconds.
🤖 Prompt for AI Agents
In applications/minotari_node/src/grpc/base_node_grpc_server.rs at lines 429,
455, and 482, the estimated hash rate calculation incorrectly uses subtraction
between difficulty and target time, which does not produce a valid hash rate. To
fix this, replace the subtraction with a division of the difficulty by the
target time, and include any necessary scaling factors (such as multiplying by
2^32 if your difficulty value requires it) to correctly compute hashes per
second. Ensure all three instances are updated consistently to reflect this
proper hash rate formula.
**Test Results (CI)**

1 088 tests, 1 084 ✅, 14 m 53 s ⏱️

For more details on these failures, see this check. Results for commit 3af36ec.

♻️ This comment has been updated with latest results.
```diff
     })?;
     let target_time = constants.pow_target_block_interval(PowAlgorithm::Sha3x);
-    let estimated_hash_rate = target_difficulty.as_u64() / target_time;
+    let estimated_hash_rate = target_difficulty.as_u64().saturating_sub(target_time);
```
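For context on what this change does and does not fix: `saturating_sub` clamps at zero instead of underflowing, so it removes the panic that plain `-` raises in debug builds, but as the comment below points out, difficulty minus seconds is not a hash rate. A minimal illustration with made-up values:

```rust
fn main() {
    let difficulty: u64 = 100;
    let target_time: u64 = 240;

    // Plain `difficulty - target_time` would panic in debug builds
    // (and wrap around in release); saturating_sub clamps to zero.
    println!("{}", difficulty.saturating_sub(target_time)); // 0
    println!("{}", target_time.saturating_sub(difficulty)); // 140
}
```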
I don't understand... Why do you subtract time from difficulty? Shouldn't you do a division here?
you are right, it should be...
**Description**

Fix potential panic in base node.

**Summary by CodeRabbit**

- Bug Fixes
- Chores