ECIP 1100: MESS (Modified Exponential Subjective Scoring)

Type: Standards Track


Define a function arbitrating chain acceptance using relative total difficulty and common ancestor time to raise finality confidence.

Update (January 2024): Replaced with ECBP-1110.


A low hashrate has caused Ethereum Classic’s consensus algorithms to yield inconvenient and undesirable finality rates.

This proposal offers a way to increase the finality rate without tampering with existing “hard” chain consensus functions or characteristics, and to do so with minimal negative side effects.



This proposal is built on a proposed core principle for Ethereum Classic, neatly summed up as:

Small reorgs are normal and healthy; large reorgs are suspicious. Given a reorg of unusual length, nodes should value their local (first-available) segment with preference over the later proposed segment, despite that it may have greater difficulty.

What follows is an algorithm and implementation details toward realizing this opinion programmatically as a convention for network clients.


Conceptual translation

  • Small reorgs are normal and healthy; large reorgs are suspicious.

Conceptions of “small” and “large” are described functionally by the “suspicious” curve; a reorg becomes “big” when it becomes suspicious, and vice versa. This is parameterized by time and total difficulty. Generally, anything less than about 10 minutes won’t be suspicious, and thus is considered “small.”

  • nodes should value their local (first-available) segment with preference over the later proposed segment

Valuation is conventionalized as the difference between treating a block as a canonical head versus treating it like a side chain. In both cases the block data is stored and may be propagated. Miners normally only mine on top of blocks they consider to be canonical.

Algorithm details

This specification provides for subjective arbitration of chain segments based on availability cardinality, segment duration, and difficulty, and may be implemented by nodes to deter and avoid suspicious reorganizations.

This specification falls outside of the existing consensus protocol. Because of that, congruent implementation across protocol providers (aka clients) on the network is important, to avoid observable or actionable variances in implementation that might present a risk to network coherence (whether by malicious intent or accident).

This specification should be applied in the client wherever total difficulty is used to evaluate a block’s candidacy for the head of the chain database. In etclabscore/core-geth this occurs in the BlockChain.writeBlockWithState and its caller BlockChain.insertChain methods.

This specification should be applied only given the condition of a positive outcome for the existing subjective arbitration cases described below.

In the case of a negative result from the arbitration, blocks should be deferred from receiving status as canonical, but still be stored as a sidechain. Deferring rather than outright rejecting failing blocks permits segments which have not yet achieved sufficient difficulty, but may eventually do so, to maintain practical eligibility for canonical status.

Existing Subjective Arbitrations

This specification assumes existing extra-protocol behavior as:

  • preference for self-mined blocks over proposed blocks having equal total difficulty and number
  • then, a 50% randomized acceptance rate for proposed blocks having equal total difficulty and number, per Eyal and Sirer
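The two tie-break rules above can be sketched as follows (an illustrative Python sketch; the function and parameter names are hypothetical, not client code):

```python
import random

def prefer_proposed(extern_td: int, local_td: int,
                    proposed_number: int, local_number: int,
                    proposed_is_mine: bool, local_is_mine: bool) -> bool:
    """Return True if a proposed block should replace the local head.

    Hypothetical sketch of the existing extra-protocol arbitration.
    """
    if extern_td != local_td:
        # Hard consensus: strictly heavier chains always win.
        return extern_td > local_td
    if proposed_number != local_number:
        # Equal total difficulty: prefer the lower block number.
        return proposed_number < local_number
    if local_is_mine:
        # Prefer self-mined blocks over equal competitors.
        return False
    # Otherwise fall back to a 50% coin flip (per Eyal and Sirer).
    return proposed_is_mine or random.random() < 0.5
```

Note that the randomized step only ever applies after total difficulty and block number fail to break the tie.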

These evaluations are implemented at the etclabscore/core-geth and ethereum/go-ethereum clients as:

	// If the total difficulty is higher than our known, add it to the canonical chain
	// Second clause in the if statement reduces the vulnerability to selfish mining.
	// Please refer to
	reorg := externTd.Cmp(localTd) > 0
	currentBlock := bc.CurrentBlock()
	if !reorg && externTd.Cmp(localTd) == 0 {
		// Split same-difficulty blocks by number, then preferentially select
		// the block generated by the local miner as the canonical block.
		if block.NumberU64() < currentBlock.NumberU64() {
			reorg = true
		} else if block.NumberU64() == currentBlock.NumberU64() {
			var currentPreserve, blockPreserve bool
			if bc.shouldPreserve != nil {
				currentPreserve, blockPreserve = bc.shouldPreserve(currentBlock), bc.shouldPreserve(block)
			}
			reorg = !currentPreserve && (blockPreserve || mrand.Float64() < 0.5)
		}
	}

// isLocalBlock checks whether the specified block is mined
// by local miner accounts.
// We regard two types of accounts as local miner account: etherbase
// and accounts specified via `txpool.locals` flag.
func (s *Ethereum) isLocalBlock(block *types.Block) bool {
	author, err := s.engine.Author(block.Header())
	if err != nil {
		log.Warn("Failed to retrieve block author", "number", block.NumberU64(), "hash", block.Hash(), "err", err)
		return false
	}
	// Check whether the given address is etherbase.
	etherbase := s.etherbase
	if author == etherbase {
		return true
	}
	// Check whether the given address is specified by `txpool.locals`
	// CLI flag.
	for _, account := range s.config.TxPool.Locals {
		if account == author {
			return true
		}
	}
	return false
}

// shouldPreserve checks whether we should preserve the given block
// during the chain reorg depending on whether the author of block
// is a local account.
func (s *Ethereum) shouldPreserve(block *types.Block) bool {
	// The reason we need to disable the self-reorg preserving for clique
	// is that it can probably introduce a deadlock.
	// e.g. If there are 7 available signers
	// r1   A
	// r2     B
	// r3       C
	// r4         D
	// r5   A      [X] F G
	// r6    [X]
	// In round 5, the in-turn signer E is offline, so the worst case
	// is that A, F and G sign the block of round 5 and reject the block of opponents,
	// and in round 6, the last available signer B is offline; the whole
	// network is stuck.
	if _, ok := s.engine.(*clique.Clique); ok {
		return false
	}
	return s.isLocalBlock(block)
}

Proposed Additional Subjective Arbitration

As a successor to the established chain reorganization arbitration, the following logic should be added.

  • A polynomial function ecbp1100PolynomialV is defined which implements a cubic curve as the “antigravity” of a proposed chain segment.
  • A condition function ecbp1100 applies this value as a required total difficulty ratio for proposed chain segments over their local alternative.

Polynomial ecbp1100PolynomialV


CURVE_FUNCTION_DENOMINATOR = 128

def get_curve_function_numerator(time_delta: int) -> int:
    xcap = 25132  # = floor(8000*pi)
    ampl = 15
    height = CURVE_FUNCTION_DENOMINATOR * (ampl * 2)
    x = min(time_delta, xcap)
    # The sine approximator `y = 3*x**2 - 2*x**3` rescaled to the desired height and width
    return CURVE_FUNCTION_DENOMINATOR + (3 * x**2 - 2 * x**3 // xcap) * height // xcap ** 2


Condition ecbp1100

if proposed_subchain_td * CURVE_FUNCTION_DENOMINATOR < get_curve_function_numerator(current.Time - commonAncestor.Time) * local_subchain_td:
    return notok
return ok
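Putting the two spec functions together, a minimal runnable sketch (with CURVE_FUNCTION_DENOMINATOR = 128, matching the reference implementation's constant) behaves as follows:

```python
CURVE_FUNCTION_DENOMINATOR = 128  # matches ecbp1100PolynomialVCurveFunctionDenominator

def get_curve_function_numerator(time_delta: int) -> int:
    xcap = 25132  # = floor(8000*pi)
    ampl = 15
    height = CURVE_FUNCTION_DENOMINATOR * (ampl * 2)
    x = min(time_delta, xcap)
    # The sine approximator `y = 3*x**2 - 2*x**3` rescaled to the desired height and width
    return CURVE_FUNCTION_DENOMINATOR + (3 * x**2 - 2 * x**3 // xcap) * height // xcap ** 2

def accept_proposed(proposed_subchain_td: int, local_subchain_td: int,
                    time_delta: int) -> bool:
    # The proposed segment must carry at least numerator/denominator times
    # the local segment's total difficulty to be eligible as canonical.
    return (proposed_subchain_td * CURVE_FUNCTION_DENOMINATOR
            >= get_curve_function_numerator(time_delta) * local_subchain_td)
```

At small time deltas the required ratio is 1, so equal-difficulty reorgs pass; at the 25132-second cap the ratio reaches (2 * 15) + 1 = 31.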


This is implemented in the etclabscore/core-geth Go(lang) client as follows.

// ecbp1100 implements the "MESS" artificial finality mechanism
// "Modified Exponential Subjective Scoring" used to prefer known chain segments
// over later-to-come counterparts, especially proposed segments stretching far into the past.
func (bc *BlockChain) ecbp1100(commonAncestor, current, proposed *types.Header) error {

	// Get the total difficulties of the proposed chain segment and the existing one.
	commonAncestorTD := bc.GetTd(commonAncestor.Hash(), commonAncestor.Number.Uint64())
	proposedParentTD := bc.GetTd(proposed.ParentHash, proposed.Number.Uint64()-1)
	proposedTD := new(big.Int).Add(proposed.Difficulty, proposedParentTD)
	localTD := bc.GetTd(current.Hash(), current.Number.Uint64())

	// if proposed_subchain_td * CURVE_FUNCTION_DENOMINATOR < get_curve_function_numerator(current.Time - commonAncestor.Time) * local_subchain_td.
	proposedSubchainTD := new(big.Int).Sub(proposedTD, commonAncestorTD)
	localSubchainTD := new(big.Int).Sub(localTD, commonAncestorTD)

	xBig := big.NewInt(int64(current.Time - commonAncestor.Time))
	eq := ecbp1100PolynomialV(xBig)
	want := eq.Mul(eq, localSubchainTD)

	got := new(big.Int).Mul(proposedSubchainTD, ecbp1100PolynomialVCurveFunctionDenominator)

	if got.Cmp(want) < 0 {
		prettyRatio, _ := new(big.Float).Quo(
			new(big.Float).SetInt(got),
			new(big.Float).SetInt(want),
		).Float64()
		return fmt.Errorf(`%w: ECBP1100-MESS 🔒 status=rejected age=%v current.span=%v proposed.span=%v tdr/gravity=%0.6f common.bno=%d common.hash=%s current.bno=%d current.hash=%s proposed.bno=%d proposed.hash=%s`,
			errReorgFinality,
			common.PrettyAge(time.Unix(int64(commonAncestor.Time), 0)),
			common.PrettyDuration(time.Duration(current.Time-commonAncestor.Time)*time.Second),
			common.PrettyDuration(time.Duration(proposed.Time-commonAncestor.Time)*time.Second),
			prettyRatio,
			commonAncestor.Number.Uint64(), commonAncestor.Hash().Hex(),
			current.Number.Uint64(), current.Hash().Hex(),
			proposed.Number.Uint64(), proposed.Hash().Hex(),
		)
	}
	return nil
}

ecbp1100PolynomialV is a cubic function that looks a lot like Option 3's sin function, but adds the benefit that the calculation can be done with integers (instead of yucky floating points).


// ecbp1100PolynomialV implements the cubic "antigravity" curve using
// integer-only arithmetic.
func ecbp1100PolynomialV(x *big.Int) *big.Int {

	// Make copies; do not mutate the argument value.

	// if x > xcap:
	//    x = xcap
	xA := new(big.Int).Set(x)
	if xA.Cmp(ecbp1100PolynomialVXCap) > 0 {
		xA.Set(ecbp1100PolynomialVXCap)
	}
	xB := new(big.Int).Set(x)
	if xB.Cmp(ecbp1100PolynomialVXCap) > 0 {
		xB.Set(ecbp1100PolynomialVXCap)
	}

	out := big.NewInt(0)

	// 3 * x**2
	xA.Exp(xA, big2, nil)
	xA.Mul(xA, big3)

	// 2 * x**3 // xcap
	xB.Exp(xB, big3, nil)
	xB.Mul(xB, big2)
	xB.Div(xB, ecbp1100PolynomialVXCap)

	// (3 * x**2 - 2 * x**3 // xcap)
	out.Sub(xA, xB)

	// (3 * x**2 - 2 * x**3 // xcap) * height
	out.Mul(out, ecbp1100PolynomialVHeight)

	// xcap ** 2
	xcap2 := new(big.Int).Exp(ecbp1100PolynomialVXCap, big2, nil)

	// (3 * x**2 - 2 * x**3 // xcap) * height // xcap ** 2
	out.Div(out, xcap2)

	// CURVE_FUNCTION_DENOMINATOR + (3 * x**2 - 2 * x**3 // xcap) * height // xcap ** 2
	out.Add(out, ecbp1100PolynomialVCurveFunctionDenominator)
	return out
}

var big2 = big.NewInt(2)
var big3 = big.NewInt(3)

// ecbp1100PolynomialVCurveFunctionDenominator
var ecbp1100PolynomialVCurveFunctionDenominator = big.NewInt(128)

// ecbp1100PolynomialVXCap
// xcap = 25132 # = floor(8000*pi)
var ecbp1100PolynomialVXCap = big.NewInt(25132)

// ecbp1100PolynomialVAmpl
// ampl = 15
var ecbp1100PolynomialVAmpl = big.NewInt(15)

// ecbp1100PolynomialVHeight
// height = CURVE_FUNCTION_DENOMINATOR * (ampl * 2)
var ecbp1100PolynomialVHeight = new(big.Int).Mul(new(big.Int).Mul(ecbp1100PolynomialVCurveFunctionDenominator, ecbp1100PolynomialVAmpl), big2)


This is a modified (M) version of Buterin’s Exponential Subjective Scoring (ESS) by

  • using a capped polynomial function instead of an unbounded exponential function
  • using the difference of local head time(stamp) from common ancestor time(stamp), rather than the previously described block lengths or times of block reception used by Buterin.

See References for what I’ve found on the topic.

The Case and Place of Subjectivity

This specification maintains the (modified) GHOST protocol as an invariant; existing consensus rules are neither modified nor sidestepped. Only the schedule (i.e. the “pace” or “timeline”) at which clients choose to implement these established procedures is modified. The heaviest (most difficult) chain will still, always, eventually win. The proposal only makes clients somewhat stubborn in their opinion and processing of chains; it makes them sluggish and resistant to big changes. It gives them a (temporary) opinion.

Opinions are allowed under GHOST and the rest of Ethereum’s Yellow Paper (and Satoshi’s email chains). Nowhere is it specified that blocks must be imported or included immediately, nor that a miner must mine on the heaviest chain available to them, nor that submitted transactions must be processed, nor that blocks must be propagated regularly.

The normal functioning of the chain is explained (as in rationalized) by some game theory and economics, but it is not subject to it. Miners are not forced (as in caused by the protocol) to mine on the heaviest chain available to them; they normally do so because they bet that that will turn out to be profitable for them. But sometimes mining on the heaviest chain may not be profitable for them; like if the heaviest chain was apparently made by a criminal, and perpetuating that association may lower the exchange rate of their reward. Miners can mine on whatever chain they want, can include whatever transactions they want, and can process, propagate, postpone, or ignore blocks however they want; all without offending Satoshi or his idolators.

In consideration here is a proposal of CONVENTION for network participants that is designed to keep the network as unified as possible while describing an opinion (implemented as algorithm) that prevents big ugly late-coming chain segments from being immediately and automatically accepted.

This is functionally no different than a thought experiment where the human miners are watching their nodes day-in and day-out, every minute, arbitrating with their own personal opinions about whether or not to allow geth to mine on top of block 10550798. Again, they are allowed to do this. They can do this. Sometimes they do do this. This proposal is a specification of a way that they can make this same order of decisions, but with the help of some math, a computer, and heuristics that will allow them to do it in coordination without requiring the use of the world’s most boring conference call.

In fact, one of the first evolutions of this proposal was made by @BelfordZ:

When geth wants to reorganize its existing chain for a too-long or too-old chain segment, have it just send an email to its operator saying: Geth has found a suspicious chain segment and requires a human’s wisdom and advice… and then turn off.

Except for a few fancy bells and whistles (ie maybe not shutting down… :)), and a proposed CONVENTION for determining suspicion, these proposals are more alike than different.

Network participants are allowed to be stubborn curmudgeons with opinions. ECIP1100 wants to help them do that gracefully.


  • Mini-forks and normal reorganizations are not affected, since their difficulty variances fall within the curve in this domain (200 seconds, give or take).
  • There is no Edge-of-Eternity attack vector (no vulnerable focal points for an attacker to target).
  • Partitions resolve quickly and consistently.
  • Intentional partitions are extremely difficult to establish and maintain.
  • The polynomial function yields a “ceiling” that would be extravagantly high for an attacker to achieve relative to the main network, but within reasonable operating bounds for the victim of an eclipse attack to eventually recover and rejoin the network. Unbounded exponential growth for the antigravity function serves no purpose beyond some point.

The graphs below show a 200-block chain accepting, sidechaining, or rejecting reorganizations of varying relative difficulty and length.




  • Nodes subject to eclipse attacks (mostly considered as nodes coming online after a long time or starting out) are vulnerable to destitution, even once released. This is addressed by the function’s ceiling, which requires the attacker to establish (and maintain) a total difficulty of at least 1/31 that of the competing honest chain, and can be addressed further by the operator including a checkpoint value.
  • It may be anticipated that the network uncle and waste rates will rise slightly, as blocks that would otherwise be randomly included will be rejected. ETC currently has a 3.5% uncle rate compared to ETH’s 5.5%.
  • A network vulnerability resulting in bifurcation exists given the condition that competing segments become available with near-equal total difficulty within the window of the antigravity curve allowance. This state of balance must be maintained until the antigravities force the network partitions to reject each other’s segments. If a state of near-perfect balance in total difficulty between the partitions can be maintained, this bifurcated state shall be indefinite. However, achieving and maintaining a balanced competition between segments can be seen to be extraordinarily challenging, expensive, and ultimately unlikely. Confounding variables for an attacker in this scenario are normal network hashrate variability, network propagation times and protocol, disproportionate mining entity hashpower share, unpredictable block times, and existing subjective arbitration steps.

Discussion of Constant Parameters

The polynomial function uses constant parameters xcap = 8000π (≈ 25132) and amplitude = 15, the values of which are reasoned as follows.

The x cap value of 8000π causes the peak of the curve (and its ultimate ceiling) to occur at 25132 seconds (approximately 7 hours). This falls in between the rates of the previously considered exponential functions. The “ramp up” domain (nearest x=0) sees a flattened curve, yielding a more generous lenience for competing short segments. The curve eventually intersects the original exponential function at about 900 seconds (15 minutes), at about y=1.09.
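The intersection claim can be checked numerically. The sketch below evaluates the spec's integer polynomial (numerator over its denominator of 128) and Option 1's exponential at x = 900 seconds; both land near a required ratio of about 1.1:

```python
# Constants from the specification: xcap = floor(8000*pi), denominator 128,
# height = denominator * (amplitude * 2) with amplitude 15.
xcap, denom = 25132, 128
height = denom * (15 * 2)

x = 900  # seconds since the common ancestor
numerator = denom + (3 * x**2 - 2 * x**3 // xcap) * height // xcap**2
poly_ratio = numerator / denom  # required TD ratio under the polynomial curve
exp_ratio = 1.0001 ** x         # Buterin's original ESS curve (Option 1)
```

Both values sit just above 1.09 at x = 900, so the two curves cross in this neighborhood.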

The amplitude value of 15 causes the peak to occur at (2*15)+1 = 31. This means the maximum “antigravity” an attacker will face is 31: the proposed chain would need a total difficulty 31 times that of the chain it would replace.

These values were chosen for ETC with the following assumptions and reasoning.

  • Assume global Ethash hashrate availability is 200TH.
  • Assume greatest single Ethash mining entity is 33% of the global, yielding about 66TH. This is considered as the largest possible antagonist for the ETC chain.
  • Assume ETC has 3TH total contributed hashrate.
  • Following this, we deduce that the largest Ethash mining entity has 22 times ETC’s current mining power. An “attack” by this entity on ETC would result in a (66/(66+3))*100 ≈ 96% attack.
  • Given a 22x anticipated “reasonable” worst-case scenario, the amplitude of 15 (yielding a ceiling of 31, roughly 40% above that worst case) is intended to be sufficiently future-proof and resilient to unforeseen scenarios.
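As a quick sanity check on the arithmetic in the bullets above:

```python
# Hashrate assumptions from the reasoning above (TH/s).
global_hashrate = 200.0
attacker = global_hashrate * 0.33           # largest single Ethash entity, ~66 TH
etc = 3.0                                   # ETC's contributed hashrate
advantage = attacker / etc                  # attacker's multiple of ETC's power
attack_share = attacker / (attacker + etc)  # attacker's share in a head-on attack
ceiling = 2 * 15 + 1                        # amplitude 15 -> antigravity ceiling of 31
```

The ceiling of 31 comfortably exceeds the 22x worst-case advantage.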

Alternatives Considered

A bounded exponential function would work in much the same way, although it would not have a continuous ceiling transition and would, in a far-edge case, present an exploitable focal point of vulnerability at that transition.


This feature does not require a hard fork, but the network stands to benefit and avoid risk with majority-coordinated activation.


  • Feature is tentatively proposed to activate in Core-Geth on Ethereum Classic network at and above block 11_380_000 (ETA 09 October 2020).
  • Feature is proposed to activate in Core-Geth on Ethereum Classic’s Mordor test network at and above block 2380000 (ETA 29 September 2020).
  • Core-Geth feature implementation includes a few additional safety mechanisms:
    • MESS is disabled for any sync mode besides full sync.
    • MESS is only enabled once a peer has completed initial chain synchronisation, not while they are fast syncing or even full syncing during the download and process phase. This reduces the chances of a node coming online being lured into an eclipse scenario.
    • MESS is only enabled if a node has greater than or equal to MinimumSyncPeers peers. In Core-Geth this value is by default 5.
    • MESS is disabled if, once synced, a node’s head block is not changed within a time limit (ie becomes stale). In Core-Geth this value is by default 30 * 13 seconds.
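The gating rules above can be summarized in a small predicate. This is a sketch only; the names and defaults below mirror the list, not actual Core-Geth identifiers:

```python
def mess_enabled(sync_mode: str, initial_sync_done: bool, peer_count: int,
                 seconds_since_head_change: float,
                 min_sync_peers: int = 5, stale_limit: float = 30 * 13) -> bool:
    """Hypothetical sketch of Core-Geth's MESS safety gating."""
    if sync_mode != "full":                      # disabled for any non-full sync mode
        return False
    if not initial_sync_done:                    # not during initial chain synchronisation
        return False
    if peer_count < min_sync_peers:              # require MinimumSyncPeers peers
        return False
    if seconds_since_head_change > stale_limit:  # disabled once the head goes stale
        return False
    return True
```

Each condition fails closed, disabling MESS whenever a node is most vulnerable to an eclipse scenario.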

The associated Core-Geth implementation is available here.


Cross-client tests are included as assets, as “first” and “second” RLP-encoded chain segments. A genesis configuration is provided describing the necessary chain and genesis parameters.

The file names describe the test, as well as providing expectations for the outcome of the second chain import via the suffix secondWins-true vs. secondWins-false.



Bespoke, previously considered alternative curve functions:

/*
ecbp1100AGSinusoidalA is a sinusoidal function.

OPTION 3: Yet slower takeoff, yet steeper eventual ascent. Has a differentiable ceiling transition.
h(x)=15 sin((x+12000 π)/(8000))+15+1
*/
func ecbp1100AGSinusoidalA(x float64) (antiGravity float64) {
	ampl := float64(15)   // amplitude
	pDiv := float64(8000) // period divisor
	phaseShift := math.Pi * (pDiv * 1.5)
	peakX := math.Pi * pDiv // x value of first sin peak where x > 0
	if x > peakX {
		// Cause the x value to limit to the x value of the first peak of the sin wave (ceiling).
		x = peakX
	}
	return (ampl * math.Sin((x+phaseShift)/pDiv)) + ampl + 1
}
/*
ecbp1100AGExpB is an exponential function with x as a base (and rationalized exponent).

OPTION 2: Slightly slower takeoff, steeper eventual ascent
*/
func ecbp1100AGExpB(x float64) (antiGravity float64) {
	return math.Pow(x, x*0.00002)
}
/*
ecbp1100AGExpA is an exponential function with x as exponent.

This was (one of?) Buterin's "original" specs:
> 1.0001 ** (number of seconds between when S1 was received and when S2 was received)
> gravity(B') = gravity(B) * 0.99 ^ n

OPTION 1 (Original ESS)
*/
func ecbp1100AGExpA(x float64) (antiGravity float64) {
	return math.Pow(1.0001, x)
}

A further alternative is demonstrated with a cosine function as follows, which, being squared, has an even flatter “ramp up” section. However, since difficulty adjustment is discrete and the primary case of consideration in this context is one of equivalent difficulty, efforts toward an ultra-low antigravity product seem well-intentioned but practically ineffectual.

s(x)=8 (-cos((0.5 π x)/(6000))+1)^(2)+1

Likewise, the exponents in the currently specified polynomial may be adjusted to modify the shape of the curve (increasing them will flatten it).