D/L Method vs VJD Method: A perfect system for imperfect reality?

In 1998, India were part of a tri-series in Sri Lanka, with New Zealand the third team. In a league match, the Indian bowlers restricted New Zealand to 219 for eight. In reply, India were cruising at 131 for two in 24.2 overs when rain forced the match to be abandoned. Under the rules then in force, the team batting second had to face at least 25 overs for a rain-affected ODI to count as a completed match, so the points were shared.

Incredibly enough, had four more balls been bowled, New Zealand would have fancied their chances more than India, even though India needed just 88 more runs with 25.4 overs and eight wickets to go, at a required run-rate of 3.43 per over. Under the rain rules at that point, India needed to have scored 147 in 25 overs. The archaic rule thus demanded that two-thirds of the runs required be scored in half the overs, with only two wickets lost.

The 1992 World Cup semifinal between England and South Africa is often cited as the major reason cricket needed a better rain rule, but until the Duckworth-Lewis system (D/L) was introduced in 1999, the game lived with many other absurd results. The D/L method liberated cricket from the most glaring inequities with a system that was elegant, took both overs left and wickets lost into account, and, most importantly, did not deliver results that felt intuitively wrong.

The only serious challenger to the D/L method has been developed by V Jayadevan, a Kerala-based engineer. For most people who follow the game, both methods are still somewhat mysterious. That they are different is obvious, but given the raging debate about which is ‘better’, how are they different?

The D/L method essentially combines wickets in hand and overs remaining in a limited-overs match into one measure, the ‘Resource Percentage’ (RP). At any point of an innings, D/L measures the RP left for a given number of overs remaining and wickets in hand. D/L has worked so well – particularly in a 50-over context – because it hit upon an equation that maps, very effectively, how much of a team’s resources have been used up at any given point.

It isn’t without flaws. The original versions of D/L placed too much emphasis on wickets in hand – for example, chasing a first-innings score of 300, a team on 94 for no loss after 25 overs would have been declared winners had rain ended the match there. This has been revised in later editions, but the basic principle remains the same: D/L uses an exponential decay function to represent the steady loss of resources as a team’s overs and/or wickets wind down.
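The broad shape of that decay is easy to sketch in code. The constants and wicket factors below are invented purely for illustration – the official tables use carefully calibrated values, and the Professional Edition’s are not public – but the structure of the calculation is the same: resources remaining depend on overs left and on how much batting potential the wickets in hand represent.

```python
import math

# Illustrative D/L-style resource curve. These constants are NOT the official
# Duckworth-Lewis values; they are toy numbers chosen only to show the
# exponential-decay shape the method relies on.
Z0 = 280.0   # hypothetical asymptotic score with unlimited overs and 10 wickets in hand
B = 0.035    # hypothetical decay constant per over

# Hypothetical fraction of batting potential left after w wickets have fallen
F = [1.00, 0.93, 0.85, 0.74, 0.62, 0.49, 0.36, 0.24, 0.14, 0.07, 0.00]

def resource_percent(overs_left: float, wickets_lost: int) -> float:
    """Resources remaining, as a percentage of a full 50-over, 10-wicket innings."""
    fw = F[wickets_lost]
    if fw == 0.0:
        return 0.0
    z = Z0 * fw * (1 - math.exp(-B * overs_left / fw))
    z_full = Z0 * (1 - math.exp(-B * 50))   # a full, uninterrupted innings
    return 100 * z / z_full

print(resource_percent(50, 0))   # 100.0 - the innings is yet to start
print(resource_percent(25, 2))   # ~66 with these toy constants
```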

Jayadevan’s method (VJD) is very different. While D/L has one curve, Jayadevan has two. He uses regression (the details of which are beyond the scope of this article) to arrive at a cubic polynomial – in layman’s terms, an equation of the form ax³ + bx² + cx + d. The shape of such a curve is akin to a section of a sine wave, and Jayadevan argues that it mirrors a typical ODI innings more accurately, with acceleration at the start followed by a deceleration and a final acceleration. VJD also comes with a ready-reckoner table, which gives values in terms of percentages of overs and wickets lost and returns a par score for the given state of the match.
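A toy version of such a fit can be written in a few lines. The coefficients below are made up for illustration – Jayadevan’s published curves come from regression on real match data – but they show how a cubic maps the percentage of overs a side has used to the percentage of its runs it would ‘normally’ have scored by then.

```python
def normal_score_percent(overs_used_percent: float) -> float:
    """Cubic fit in the VJD spirit: share of the eventual total a side would
    'normally' have scored after using this share of its overs.
    The coefficients are illustrative only, not Jayadevan's published values."""
    x = overs_used_percent / 100.0
    a, b, c, d = 1.6, -2.1, 1.5, 0.0   # hypothetical cubic coefficients
    return 100.0 * (a * x**3 + b * x**2 + c * x + d)

for pct in (25, 50, 75, 100):
    print(pct, round(normal_score_percent(pct), 1))   # 26.9, 42.5, 61.9, 100.0
```

With these toy numbers the implied scoring rate is above average in the opening overs, dips in the middle and climbs sharply at the death – the sort of profile a ‘normal’ scoring curve tries to capture.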

For matches that follow a normal course, both methods give fairly similar results. For example, if the team batting first (Team A) scores 280 and the team batting second (Team B) has only 40 overs to bat, the standard edition of D/L pins the target at 251, while VJD’s latest method has it at 244. Or take the case of Team A’s innings stopping at 150 for 4 in 35 overs, with Team B getting 35 overs to bat: the standard edition of D/L puts Team B’s target at 210, while VJD gives 198. So far, so good.
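For completeness, here is how the first of those D/L targets comes about. When the chasing side has fewer resources than the side batting first, the Standard Edition arithmetic reduces to scaling the first-innings score by the ratio of the two resource percentages. The sketch below assumes the resource table gives roughly 89.3% for an uninterrupted 40-over innings (the exact figure depends on the edition of the tables) and uses a G50 of 245 only because it is a commonly quoted default.

```python
import math

def dl_standard_target(team1_score: int, r1: float, r2: float, g50: int = 245) -> int:
    """Standard-Edition-style target arithmetic (a sketch, not the official code).
    r1, r2 are the resource percentages available to the two sides; G50, the
    assumed average 50-over score, matters only when the chasing side has MORE
    resources than the side batting first."""
    if r2 <= r1:
        par = team1_score * r2 / r1
    else:
        par = team1_score + g50 * (r2 - r1) / 100.0
    return math.floor(par) + 1   # target to win = par score + 1

# The 280-in-50 versus 40-overs-to-chase example from the text:
print(dl_standard_target(280, 100.0, 89.3))   # -> 251
```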

The problems with application arise in Twenty20 cricket. Spread over 50 overs, the game gives enough scope for reasonable approximations, but as it gets more compressed, the same approximations stop working. Twenty20 is itself an evolving format, and neither state, franchise nor international teams have mastered how best to approach an innings the way they have in ODIs. With the game itself still evolving, any method that approximates a result in a rain-shortened match is bound to encounter teething problems.

The most famous case of D/L going awry in a T20 match was in the World T20 2010 match between England and the West Indies. England made a formidable 191 for five, and a rain interruption meant D/L came into play. The West Indies’ revised target was 60 from six overs – clearly an easier task than 192 in 20. VJD would have given the target as 64 – not much of a difference.

VJD has worked well in some of the limited number of matches in which it has been applied – notably the Syed Mushtaq Ali Trophy and the now-defunct Indian Cricket League – but in many T20 scenarios D/L has worked well too. Importantly, faced with the extreme scenario of a very good 20-over score compressed into a six-over target, VJD’s results weren’t too different from D/L’s.

Perhaps there’s a case for the ICC to reconsider what constitutes a complete match in T20 cricket. If 20 overs – 40% of a full innings – is the base taken for ODIs, then by the same yardstick, 8 overs (40% of 20) should be the minimum needed to decide the result of a T20 match.

To come back to the match this article started with, both D/L and VJD would have placed India as comfortable victors in that 1998 game against New Zealand, and by very similar margins: VJD put the par score at 81, D/L at 85. Whichever method is eventually used, cricket fans can be thankful that the most obvious anomalies have at least been eradicated.