# Evolution of Integer Multiplication Algorithms (1960 to 2019)


We started with an O(N^{2}) time integer multiplication algorithm, and 1960 was the first time we developed a faster one, running in O(N^{1.58}) time. In 1971, it was conjectured that the fastest possible algorithm would run in O(N logN) time. Note that all complexities are for multiplying two N-digit numbers.

It took us nearly 60 years to go from O(N^{2}) to O(N logN), but it has been an interesting journey. We are living in an important time, as this could be **one of the few fundamental topics** that we are able to optimize to the limit and understand deeply.

In 1971, the Schonhage-Strassen algorithm was developed. It runs in O(N * logN * loglogN) time and held the record for 36 years before being beaten by Furer's algorithm in 2007. Since then, progress has been steady, culminating in an O(N logN) algorithm discovered in March 2019, which may mark the end of this human quest.

The following table summarizes the algorithms that defined this era:

| Algorithm | Complexity | Year | Notes |
|---|---|---|---|
| School multiplication | O(N^{2}) | 100 BC | - |
| Russian Peasant method | O(N^{2} * logN) | 1000 AD | - |
| Karatsuba algorithm | O(N^{1.58}) | 1960 | - |
| Toom-Cook multiplication | O(N^{1.46}) | 1963 | - |
| Schonhage-Strassen algorithm | O(N * logN * loglogN) | 1971 | FFT |
| Furer's algorithm | O(N * logN * 2^{O(log*N)}) | 2007 | - |
| DKSS algorithm | O(N * logN * 2^{O(log*N)}) | 2008 | Modular arithmetic |
| Harvey, van der Hoeven and Lecerf | O(N * logN * 2^{3 log*N}) | 2015 | Mersenne primes |
| Covanov and Thomé | O(N * logN * 2^{2 log*N}) | 2015 | Fermat primes |
| Harvey and van der Hoeven | O(N * logN) | March 2019 | Possible end |

Note that the time complexity is for multiplying two N digit numbers.

# Basic approaches O(N^2)

Integer multiplication starts with the basic approach taught in school, which has a time complexity of O(N^{2}). Though it took a significant time to improve on this complexity, that did not stop us from making improvements within it.
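As a sketch, the schoolbook method can be written down directly: every digit of one number is multiplied by every digit of the other, which is where the N * N = N^{2} digit products come from.

```python
# A minimal sketch of schoolbook multiplication on digit lists
# (least significant digit first). Every digit pair is multiplied,
# which is the source of the O(N^2) cost.
def school_multiply(a, b):
    result = [0] * (len(a) + len(b))
    for i, da in enumerate(a):
        carry = 0
        for j, db in enumerate(b):
            cur = result[i + j] + da * db + carry
            result[i + j] = cur % 10   # keep one digit in this position
            carry = cur // 10          # push the excess upward
        result[i + len(b)] += carry
    return result                      # digits of the product

# 12 * 34 = 408, as digit lists (least significant digit first)
print(school_multiply([2, 1], [4, 3]))  # [8, 0, 4, 0]
```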

One improvement was to reduce multiplication to repeated additions, since addition is much faster in computing systems. This may not hold in modern systems, where several optimizations come into the picture.
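For illustration, the reduction to additions is trivial to write down; note that the number of additions grows with the value of the multiplier, not its digit count:

```python
# Multiplication as repeated addition: add b to itself a times.
def multiply_by_addition(a, b):
    total = 0
    for _ in range(a):
        total += b
    return total

print(multiply_by_addition(6, 7))  # 42
```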

Another approach was to do multiplication using powers of two. Though it increases the time complexity, the real-world performance is good because multiplication by 2 is done using a left shift, which takes constant O(1) time. Read more about the Russian Peasant algorithm at OpenGenus.
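A minimal sketch of the Russian Peasant method: halve one operand and double the other with shifts, adding the doubled value whenever the halved one is odd.

```python
# Russian Peasant multiplication using only shifts and additions.
def russian_peasant(a, b):
    result = 0
    while a > 0:
        if a & 1:        # a is odd: the current b contributes to the product
            result += b
        a >>= 1          # halve a (right shift)
        b <<= 1          # double b (left shift)
    return result

print(russian_peasant(18, 23))  # 414
```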

There are several other variants, but the real progress was made from the 1960s onwards, when we discovered several great algorithms over the next half century.

# 1960s: O(N^2) to O(N^1.58) to O(N^1.46)

Everything started with the **Karatsuba algorithm**, the **first algorithm** to show that integer multiplication can be done faster than O(N^{2}). It came at a time when scientists were stuck in this field.

Karatsuba's algorithm was discovered in 1960 by Anatoly Karatsuba. It is based on a divide-and-conquer approach: each number is split into two halves, and the product is assembled from three recursive multiplications instead of the obvious four, giving a time complexity of O(N^{1.58}).
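The idea can be sketched in a few lines: split each number into a high and a low half, and recover the cross terms from a single extra multiplication.

```python
# Sketch of Karatsuba multiplication: three recursive multiplications
# instead of four, giving the O(N^1.58) bound.
def karatsuba(x, y):
    if x < 10 or y < 10:            # base case: single-digit multiply
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    high_x, low_x = divmod(x, 10 ** m)
    high_y, low_y = divmod(y, 10 ** m)
    z0 = karatsuba(low_x, low_y)                    # low * low
    z2 = karatsuba(high_x, high_y)                  # high * high
    # one multiplication of the sums recovers both cross terms
    z1 = karatsuba(low_x + high_x, low_y + high_y) - z0 - z2
    return z2 * 10 ** (2 * m) + z1 * 10 ** m + z0

print(karatsuba(1234, 5678))  # 7006652
```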

The next step came quickly, in 1963, with Toom-Cook multiplication, which generalizes Karatsuba's idea by splitting each number into more than two parts. The 3-way version has a time complexity of O(N^{1.46}).

This field was under active research, and news of progress spread quickly even in those days before the internet. Around seven years later came a result that shaped the field for decades.

# 1970s: The beginning of a new challenge

The Schonhage-Strassen algorithm is one of the greatest advances in this domain of integer multiplication. It was formulated in 1971 and remained the **fastest integer multiplication algorithm for 36 years**.

It has a time complexity of O(N * logN * loglogN) and uses the idea of the Fast Fourier Transform (FFT).
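The core FFT idea can be sketched as follows. This is a simplified illustration using floating-point FFT, not the actual Schonhage-Strassen algorithm (which works with exact modular arithmetic): treat the digits as polynomial coefficients, multiply pointwise in the frequency domain, transform back, and propagate carries.

```python
import cmath

# Recursive Cooley-Tukey FFT; the length of a must be a power of two.
def fft(a, invert=False):
    n = len(a)
    if n == 1:
        return list(a)
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def fft_multiply(x, y):
    # digits, least significant first, used as polynomial coefficients
    a = [int(d) for d in str(x)[::-1]]
    b = [int(d) for d in str(y)[::-1]]
    n = 1
    while n < len(a) + len(b):
        n *= 2
    fa = fft([complex(v) for v in a + [0] * (n - len(a))])
    fb = fft([complex(v) for v in b + [0] * (n - len(b))])
    # pointwise product in the frequency domain = convolution of digits
    coeffs = fft([u * v for u, v in zip(fa, fb)], invert=True)
    digits = [round((c / n).real) for c in coeffs]
    carry, value = 0, 0
    for i, d in enumerate(digits):   # propagate carries back to base 10
        carry += d
        value += (carry % 10) * 10 ** i
        carry //= 10
    return value

print(fft_multiply(123456789, 987654321) == 123456789 * 987654321)  # True
```

The floating-point rounding here limits the input size; replacing it with number-theoretic transforms over a modular ring is what makes the real algorithm exact.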

Though this was a major step, it took several years to improve over this. This gave a sense that the domain was getting too complicated to tackle but as we know, we eventually did it.

# 2007: Breakthrough

Furer's algorithm was a major breakthrough, as no fundamental progress had been made from 1971 to 2007. It showed that further progress was possible, improving the time complexity to O(N * logN * 2^{O(log*N)}).

It improves on the loglogN factor of the Schonhage-Strassen algorithm, though the gain only shows for astronomically large numbers (on the order of 2^{2^{64}}).

Despite this, it remains of theoretical interest only, because of several significant challenges to its use in practical applications. It opened up a whole new interest in the domain, and in the following years several optimizations were proposed, but none made the approach practical. Hence, the Schonhage-Strassen algorithm continued to be used in practice.

# Further improvements till O(N logN)

Several improvements over Furer's algorithm have been made since 2007.

The DKSS algorithm was a notable approach, as it achieved the same time complexity as Furer's algorithm while relying on modular arithmetic and being conceptually simpler. It came out in 2008 and has a time complexity of O(N * logN * 2^{O(log*N)}). It is faster than the Schonhage-Strassen algorithm only for numbers greater than 10^{104796}.

In 2015, Harvey, van der Hoeven and Lecerf came up with an algorithm with a better-bounded constant than Furer's algorithm. It relies on Mersenne primes and has a time complexity of O(N * logN * 2^{3 log*N}), where the constant in the exponent is 3, while in Furer's algorithm it is not explicitly bounded and can be larger (such as 8).

In the same year, 2015, Covanov and Thomé came up with another algorithm, based on Fermat primes, that improved the constant to 2, giving a time complexity of O(N * logN * 2^{2 log*N}).

Despite these improvements, the algorithms were not suitable for practical use, and only minimal gains were being made. On a positive note, we now have several algorithms built on different core ideas.

The major progress was made in March 2019 by Harvey and van der Hoeven, who proposed an algorithm with a time complexity of O(N logN). This is significant because in 1971 Volker Strassen conjectured that the best possible complexity for integer multiplication is O(N logN), so we may have reached the end. The result is still being verified.

Though several different approaches may come in the years to follow, we have come a long way and perhaps optimized this fundamental operation to its limit.

# Read some papers on Integer Multiplication

All Research papers on Integer Multiplication (1960 to 2019) by OpenGenus

DKSS Algorithm (PDF) in 2008

Harvey, Hoeven, Lecerf Algorithm (PDF) in 2015

With this, you have a complete overall picture of this fundamental domain of integer multiplication. Enjoy and keep learning.