
Reading time: 15 minutes

BLAS (Basic Linear Algebra Subprograms) and BLIS (BLAS-like Library Instantiation Software) are libraries that have revolutionized scientific computing by accelerating the execution of mathematical operations on a vast range of platforms.

In short, BLIS is the new-generation alternative to BLAS, and the two offer comparable performance. The future is promising for BLIS.


Development of BLAS started in the 1970s, and the first version (Level 1 BLAS) was released in 1979. BLAS went through major development over the 1980s, when Level 2 and Level 3 operations were added.

As computer architecture has changed greatly since the 1970s and 1980s, BLAS needed a better replacement. There were several attempts along this path, but none survived. Finally, BLIS was developed in the 2000s with different goals and succeeded in becoming an alternative to BLAS.

BLIS focuses on dense linear algebra operations and portability across different architectures. It also provides a BLAS compatibility interface so that existing BLAS applications can be ported to it.


Examples of BLAS libraries are:

  • MKL (by Intel)
  • OpenBLAS

Examples of BLIS libraries are:

  • FLAME BLIS (by FLAME group at University of Texas)


Performance-wise, BLIS and BLAS are comparable.

MKL, one of the most efficient BLAS libraries and heavily optimized for Intel platforms, outperforms FLAME BLIS, but the difference is within 10%. This is impressive when compared with how other BLAS libraries perform on Intel platforms.

In general, without tying the comparison to a specific platform, BLIS performs better than BLAS libraries such as OpenBLAS.


In short, we can achieve the same operations in both BLIS and BLAS.

BLAS has some direct limitations; workarounds exist, but they can make development complex.

For instance:

  • With the exception of just a few level-1 routines, BLAS does not support operations where some operands are complex and some are real. For example, updating a complex vector by applying a real triangular matrix, via the TRMV operation, is not possible. One could work around this problem by copying the real triangular matrix to a temporary complex matrix of the same size (initializing the imaginary components to zero) and then computing with this complex matrix using ztrmv. However, this approach incurs considerable cost in workspace, memory operations, and code complexity, not to mention an unnecessary factor-of-two increase in floating-point operations.

Funding and Development

FLAME BLIS is being developed by FLAME group at University of Texas.
It is backed by grants from:

  • Microsoft
  • Intel
  • Texas Instruments
  • AMD
  • Oracle
  • Huawei
  • The National Science Foundation

BLAS has several implementations, with MKL as the leader.

MKL has been developed and funded by Intel.


If one's requirements are:

  • An open-source library for scientific computing like BLAS
  • No specific platform requirement, or a need for good average performance across all platforms

then, the best option available is FLAME BLIS.

If you want to do scientific computing on Intel platforms, the best option is MKL (which is not open-source).

Note that FLAME BLIS is backed by several companies including Intel, so the future is promising for BLIS.


Aditya Chatterjee


Aditya Chatterjee is an Independent Algorithmic Researcher, Software Developer and Technical Author. He is a founding member of OpenGenus, an organization focused on changing Internet consumption.


Improved & Reviewed by:

OpenGenus Tech Review Team