Get this book -> Problems on Array: For Interviews and Competitive Programming

In this article, we have presented the **Time Complexity analysis of different operations in Array**. It clears several misconceptions, such as the belief that accessing the i-th element takes O(1) time when, in reality, it takes O(√N) time. We have presented the space complexity of array operations as well.

**Table of contents**:

- Introduction to Array Data Structure
- Time Complexity Analysis of Array
- Space Complexity of Array

Let us get started with the Complexity Analysis of Array.

# Introduction to Array Data Structure

An array is a linear data structure where elements are arranged one after another in contiguous memory. An array is identified by a memory address M, which is the memory address of its first element.

In this view, the memory address of the i-th element = **M + (i-1) * S**

where:

- M is the memory address of the first element
- S is the size of each element.

Note: all elements of an array are of the same size.
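The address formula above can be sketched directly. The values of M and S below are illustrative, not real memory addresses:

```python
M = 1000  # memory address of the first element (illustrative value)
S = 4     # size of each element in bytes, e.g. a 32-bit integer

def element_address(i):
    """Return the memory address of the i-th element (1-indexed):
    M + (i - 1) * S."""
    return M + (i - 1) * S

print(element_address(1))  # 1000 -> the first element sits at M itself
print(element_address(5))  # 1016 -> four elements of 4 bytes past M
```

Because this is a single multiplication and addition, the address of any element can be computed in constant time regardless of the array's length.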

Arrays come in different dimensions, such as 2D arrays and 3D arrays. A 2D array is an array where each element is a 1D array.

Every D-dimensional array is stored as a 1D array internally. Elements of a D-dimensional array are arranged in the underlying 1D array using one of two approaches:

- Row Major
- Column Major

In Row Major, each 1D row is placed sequentially one by one.

Similarly, in Column Major, each 1D column is placed sequentially one by one.

Based on this layout, you can compute the memory address of a specific element in constant time.
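The two layouts differ only in how a (row, column) pair is flattened into a 1D index. A minimal sketch, using 0-indexed positions:

```python
# Flattened 1D index of element (r, c) in an R x C 2D array (0-indexed):
#   row major:    index = r * C + c   (rows placed one after another)
#   column major: index = c * R + r   (columns placed one after another)

def row_major_index(r, c, num_cols):
    return r * num_cols + c

def col_major_index(r, c, num_rows):
    return c * num_rows + r

R, C = 3, 4  # a 3 x 4 array

# Element at row 1, column 2:
print(row_major_index(1, 2, C))  # 6
print(col_major_index(1, 2, R))  # 7
```

Multiplying the resulting index by the element size S and adding the base address M gives the element's memory address, just as in the 1D case.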

# Time Complexity Analysis of Array

From this article, we learnt that fetching an element at a specific memory address takes O(√N) time, where N is the size of the block of memory being read.

Once the block of memory is in RAM (Random Access Memory), accessing a specific element takes constant time because we can calculate its relative address in constant time.

Bringing the block of memory from an external device to RAM takes O(√N) time. As array elements are contiguous in memory, this transfer takes place only once. Hence, it is reasonable to assume the time complexity of accessing an element to be O(1).

Overwriting an element at a specific index takes constant time O(1) because we only need to access the relative address of that index and write the new element. This is the same as accessing an element.

Note: even for this operation, we need to load the array from the external device, which consumes O(√N) time.

Inserting and deleting elements take linear time, depending on the implementation. If we want to insert an element at a specific index, we need to shift all elements from that index onward one position to the right. This takes linear time O(N).

If we want to insert an element at the end of the array, we can do it in constant time by keeping track of the length of the array as a member attribute. This approach is taken by the standard dynamic array implementation in Java (ArrayList), where appends take amortized O(1) time.

A similar approach applies to the delete operation: removing an element at a specific index shifts all later elements one position to the left, which takes O(N) time.
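The shifting described above can be sketched on a fixed-capacity array whose in-use length is tracked separately; the names and values here are illustrative:

```python
# Insert into / delete from a fixed-capacity array by shifting elements.
# 'length' counts the slots in use, tracked separately from capacity.

def insert_at(arr, length, index, value):
    """Shift arr[index..length-1] one slot right, then write value. O(N)."""
    for j in range(length, index, -1):
        arr[j] = arr[j - 1]
    arr[index] = value
    return length + 1  # new length

def delete_at(arr, length, index):
    """Shift arr[index+1..length-1] one slot left. O(N)."""
    for j in range(index, length - 1):
        arr[j] = arr[j + 1]
    return length - 1  # new length

arr = [10, 20, 30, 0, 0]  # capacity 5, first 3 slots in use
n = 3
n = insert_at(arr, n, 1, 15)  # arr[:n] -> [10, 15, 20, 30]
n = delete_at(arr, n, 2)      # arr[:n] -> [10, 15, 30]
print(arr[:n])  # [10, 15, 30]
```

Note that `insert_at(arr, n, n, value)` shifts nothing, which is why appending to the end is a constant-time operation as long as spare capacity remains.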

The Time Complexity of different operations in an array is:

Array operation | Real Time Complexity | Assumed Time Complexity
---|---|---
Access i-th element | O(√N) | O(1)
Traverse all elements | O(N + √N) | O(N)
Overwrite element at i-th index | O(√N) | O(1)
Insert element E | O(N + √N) | O(N)
Delete element E | O(N + √N) | O(N)

# Space Complexity of Array

The Space Complexity of the above array operations is O(1).

This is because we do not need extra space beyond a fixed number of variables.

For some operations, you may need extra space of the order of O(N). For example, sorting an array with a sorting algorithm that is not in-place, such as merge sort, requires O(N) auxiliary space.

With this article at OpenGenus, you have the perfect idea of the Time Complexity of Array.