A contiguous block of memory providing O(1) indexed access -- one of the most fundamental building blocks in programming.

Theory & Description

An array is a linear data structure that stores elements in contiguous memory locations, where each element is identified by a zero-based numerical index. Because elements sit side-by-side in memory, the processor can jump directly to any position using simple pointer arithmetic, yielding O(1) random access -- the defining advantage of arrays over linked structures.
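
To make the pointer arithmetic concrete, here is a minimal C sketch (the array name scores is illustrative): arr[i] is defined as *(arr + i), so reading any index is a single address computation, base plus i times the element size.

    #include <stdio.h>

    int main(void) {
        int scores[5] = {10, 20, 30, 40, 50};
        int i = 3;

        /* arr[i] is defined as *(arr + i): the base address plus an
           offset of i * sizeof(int). No traversal occurs, so access
           is O(1) regardless of the array's length. */
        printf("scores[%d] = %d\n", i, scores[i]);
        printf("*(scores + %d) = %d\n", i, *(scores + i));

        /* Both expressions refer to the same memory location. */
        printf("same address: %s\n",
               (&scores[i] == scores + i) ? "true" : "false");
        return 0;
    }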

Arrays come in two flavors: static arrays have a fixed size determined at allocation time (e.g., int arr[10] in C/C++), while dynamic arrays (Python lists, Java ArrayList, C++ std::vector) resize automatically when capacity is exceeded by allocating a larger block and copying the existing elements over. This resizing strategy gives append operations an amortized O(1) cost, though a single append can occasionally take O(n) when it triggers a full copy.
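
The doubling strategy behind amortized O(1) appends can be sketched in a few lines of C; the vec struct and vec_push function below are illustrative names, not part of any standard library.

    #include <stdio.h>
    #include <stdlib.h>

    typedef struct {
        int *data;
        size_t size;     /* elements currently in use */
        size_t capacity; /* allocated slots */
    } vec;

    /* Append one element, doubling capacity when full. Most calls are
       O(1); the occasional realloc-and-copy is O(n), which averages
       out to amortized O(1) per append. */
    void vec_push(vec *v, int value) {
        if (v->size == v->capacity) {
            v->capacity = v->capacity ? v->capacity * 2 : 4;
            int *grown = realloc(v->data, v->capacity * sizeof(int));
            if (!grown) { perror("realloc"); exit(EXIT_FAILURE); }
            v->data = grown;
        }
        v->data[v->size++] = value;
    }

    int main(void) {
        vec v = {NULL, 0, 0};
        for (int i = 0; i < 10; i++)
            vec_push(&v, i * i);
        /* Capacity grew 4 -> 8 -> 16 while size reached 10. */
        printf("size=%zu capacity=%zu last=%d\n",
               v.size, v.capacity, v.data[v.size - 1]);
        free(v.data);
        return 0;
    }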

The trade-off is that insertion or deletion in the middle requires shifting all subsequent elements, making those operations O(n). Arrays are best when the collection size is relatively stable and fast index-based lookups are the priority. They also exhibit excellent cache locality because sequential elements occupy adjacent cache lines, making iteration significantly faster in practice than pointer-based structures like linked lists.
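
A minimal C sketch of a middle insertion (the array contents and names are illustrative): every element after the insertion point must shift one slot to the right, which is where the O(n) cost comes from.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        int arr[8] = {1, 2, 3, 5, 6};  /* 5 slots used, capacity 8 */
        size_t size = 5;
        size_t pos = 3;                /* insert before index 3 */

        /* Shift the tail right by one slot: size - pos elements move,
           so insertion is O(n) in the worst case (pos == 0). */
        memmove(&arr[pos + 1], &arr[pos], (size - pos) * sizeof(int));
        arr[pos] = 4;
        size++;

        for (size_t i = 0; i < size; i++)
            printf("%d ", arr[i]);     /* prints: 1 2 3 4 5 6 */
        printf("\n");
        return 0;
    }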

Common Use Cases

  • Storing and accessing sequential data
  • Implementing lookup tables and buffers
  • Basis for dynamic arrays, stacks, and queues
  • Matrix and image representation
  • Sorting and binary search algorithms (see the sketch after this list)
  • Sliding window and two-pointer techniques
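
As one worked example from the list above, here is a minimal binary search sketch in C (the function name binary_search is illustrative; the input array must already be sorted). Jumping straight to the midpoint relies on O(1) indexed access, which is what keeps the search at O(log n).

    #include <stdio.h>

    /* Returns the index of target in sorted[0..n-1], or -1 if absent. */
    int binary_search(const int *sorted, int n, int target) {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;  /* avoids overflow of lo + hi */
            if (sorted[mid] == target)
                return mid;
            else if (sorted[mid] < target)
                lo = mid + 1;
            else
                hi = mid - 1;
        }
        return -1;
    }

    int main(void) {
        int sorted[] = {2, 3, 5, 8, 13, 21, 34};
        int n = sizeof(sorted) / sizeof(sorted[0]);
        printf("index of 13: %d\n", binary_search(sorted, n, 13)); /* 4 */
        printf("index of 7:  %d\n", binary_search(sorted, n, 7));  /* -1 */
        return 0;
    }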

Complexity Cheat Sheet

Operation            Average   Worst
Access               O(1)      O(1)
Search               O(n)      O(n)
Insertion (middle)   O(n)      O(n)
Append (end)         O(1)*     O(n)
Deletion             O(n)      O(n)

Space complexity: O(n)

* Amortized: an occasional append triggers a full resize-and-copy, which costs O(n).
