O(log n) is a notation used in algorithm analysis to describe the time complexity, or performance, of an algorithm. It signifies that the algorithm's running time grows logarithmically as the input size (n) increases. In simpler terms, as the input grows, the time needed to process it grows much more slowly than it would under linear or quadratic time complexities. This behavior is common in algorithms that repeatedly halve the input, such as binary search: on a sorted list, each step halves the portion of the list that remains to be searched, so even very large inputs require only a small number of steps.
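As a concrete illustration, here is a minimal sketch of binary search in Python. The function name `binary_search` and the `target` parameter are illustrative choices, and the list is assumed to be sorted in ascending order.

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2      # midpoint of the remaining range
        if items[mid] == target:
            return mid            # found it
        elif items[mid] < target:
            lo = mid + 1          # discard the lower half
        else:
            hi = mid - 1          # discard the upper half
    return -1                     # target not present

# Each iteration halves the search range, so a list of 1,000,000
# elements needs at most about 20 comparisons (log2(1,000,000) ≈ 20).
print(binary_search([1, 3, 5, 7, 9, 11], 7))   # -> 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -> -1
```

Because the remaining range shrinks by half on every iteration, the number of iterations is at most log2(n), which is exactly what the O(log n) bound expresses.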