**Optimal** is a fancy word used in computer science to describe how one algorithm (or even data structure) compares to another.
There are various notions of optimality one can think of. One popular notion is worst-case running time, which is the one you describe:
> An algorithm for solving a problem P is asymptotically optimal with respect to worst-case running time if it runs in time T(n), and any algorithm for P runs in time Ω(T(n)).
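As a concrete example (my own, not part of the definition above): in the comparison model, merge sort uses O(n log n) comparisons, and the standard counting argument shows that every comparison-based sorting algorithm needs Ω(n log n) comparisons in the worst case, so merge sort is asymptotically optimal in that model. A minimal Python sketch (the function name `merge_sort` is mine):

```python
def merge_sort(a):
    """Sort a list using O(n log n) comparisons, which is asymptotically
    optimal for comparison sorting (the lower bound is Omega(n log n))."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```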
In fact, this notion is ambiguous, since it's not clear what the parameter n is. Indeed, in some cases there are several relevant parameters, which we sometimes want to consider at the same time. This is the case with graph algorithms, in which there are two natural parameters: the number of vertices and the number of edges.
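For instance (again a sketch of my own, with hypothetical names like `bfs_distances`): breadth-first search on a graph with n vertices and m edges runs in time O(n + m), which is asymptotically optimal with respect to both parameters, since any correct algorithm has to read every vertex and edge at least once.

```python
from collections import deque

def bfs_distances(adj, source):
    """Breadth-first search over an adjacency-list graph.
    Runs in O(n + m) time: each vertex is enqueued at most once and
    each undirected edge is scanned at most twice (once per endpoint)."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:          # visit each vertex only once
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Usage: adjacency list for the undirected path 0 - 1 - 2
print(bfs_distances({0: [1], 1: [0, 2], 2: [1]}, 0))  # {0: 0, 1: 1, 2: 2}
```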
An additional point of ambiguity is the computation model against which everything is measured. In theoretical work, we usually consider the word RAM as our computation model, but other models are possible.
The definition above is about the asymptotic running time. This manifests itself in two ways. First, we only care about the time complexity of the algorithm up to constant factors. Second, we only care about the behavior for large n. In practice, sometimes n is fixed, and then the definition above doesn't make any sense.
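To illustrate the "large n" caveat with a hedged sketch (the cutoff value 32 below is an arbitrary assumption, not a recommendation): an asymptotically slower algorithm such as insertion sort, which is Θ(n²) in the worst case, often beats an asymptotically optimal one on very small inputs because of smaller constant factors. This is why practical merge sorts commonly switch to insertion sort below some cutoff.

```python
CUTOFF = 32  # arbitrary threshold; the best value is machine-dependent

def insertion_sort(a):
    """Theta(n^2) in the worst case, but very low constant factors on tiny inputs."""
    for i in range(1, len(a)):
        x, j = a[i], i - 1
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a

def hybrid_merge_sort(a):
    """Merge sort that falls back to insertion sort on small subarrays.
    Still O(n log n) asymptotically, but faster in practice for small n."""
    if len(a) <= CUTOFF:
        return insertion_sort(list(a))
    mid = len(a) // 2
    left, right = hybrid_merge_sort(a[:mid]), hybrid_merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged
```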
While worst-case running time is a popular measure, other measures are also relevant. Examples include average-case running time (when there is a natural input distribution) and memory consumption. Only you can say which notion of optimality is most relevant for you.