# Simplifying the Analysis Further

In algorithm analysis, we focus on the growth rate of the running time as a function of the input size n, taking a "big-picture" approach, rather than being bogged down with small details. It is often enough just to know that the running time of an algorithm such as arrayMax, given in Section 1.9.2, grows proportionally to n, with its true running time being n times a constant factor that depends on the specific computer.
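For concreteness, the idea can be sketched as a Java method in the style of arrayMax (this is an illustrative rendering, not necessarily identical to the version in Section 1.9.2). The loop body executes once per remaining element, which is why the running time grows proportionally to n:

```java
public class ArrayMaxDemo {
    /** Returns the maximum element of a nonempty array a.
     *  The loop runs a.length - 1 times, so the running time
     *  grows proportionally to n = a.length. */
    public static int arrayMax(int[] a) {
        int currentMax = a[0];               // current candidate maximum
        for (int i = 1; i < a.length; i++)   // n - 1 iterations
            if (a[i] > currentMax)
                currentMax = a[i];
        return currentMax;
    }

    public static void main(String[] args) {
        int[] data = {3, 27, 5, 18, 9};
        System.out.println(arrayMax(data)); // prints 27
    }
}
```

The constant factor hidden by "proportionally to n" depends on how fast a particular computer executes the comparison and assignment inside the loop, which is exactly the detail the big-picture analysis sets aside.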

Let f(n) and g(n) be functions mapping nonnegative integers to real numbers. We say that f(n) is O(g(n)) if there is a real constant c > 0 and an integer constant n0 ≥ 1 such that

f(n) ≤ c·g(n), for n ≥ n0.

This definition is often referred to as the "big-Oh" notation, for it is sometimes pronounced as "f(n) is big-Oh of g(n)." Alternatively, we can also say "f(n) is order of g(n)." (This definition is illustrated in Figure 4.5.)

Example 4.6: The function 8n − 2 is O(n).

Justification: By the big-Oh definition, we need to find a real constant c > 0 and an integer constant n0 ≥ 1 such that 8n − 2 ≤ cn for every integer n ≥ n0. It is easy to see that a possible choice is c = 8 and n0 = 1. Indeed, this is one of infinitely many choices available, because any real number greater than or equal to 8 will work for c, and any integer greater than or equal to 1 will work for n0.

The big-Oh notation is used widely to characterize running times and space bounds in terms of some parameter n, which varies from problem to problem, but is always defined as a chosen measure of the "size" of the problem. For example, if we are interested in finding the largest element in an array of integers, as in the arrayMax algorithm, we should let n denote the number of elements of the array. Using the big-Oh notation, we can write the following mathematically precise statement on the running time of algorithm arrayMax for any computer.
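As a quick sanity check on the constants chosen in Example 4.6, the inequality 8n − 2 ≤ c·n can be tested numerically for a range of n. The helper below (a hypothetical name, written for this illustration) confirms that c = 8 with n0 = 1 works, while the smaller constant c = 7 fails once n > 2:

```java
public class BigOhCheck {
    /** Checks whether 8n - 2 <= c*n holds for every integer n
     *  with n0 <= n <= nMax (a finite spot check, not a proof). */
    static boolean witnessHolds(long c, long n0, long nMax) {
        for (long n = n0; n <= nMax; n++) {
            if (8 * n - 2 > c * n)  // f(n) <= c*g(n) fails at this n
                return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(witnessHolds(8, 1, 1_000_000)); // prints true
        // c = 7 is too small: 8n - 2 > 7n as soon as n > 2.
        System.out.println(witnessHolds(7, 1, 1_000_000)); // prints false
    }
}
```

Of course, a finite check is no substitute for the algebraic argument in the justification; it merely illustrates that the chosen c and n0 behave as claimed.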
