Data Structure and Algorithms Using C++. Sachi Nandan Mohanty


       Introduction

      An important question is: How efficient is an algorithm or piece of code? Efficiency covers lots of resources, including:

      CPU (time) usage

      Memory usage

      Disk usage

      Network usage

      Be careful to differentiate between:

      Performance: how much time/memory/disk/... is actually used when a program is running. This depends on the machine, compiler, etc., as well as the code.

Complexity: how the resource requirements of a program or algorithm scale, i.e., what happens as the size of the problem being solved gets larger. Complexity affects performance, but not the other way around. The time required by a method is proportional to the number of "basic operations" that it performs. Here are some examples of basic operations:

one arithmetic operation (e.g., +, *)

one assignment

one test (e.g., x == 0)

one read

one write (of a primitive type)

       Note: As an example,

      O(1) refers to constant time.

      O(n) indicates linear time;

O(n^k) (k fixed) refers to polynomial time;

      O(log n) is called logarithmic time;

O(2^n) refers to exponential time, etc.

n^2 + 3n + 4 is O(n^2), since n^2 + 3n + 4 < 2n^2 for all n > 10. Strictly speaking, 3n + 4 is O(n^2) too, but big-O notation is often misused to mean equal to rather than less than.
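If you want to see the constant at work, a quick numerical spot-check in C++ (illustrative only; it samples a finite range rather than proving the inequality) confirms the bound used above:

#include <iostream>

int main() {
    // Spot-check: n^2 + 3n + 4 < 2n^2 for all n > 10 (sampled up to 1000).
    for (long long n = 11; n <= 1000; n++) {
        long long lhs = n * n + 3 * n + 4;
        long long rhs = 2 * n * n;
        if (lhs >= rhs) {
            std::cout << "bound fails at n = " << n << '\n';
            return 1;
        }
    }
    std::cout << "bound holds for 10 < n <= 1000" << '\n';
    return 0;
}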

      In general, how can you determine the running time of a piece of code? The answer is that it depends on what kinds of statements are used.

       1. Sequence of statements

statement 1;
statement 2;
...
statement k;

total time = time(statement 1) + time(statement 2) + ... + time(statement k)

      If each statement is “simple” (only involves basic operations) then the time for each statement is constant and the total time is also constant: O(1). In the following examples, assume the statements are simple unless noted otherwise.
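As a concrete illustration (the variable names are made up for this sketch), every statement below is a basic operation, so the whole sequence is O(1) no matter how large any input might be:

#include <iostream>

int main() {
    int x = 5;                               // one assignment
    int y = x * 3;                           // one arithmetic operation and one assignment
    bool isZero = (y == 0);                  // one test and one assignment
    std::cout << y << ' ' << isZero << '\n'; // writes of primitive types
    return 0;                                // a fixed number of basic operations: O(1)
}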

       2. if-then-else statements

if (cond) {
    sequence of statements 1
}
else {
    sequence of statements 2
}

Here, either sequence 1 will execute, or sequence 2 will execute. Therefore, the worst-case time is the slower of the two possibilities: max(time(sequence 1), time(sequence 2)). For example, if sequence 1 is O(N) and sequence 2 is O(1), the worst-case time for the whole if-then-else statement would be O(N).
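For instance, in the following C++ sketch (the function name, array, and flag are assumptions made for this example), the true branch is O(N) and the false branch is O(1), so the if-then-else statement as a whole is O(N) in the worst case:

// Worst case O(N): max(time of branch 1, time of branch 2) = max(O(N), O(1)).
int sumOrFirst(const int a[], int N, bool wantSum) {
    if (wantSum) {
        int total = 0;
        for (int i = 0; i < N; i++) {   // sequence of statements 1: O(N)
            total += a[i];
        }
        return total;
    } else {
        return a[0];                    // sequence of statements 2: O(1)
    }
}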

       3. for loops

for (i = 0; i < N; i++) {
    sequence of statements
}

      The loop executes N times, so the sequence of statements also executes N times. Since we assume the statements are O(1), the total time for the for loop is N * O(1), which is O(N) overall.
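A concrete C++ version of this pattern (the function and vector here are illustrative) does a constant amount of work per element, so the loop is N * O(1) = O(N):

#include <cstddef>
#include <vector>

// O(N): the body performs a fixed number of basic operations,
// and it runs once for each of the N elements.
int sum(const std::vector<int>& v) {
    int total = 0;
    for (std::size_t i = 0; i < v.size(); i++) {
        total += v[i];   // one arithmetic operation and one assignment
    }
    return total;
}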

       4. Nested loops

for (i = 0; i < N; i++) {
    for (j = 0; j < M; j++) {
        sequence of statements
    }
}

The outer loop executes N times, and for each outer iteration the inner loop executes M times. The sequence of statements therefore executes N * M times, so the total time for the nested loops is O(N * M).
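As a sketch of this pattern (the table-building function is hypothetical), filling an N-by-M table touches each of the N * M cells exactly once with constant work per cell, which is exactly the O(N * M) behaviour described above:

#include <vector>

// O(N * M): the outer loop runs N times and, for each outer iteration,
// the inner loop runs M times, so the assignment executes N * M times.
std::vector<std::vector<int>> makeTable(int N, int M) {
    std::vector<std::vector<int>> table(N, std::vector<int>(M));
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < M; j++) {
            table[i][j] = i * j;   // constant work per cell
        }
    }
    return table;
}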

 5. Statements with method calls

      When a statement involves a method call, the complexity of the statement includes the complexity of the method call. Assume that you know that method f takes constant time, and that method g takes time proportional to (linear in) the value of its parameter k. Then the statements below have the time complexities indicated.

f(k); // O(1)
g(k); // O(k)

      When a loop is involved, the same rule applies. For example:

       for (j = 0; j < N; j++) g(N);

has complexity O(N^2). The loop executes N times and each call g(N) has complexity O(N).
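To make the rule concrete, here is a sketch with stand-in definitions of f and g (they are not from the text, only chosen so that f is O(1) and g is O(k)); the loop calling g(N) then does N * O(N) = O(N^2) work:

// f takes constant time: a fixed number of basic operations.
int f(int k) {
    return k + 1;
}

// g takes time linear in k: its loop body runs k times.
long long g(int k) {
    long long total = 0;
    for (int i = 0; i < k; i++) {
        total += i;
    }
    return total;
}

int main() {
    const int N = 1000;
    long long work = f(N);      // one call to f: O(1)
    for (int j = 0; j < N; j++) {
        work += g(N);           // each call is O(N); N calls make the loop O(N^2)
    }
    return static_cast<int>(work % 2);   // use the result so it is not optimized away
}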

       Examples

 Q1. What is the worst-case complexity of each of the following code fragments?

      Two loops in a row:

for (i = 0; i < N; i++) {
    sequence of statements
}
for (j = 0; j < M; j++) {
    sequence of statements
}

Answer: The first loop is O(N) and the second loop is O(M). Since you do not know which is bigger, you say this is O(N + M). This can also be written as O(max(N, M)).

 Q2. How would the complexity change if the second loop went to N instead of M?

Answer: The complexity would be O(N). You can see this from either expression above: O(N + M) becomes O(2N), and when you drop the constant it is O(N); O(max(N, M)) becomes O(max(N, N)), which is O(N).

A nested loop followed by a non-nested loop:

for (i = 0; i < N; i++) {
    for (j = 0; j < N; j++) {
        sequence of statements
    }
}
for (k = 0; k < N; k++) {
    sequence of statements
}

Answer: The first set of nested loops is O(N^2) and the second loop is O(N). This is O(max(N^2, N)), which is O(N^2).

       Q3. A nested loop in which the number of times the inner loop executes depends on the value of the outer loop index:

for (i = 0; i < N; i++) {
    for (j = i; j < N; j++) {
        sequence of statements
    }
}

Answer: When i is 0, the inner loop executes N times. When i is 1, the inner loop executes N-1 times. In the last iteration of the outer loop, when i is N-1, the inner loop executes 1 time. The number of times the inner loop statements execute is therefore N + (N-1) + ... + 2 + 1. This sum is N(N+1)/2, which gives O(N^2).
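A small C++ check (illustrative, not from the text) counts how many times the inner body runs and compares it with the closed form N(N+1)/2:

#include <iostream>

int main() {
    const long long N = 100;
    long long count = 0;
    for (long long i = 0; i < N; i++) {
        for (long long j = i; j < N; j++) {
            count++;            // stands in for the sequence of statements
        }
    }
    // Both values print 5050: the body runs N + (N-1) + ... + 1 = N(N+1)/2 times.
    std::cout << count << ' ' << N * (N + 1) / 2 << '\n';
    return 0;
}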

       Q4. For each of the following loops with a method call, determine the overall complexity. As above, assume that method f takes constant time, and that method g takes time linear in the value of its parameter.

a. for (j = 0; j < N; j++) f(j);

b. for (j = 0; j < N; j++) g(j);

c. for (j = 0; j < N; j++) g(k);

      Answer: a. Each call to f(j) is O(1). The loop executes N times so it is N x O(1) or O(N).

b. The first time the loop executes, j is 0 and g(0) takes no operations. The next time, j is 1 and g(1) takes 1 operation. The last time the loop executes, j is N-1 and g(N-1) takes N-1 operations. The total work is the sum of the first N-1 numbers, 0 + 1 + ... + (N-1) = N(N-1)/2, which is O(N^2).

c. Each call g(k) is O(k), and k does not depend on the loop variable j. The loop executes N times, so the total time is N * O(k), which is O(N*k).
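A short C++ check of part (b), under the assumption that a call g(j) costs exactly j basic operations (a simplification for illustration), shows the total cost matching N(N-1)/2, which grows as O(N^2):

#include <iostream>

int main() {
    const long long N = 100;
    long long cost = 0;
    for (long long j = 0; j < N; j++) {
        cost += j;              // g(j) is assumed to cost j operations
    }
    // Both values print 4950: the total is 0 + 1 + ... + (N-1) = N(N-1)/2.
    std::cout << cost << ' ' << N * (N - 1) / 2 << '\n';
    return 0;
}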
