Understanding time complexity of recursive algorithms

January 24th, 2015 @ 5PM by Boštjan Cigan

The time complexity of an algorithm is commonly expressed using big O notation, which excludes coefficients and lower order terms. It is commonly estimated by counting the number of elementary operations performed by the algorithm, where an elementary operation takes a fixed amount of time to perform. Thus the amount of time taken and the number of elementary operations performed by the algorithm differ by at most a constant factor.

Computing the time complexity of recursive algorithms is more difficult, but we know two methods for it: one is the master theorem [1] and the other is the Akra-Bazzi method [2]. The latter uses a more mathematical approach. In this article I will not explain what big O notation is (I am assuming that the reader already knows it); I will only explain how to use both of these methods to calculate the time complexity of recursive algorithms.

THE MASTER THEOREM

The master theorem concerns recurrence relations of the form:

T(n) = a \; T\!\left(\frac{n}{b}\right) + f(n)

where a \geq 1, b > 1, and

  • n is the size of the problem,
  • a is the number of subproblems in the recursion,
  • \frac{n}{b} is the size of each subproblem,
  • f(n) is the cost of the work done outside the recursive calls (this includes the cost of dividing the problem and of merging the solutions of the subproblems).

That is the mathematical definition; let us put it into a form more convenient for calculating time complexity. Let

T(n) = \begin{cases} c & n = 1 \\ a \; T\!\left(\frac{n}{b}\right) + f(n) & n > 1 \end{cases}

where a \geq 1, b \geq 2, c > 0 are constants. If

f(n) = \Theta(n^d)

for some d \geq 0, then:

T(n) = \begin{cases} \Theta(n^d) & a < b^d \\ \Theta(n^d \log{}n) & a = b^d \\ \Theta(n^{\log_b a}) & a > b^d \end{cases}

Now I have always loved these formal definitions, but let us give an example:

T(n) = 2T(\frac{n}{2}) + cn, T(1) = c

The recurrence relation fits our definition above. The constants are a = 2 and b = 2, the base-case constant is c, and f(n) = cn. Now we must ask ourselves the following:

a \;?\; b^d

Which operator fits in the middle? Our function is f(n) = cn; dropping the constant factor, we can treat it as f(n) = n. We check the exponent of n, so d = 1. Now let us insert the values into the comparison above:

a \;?\; b^d = 2 \;?\; 2^1 = 2 \;?\; 2

The operator here is therefore equals, and we fall into the second case of the equation above. Our time complexity is therefore:

T(n) = \Theta(n^d \log n) = \Theta(n^1 \log n) = \Theta(n \log n)

These are the kinds of examples you see everywhere. But what do you do when you are given an actual algorithm? Let us check the following one:
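Here is a sketch in Java of the kind of algorithm I have in mind; the method body (the constant-time work and the exact loop bounds) is illustrative, chosen to match the analysis below:

    void algorithm(int n) {
        if (n <= 1) return;           // base case: constant work

        for (int i = 0; i < n; i++) {
            // some constant-time work: Theta(n) done outside the recursion
        }

        for (int i = 0; i < 3; i++) {
            algorithm(n / 3);         // three subproblems, each of size n/3
        }
    }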

We must translate this algorithm into a recurrence relation of the form written above. How do we do that? Let us list the things we must find in the code above:

  • find the recursive call and the number of times it is made (the number of subproblems in the recursion),
  • we must find the cost of work that is done outside of the recursion,
  • we must find the size of each subproblem

From the code above we can clearly see where the recursion happens: the method algorithm is called inside a for loop. That call is executed three times, and the size of each subproblem is \frac{n}{3} (the argument we pass when calling the method recursively). So we can say that

a = 3, b = 3

Before the recursive call, another for loop is executed, and that is the work done outside the recursion. That loop executes n times, so the work outside the recursion is n, i.e. f(n) = n. Putting it all together:

T(n) = a \; T\!\left(\frac{n}{b}\right) + f(n) = 3T(\frac{n}{3}) + n

To summarize:

  • a = 3,
  • b = 3,
  • f(n) = n,
  • d = 1 (from the function above)

Let us ask ourselves the same thing as with the relation above:

a \;?\; b^d = 3 \;?\; 3^1 = 3 \;?\; 3

The operator is again equals, and we fall into the second case. The time complexity is therefore:

T(n) = \Theta(n^d \log n) = \Theta(n^1 \log n) = \Theta(n \log n)

The master method is simple, but it has its flaws. If the recursive calls in an algorithm produce subproblems of different sizes, we must resort to other methods to determine the time complexity.

THE AKRA-BAZZI METHOD

The Akra-Bazzi method is used where the subproblems of our algorithm have substantially different sizes. Let:

T(x)=\begin{cases}\Theta(1) & 0 \leq x \leq x_0 \\ g(x) + \sum_{i=1}^k a_i T(b_i x + h_i(x)) & x > x_0 \end{cases}

The conditions for usage are:

  • a_1, \ldots , a_k are positive constants,
  • b_1, \ldots , b_k are constants between 0 and 1,
  • | g(x) | = \mathcal{O}(x^c) for some c \in \mathbb{N},
  • | h_i(x)| = \mathcal{O}\!\left(\frac{x}{(\log x)^2}\right) for each i = 1 \ldots k,
  • x_0 \geq \max\{b_i, \frac{1}{b_i}\}  for each i = 1 \ldots k.

If all those conditions are met, then we can say that:

T(x) \in \Theta \left( x^p\left( 1+\int_1^x \frac{g(u)}{u^{p+1}}du \right)\right)

where p solves the equation

\sum_{i=1}^k a_i b_i^p = 1

Again, while I am fond of definitions, it is the examples that teach us the most. Let us take two different examples: one where the recurrence relation is given, and one where we start from an algorithm.

T(n) = 2T(\frac{n}{4}) + 3T(\frac{n}{6}) + n \log n

Here k = 2, a_1 = 2, b_1 = \frac{1}{4}, a_2 = 3, b_2 = \frac{1}{6}, our function is g(n) = n \log n, and h_1(n) = h_2(n) = 0. The constants satisfy our conditions. The function g is bounded above by a polynomial (for example, g(n) = \mathcal{O}(n^2)). The functions h_i(n) are identically zero and are therefore trivially \mathcal{O}\!\left(\frac{n}{\log^2 n}\right). The last condition states that the recurrence must be defined and finite for the first six values, since \max_i \frac{1}{b_i} = 6.

Let us find the value of p that solves the equation a_1 b_1^p + a_2 b_2^p = 1. The solution is p = 1; indeed, 2 \cdot \frac{1}{4} + 3 \cdot \frac{1}{6} = \frac{1}{2} + \frac{1}{2} = 1. Now we can calculate the integral.

\Theta \left( x^p\left( 1+\int_1^x \frac{g(u)}{u^{p+1}}du \right)\right) = \Theta \left( x \left( 1+ \frac{\log^2 x}{2}\right)\right) = \Theta(x \log^2 x) = \Theta(n \log^2 n)

Note that I have omitted some steps in calculating the value of the integral; it can be solved using the substitution m = \log u, dm = \frac{1}{u}\,du.
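For completeness, here are the omitted steps with the substitution carried out:

\int_1^x \frac{u \log u}{u^{2}}\,du = \int_1^x \frac{\log u}{u}\,du = \int_0^{\log x} m \,dm = \frac{\log^2 x}{2}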

Now let's take on a practical example. Take a look at the following algorithm:
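Again a Java sketch of the kind of algorithm I have in mind; the details are illustrative, chosen to match the analysis below:

    void algorithm(int n) {
        if (n <= 1) return;               // base case

        for (int i = 0; i < n * n; i++) {
            // some constant-time work: Theta(n^2) done outside the recursion
        }

        algorithm(n / 2);                 // one subproblem of size n/2
        algorithm(n / 4);                 // two subproblems of size n/4
        algorithm(n / 4);
    }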

The procedure is exactly the same as with the master theorem. We have three recursive calls in the function, but two of them have the same subproblem size, so we combine them into a single term with coefficient 2. The work done outside the recursion equals n^2 (the loop runs while i is smaller than n squared, so it executes n^2 times). Our recurrence relation is therefore:

T(n) = T(\frac{n}{2}) + 2T(\frac{n}{4}) + n^2

Let us set up the equation for p:

(\frac{1}{2})^p + 2(\frac{1}{4})^p = 1

To form the left-hand side we took each fraction that appears inside a T(\cdot) in the recurrence (the b_i), raised it to the power p, and multiplied it by the number of times that call is made (the a_i). The solution to the equation is p = 1; indeed, \frac{1}{2} + 2 \cdot \frac{1}{4} = 1. To solve the recurrence, we insert this into the formula above:

\Theta \left( x^p\left( 1+\int_1^x \frac{g(u)}{u^{p+1}}du \right)\right) = \Theta \left( x^1\left( 1+\int_1^x \frac{u^2}{u^{1+1}}du \right)\right) =
\Theta \left( x\left( 1+\int_1^x \frac{u^2}{u^{2}}du \right)\right) = \Theta \left( x\left( 1+\int_1^x 1 \, du \right)\right) =
\Theta \left( x\left(1+(x-1)\right)\right) = \Theta (x \cdot x) = \Theta(x^2) = \Theta(n^2)

Note that this was an easy one. The equations can get a lot tougher, and knowledge of integration is the foundation here, so make sure you pay attention in your calculus classes.

REFERENCES
  1. Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein. Introduction to Algorithms, Second Edition. MIT Press and McGraw-Hill, 2001. ISBN 0-262-03293-7. Sections 4.3 (The master method) and 4.4 (Proof of the master theorem), pp. 73–90.
  2. Mohamad Akra and Louay Bazzi. On the solution of linear recurrence equations. Computational Optimization and Applications, 10(2):195–210, 1998.

Backing up your server to Dropbox

May 20th, 2014 @ 1PM by Boštjan Cigan

Dropbox is a very useful tool for syncing various content between different devices. But you can also use it for so much more. One of the things I am currently using it for is backing up the contents of my MySQL databases and my server files. It isn't a hard thing to achieve, either. In this tutorial we are going to do the following:

  • Install Dropbox on a VPS (virtual private server)
  • Create two bash scripts (one for backing up server contents and one for backing up MySQL databases)
  • Create two cron tasks that run the two scripts

INSTALLING DROPBOX

Installing Dropbox from the terminal is very easy. All you need to do is paste one of the following commands:

For 32-bit machines:
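(This is Dropbox's documented headless-install one-liner for 32-bit Linux:)

    cd ~ && wget -O - "https://www.dropbox.com/download?plat=lnx.x86" | tar xzf -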

For 64-bit machines:
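(And the 64-bit variant:)

    cd ~ && wget -O - "https://www.dropbox.com/download?plat=lnx.x86_64" | tar xzf -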

Now we need to run the daemon:
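(The daemon lives in the hidden folder that the installer unpacked into your home directory:)

    ~/.dropbox-dist/dropboxd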

When run, you will get output along these lines:
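    This computer isn't linked to any Dropbox account...
    Please visit https://www.dropbox.com/cli_link?host_id=... to link this computer.

(The host_id in the link is specific to your machine.)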

Copy the link and paste it into your browser to link the machine to your account. Then press CTRL+C and run the daemon again:
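(The trailing & is my addition; it keeps the daemon running in the background so you can continue using the shell:)

    ~/.dropbox-dist/dropboxd &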

Your Dropbox daemon should now be running.

CREATING BASH SCRIPTS

First we need to create a bash script to back up our databases to Dropbox.
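(Here is a minimal sketch of such a script; the dbbackups subfolder and the three-day window come from the description below, while the exact file layout and mysqldump flags are my assumptions:)

    #!/bin/bash
    # delete backups older than three days
    find $HOME/Dropbox/dbbackups -mindepth 1 -mtime +3 -exec rm -rf {} +
    # create a new folder with the datestamp
    DATE=$(date +%Y-%m-%d)
    mkdir -p $HOME/Dropbox/dbbackups/$DATE
    # dump all MySQL databases and compress the dump
    mysqldump -u root -pYOURPASSWORD --all-databases | gzip > $HOME/Dropbox/dbbackups/$DATE/databases.sql.gz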

Essentially what this script does is check our Dropbox folder (the subfolder dbbackups) for files that are older than three days (you can change that value by replacing +3 with any value you'd like) and delete them. It then creates a new folder with the datestamp, and the databases are backed up into it (and compressed). Bash is just fun with the things you can create with it. If you are in a root console, type in the following commands:
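(The file name dbbackup.sh is my choice; any path works as long as the cron entry later matches it:)

    nano /root/dbbackup.sh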

Paste the contents of the script in the terminal window and replace the string YOURPASSWORD with your root database password. Then press CTRL+X, then Y, and ENTER. Now type in:
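(This makes the script executable:)

    chmod +x /root/dbbackup.sh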

Now your script is complete and ready. Let's create a script that backs up all of our server content too.

Again type in the following:
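(As before, the file name serverbackup.sh and the serverbackups subfolder are my assumptions; the web-root path is the one discussed below:)

    nano /root/serverbackup.sh

and paste a script along these lines:

    #!/bin/bash
    # delete server backups older than three days
    find $HOME/Dropbox/serverbackups -mindepth 1 -mtime +3 -exec rm -rf {} +
    # archive and compress the web root into a datestamped file
    tar czf $HOME/Dropbox/serverbackups/www-$(date +%Y-%m-%d).tar.gz /usr/share/nginx/www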

Paste the contents of the script above and replace /usr/share/nginx/www with the path where your server content is set up (usually /var/www or something similar). Now press CTRL+X, then Y, and ENTER. Again, we will need to set permissions on the file:
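(And again:)

    chmod +x /root/serverbackup.sh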

CREATING THE CRON TASKS

Our scripts are now created. Now we need to make sure they are run. I've set up my database backup to run daily (at one AM) and my server backup to run every three days. It depends on how fast the content on your server changes and how much of it you have (Dropbox's free account is limited to 2GB of space anyway). You can teach yourself how cron works, but people have already created tools that generate the strings required for a cron task, one of them is here. Type the following in the terminal:
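(This opens your user's cron table in an editor:)

    crontab -e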

A window will open; paste the following after all the comments (after the last line containing a #):
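(These two entries assume the script paths chosen above; adjust them if you saved the scripts elsewhere:)

    0 1 * * * /bin/bash /root/dbbackup.sh
    0 0 */3 * * /bin/bash /root/serverbackup.sh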

The first one runs our database backup script every day at one AM, while the second runs the server backup every three days at midnight.
