Running The Symfony 2 Benchmarks

By Paul M. Jones | February 21, 2010

Fabien Potencier released Symfony 2.0.0alpha1 last week, along with some benchmarks showing its performance. I am glad to see that Fabien used my benchmarking system and methodology, and am happy to see that he is paying attention to the performance of his framework. I take this as an acceptance on his part that my methodology is legitimate and valid, and that it has value when comparing framework responsiveness.

However, in attempting to reproduce the published Symfony 2 benchmarking results, I found Fabien’s reporting to be inaccurate (or at least incomplete). Read on for a very, very long post detailing my attempt to replicate his results for the “hello world” basic framework overhead comparison, and my conclusions.

For the impatient, here are my conclusions in advance:

  1. Fabien’s benchmark report, as shown at http://symfony-reloaded.org/fast, is inaccurate for the setup he describes. Lithium and Flow3 do not work in Fabien’s benchmark codebase at Github. Also, Symfony 2 is faster than Solar beta 3 by 5%, not 20%, on a “c1.xlarge” instance; to get a relative difference like Fabien describes, one has to use an “m1.large” instance. (It is entirely possible that the process Fabien used for benchmarking is incompletely described, and that the codebase is not fully updated, thus contributing to this disparity in results.)

  2. We should use Siege 2.69, not 2.66, for more accurate benchmarking of baseline responsiveness. If we notice that HTML is slower than PHP, it’s a sign that something is wrong.

  3. Symfony 2 preloads its foundation and application classes, something no other framework does in the benchmarked code. When we treat Solar and Symfony 2 the same way, by preloading the foundation classes for each, we find that Solar is roughly 28% faster than Symfony 2.

Overview

Full disclosure: I am the architect of the Solar Framework for PHP discussed herein. I have been doing these benchmarks for years; see the “benchmarks” tag on this blog.

The primary point of this post is to show that benchmarking is tedious, time-consuming, difficult, and requires a lot of attention to details. I spent two full days doing all the following work, not including the time spent writing this post. It’s easy to get things wrong in the benchmarking itself, and it’s easy to get things wrong when reporting the results. Transparency, honest dealing, and a commitment to intellectual integrity (what Feynman called “a principle of scientific thought that corresponds to a kind of utter honesty–a kind of leaning over backwards”) — these things are key.

The secondary point of this post is to show that Solar is in fact more responsive than Symfony 2 when the two are treated alike, even under Fabien’s test conditions.

Note that this benchmarking series uses the codebase for Fabien’s Symfony 2 benchmarks; it is not part of the official web-framework-benchmarks series, as the tested conditions in Fabien’s code are somewhat different.

These are the major portions of this post:

  1. We run Fabien’s benchmarks using Siege 2.66 on an Amazon EC2 “c1.xlarge” instance using his instructions and codebase. Since Fabien left out the static HTML baseline target, we will add it ourselves for comparison. We find that his initial report is inaccurate; two frameworks are non-responsive, and the difference between Solar and Symfony 2 is much less than reported.

  2. We attempt to run the same series using Siege 2.69, and find that it fails because of socket unavailability.

  3. We backtrack a bit and run the benchmarks on an “m1.large” instance, using Siege 2.66 to re-establish the original scenario.

  4. We run the same series using Siege 2.69, and find the same relative performance ranking as with 2.66, but with lower percent-of-PHP numbers, because Siege 2.69 reports higher (and more believable) baseline numbers.

  5. Finally, we show that Symfony 2 uses a preloaded classes file. When we do the same for Solar and re-run the benchmarks, we find that Solar is more responsive than Symfony 2 by roughly 28%.

Fabien’s initial benchmark report is at http://symfony-reloaded.org/fast.

The code he benchmarked against is at http://github.com/fabpot/framework-benchs.

His instructions for reproducing his results are at http://github.com/fabpot/framework-benchs/blob/master/replicating.markdown.

For reference, here are the numbers Fabien initially reported (alphabetized by framework):

framework            |      rel |      avg |        1 |        2 |        3 |        4 |        5
-------------------- | -------- | -------- | -------- | -------- | -------- | -------- | --------
baseline-php         |   1.0000 |  5465.30 |  4602.06 |  5509.34 |  5694.15 |  6232.73 |  5288.23
cakephp-1.2.6        |   0.0513 |   280.43 |   255.91 |   279.50 |   291.80 |   291.13 |   283.83
flow3-1.0.0alpha7    |   0.0048 |    26.29 |    23.87 |    26.97 |    26.67 |    26.93 |    27.02
lithium-0.6          |   0.2128 |  1163.27 |  1059.44 |  1179.42 |  1180.52 |  1197.73 |  1199.25
solar-1.0.0beta3     |   0.2825 |  1544.14 |  1293.81 |  1596.28 |  1601.55 |  1613.20 |  1615.86
symfony-1.4.2        |   0.1737 |   949.59 |   916.84 |   944.49 |   953.88 |   967.52 |   965.24
symfony-2.0.0alpha1  |   0.3312 |  1810.07 |  1693.15 |  1846.41 |  1827.51 |  1856.98 |  1826.30
yii-1.1.1            |   0.1901 |  1038.77 |  1033.20 |  1037.60 |  1038.47 |  1041.57 |  1043.01
zend-1.10            |   0.0906 |   494.90 |   320.74 |   519.74 |   537.15 |   546.11 |   550.76
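
The “rel” column is each framework’s average requests-per-second divided by the baseline-php average; for example, CakePHP’s 280.43 / 5465.30 gives 0.0513. A minimal sketch of the calculation in PHP:

<?php
// rel = framework average req/sec divided by the baseline-php average
$baseline = 5465.30; // baseline-php average from the table above
$averages = array(
    'cakephp-1.2.6'       => 280.43,
    'symfony-2.0.0alpha1' => 1810.07,
);
foreach ($averages as $framework => $avg) {
    // prints 0.0513 and 0.3312, matching the "rel" column
    printf("%-20s %.4f\n", $framework, $avg / $baseline);
}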

Finally, for those who wish to follow along, the scripts for each of the major sections of this post are available here: http://paul-m-jones.com/public/fabiens-benches.sh


Part 1

We set up a “c1.xlarge” instance per the instructions from Fabien, and run siege.php against the targets file.
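
For those replicating: what follows is not Fabien’s actual siege.php, just a rough sketch of the shape of such a driver, assuming the targets file maps a framework name to a URL. The concurrency and duration values are illustrative, not Fabien’s.

<?php
// Hypothetical driver in the spirit of siege.php (not the real code).
$targets = array(
    'baseline-php'     => 'http://localhost/baseline-php/index.php',
    'solar-1.0.0beta3' => 'http://localhost/solar-1.0.0beta3/index.php/hello/Fabien',
);
foreach ($targets as $name => $url) {
    echo "--- $name ---\n";
    // -b: benchmark mode (no internal pauses); -c 10: ten concurrent
    // users; -t 60S: run for sixty seconds. Values are illustrative.
    passthru('siege -b -c 10 -t 60S ' . escapeshellarg($url));
}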

Flow3 had an exception:

<h1>500 Internal Server Error</h1>
<p>FLOW3 experienced an internal error (uncaught exception):</p>
<p>PDOException</p>

Looks like PDO has to be loaded for Flow3.
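
Anyone replicating can confirm whether PDO is available on the instance with a quick check like this:

<?php
// True if the PDO extension is compiled in or loaded via php.ini.
var_dump(extension_loaded('pdo'));
// Lists the PDO drivers (mysql, sqlite, etc.) actually available.
print_r(PDO::getAvailableDrivers());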

Lithium had an error too:

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>404 Not Found</title>
</head><body>
<h1>Not Found</h1>
<p>The requested URL /lithium-0.6/app/webroot/hello/Fabien was not found on this server.</p>
<hr>
<address>Apache/2.2.9 (Debian) PHP/5.3.1-0.dotdeb.1 with Suhosin-Patch Server at localhost Port 80</address>
</body></html>

This is an Apache 404 error; the target line for Lithium appears to be wrong.
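
The 404 suggests Apache is not applying Lithium’s rewrite rules, so the pretty URL never reaches the front controller. A hypothetical fix, not verified against the repo, would be a target URL that names the front controller explicitly:

http://localhost/lithium-0.6/app/webroot/index.php/hello/Fabien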

Therefore, we can ignore those two frameworks in our results. The report looks like this:

framework                |      rel |      avg |        1 |        2 |        3 |        4 |        5
------------------------ | -------- | -------- | -------- | -------- | -------- | -------- | --------
baseline-html            |   0.9002 |  5239.91 |  4438.38 |  5458.72 |  5822.69 |  5230.76 |  5249.02
baseline-php             |   1.0000 |  5821.08 |  4950.66 |  5763.70 |  5729.34 |  5939.63 |  6722.07
cakephp-1.2.6            |   0.1022 |   594.87 |   568.08 |   597.78 |   603.13 |   603.88 |   601.48
flow3-1.0.0alpha7 *      |   0.0008 |     4.40 |     3.41 |     4.55 |     4.64 |     4.72 |     4.70
lithium-0.6 *            |   1.0119 |  5890.55 |  5436.60 |  5326.39 |  6037.85 |  5996.23 |  6655.69
solar-1.0.0beta3         |   0.2441 |  1420.88 |  1323.49 |  1404.17 |  1416.07 |  1483.37 |  1477.30
symfony-1.4.2            |   0.0876 |   509.73 |   493.31 |   506.76 |   517.65 |   514.61 |   516.33
symfony-2.0.0alpha1      |   0.2573 |  1497.54 |  1242.21 |  1433.91 |  1607.85 |  1626.23 |  1577.50
yii-1.1.1                |   0.1360 |   791.69 |   790.61 |   787.22 |   795.12 |   793.43 |   792.08
zend-1.10                |   0.0769 |   447.62 |   382.17 |   448.55 |   473.18 |   471.33 |   462.88

(* ignore: Flow3 errored out, as shown above, and Lithium’s numbers reflect fast Apache 404 responses, not framework pages)

Contrary to Fabien’s report, we see that Symfony 2 is not “20% faster” than Solar. Symfony 2 at 0.2573 versus Solar at 0.2441 (0.2573 / 0.2441 ≈ 1.05) is more like a 5% difference, with Symfony 2 in the lead.

However, the baseline PHP response was about 10% faster than a static HTML page (where the PHP engine is not invoked at all). This indicates something is wrong with the benchmarking environment. I have seen similar behavior when using ab (the Apache benchmark tool), and it looks like Siege 2.66 suffers from the same problem. Let’s switch to the more recent Siege version 2.69 and see if we can eliminate it.

Part 2

We remove Siege 2.66, install Siege 2.69, and attempt to run the benchmarks again.

The problem is, even with the ulimit set as high as it is, Siege 2.69 floods the server and we get tons of “error: socket: 1168148816 address is unavailable.: Cannot assign requested address” errors.

We can set 'failures' => 1048576 in siege.php (i.e., the same as the ulimit value) to try and ignore the socket errors. However, even at that high value, we still can’t get through the baseline html response; socket availability is still too low.
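
For reference, the change is a one-line edit to the siege.php configuration; I am assuming the options live in an array like this, with the other keys elided:

<?php
// In siege.php: allow up to 1048576 failed connections before
// aborting, the same as the `ulimit -n` value on the instance.
$config = array(
    // ... other siege.php settings unchanged ...
    'failures' => 1048576,
);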

As such, we will step down to an “m1.large” instance for the remainder of the process. I know from previous experience that Siege will not exceed the socket availability on this kind of instance.

Part 3

We terminate the “c1.xlarge” instance and run an “m1.large” instance in its place. Now that we’re on a new instance, we need to re-run Fabien’s benchmark series using Siege 2.66 again to make sure the errors we received before are not instance-type specific. When we do, we get these results (Flow3 and Lithium show the same errors as before):

framework                |      rel |      avg |        1 |        2 |        3 |        4 |        5
------------------------ | -------- | -------- | -------- | -------- | -------- | -------- | --------
baseline-html            |   0.9658 |  2424.05 |  2430.61 |  2452.35 |  2357.62 |  2431.77 |  2447.89
baseline-php             |   1.0000 |  2509.98 |  2548.19 |  2517.85 |  2509.79 |  2439.35 |  2534.70
cakephp-1.2.6            |   0.0748 |   187.65 |   187.34 |   188.88 |   187.83 |   187.62 |   186.57
flow3-1.0.0alpha7 *      |   0.0004 |     1.07 |     1.12 |     1.02 |     1.04 |     1.11 |     1.06
lithium-0.6 *            |   1.0639 |  2670.45 |  2653.04 |  2714.18 |  2659.09 |  2662.56 |  2663.39
solar-1.0.0beta3         |   0.1944 |   487.88 |   486.83 |   478.97 |   493.04 |   489.55 |   491.03
symfony-1.4.2            |   0.0810 |   203.22 |   204.22 |   204.79 |   204.08 |   201.49 |   201.53
symfony-2.0.0alpha1      |   0.2330 |   584.73 |   582.53 |   578.17 |   589.57 |   588.38 |   584.98
yii-1.1.1                |   0.1463 |   367.16 |   362.83 |   373.34 |   362.67 |   377.57 |   359.41
zend-1.10                |   0.0542 |   135.99 |   135.06 |   135.28 |   137.09 |   136.67 |   135.87

(* ignore)

Now we see a difference in the Solar and Symfony 2 numbers that looks like Fabien’s original reporting; Symfony 2 at 0.2330 is about 20% faster than Solar at 0.1944.

But we still see the same error condition of PHP looking like it runs faster than static HTML. Let’s move away from Siege 2.66 and try Siege 2.69 on this smaller instance.

Part 4

We remove Siege 2.66, install Siege 2.69, and re-run the new Siege against the same targets on the same “m1.large” instance.

framework                |      rel |      avg |        1 |        2 |        3 |        4 |        5
------------------------ | -------- | -------- | -------- | -------- | -------- | -------- | --------
baseline-html            |   1.1710 |  5594.40 |  5732.63 |  5610.27 |  5769.64 |  5663.68 |  5195.80
baseline-php             |   1.0000 |  4777.65 |  4853.58 |  4729.50 |  4721.51 |  4772.37 |  4811.31
cakephp-1.2.6            |   0.0401 |   191.71 |   186.32 |   192.19 |   190.43 |   194.06 |   195.53
flow3-1.0.0alpha7 *      |   0.0000 |     0.00 |     0.00 |     0.00 |     0.00 |     0.00 |     0.00
lithium-0.6 *            |   1.0689 |  5107.01 |  5196.32 |  5158.93 |  4888.00 |  5092.78 |  5199.03
solar-1.0.0beta3         |   0.1126 |   537.85 |   541.37 |   537.44 |   537.34 |   536.59 |   536.52
symfony-1.4.2            |   0.0443 |   211.73 |   212.29 |   211.68 |   213.56 |   210.47 |   210.64
symfony-2.0.0alpha1      |   0.1370 |   654.65 |   655.15 |   654.62 |   653.44 |   651.52 |   658.52
yii-1.1.1                |   0.0819 |   391.44 |   396.89 |   389.75 |   390.60 |   396.30 |   383.68
zend-1.10                |   0.0293 |   140.19 |   139.83 |   138.85 |   141.18 |   140.87 |   140.24

(* ignore)

This looks more like what we should be seeing: HTML is now faster than PHP. The rankings and relative ratings appear similar to the Siege 2.66 run; Symfony 2 at 0.1370 is about 20% faster than Solar at 0.1126.

Part 5

I spent some time picking apart Symfony 2 to see what it might be doing that Solar could use for improvement. One reason for Symfony’s performance is that (in the benchmarked code) all the Symfony 2 foundation classes are concatenated into a single “bootstrap.php” file. Similarly, Symfony 2 caches its application classes into another single file (hello/cache/prod/classes.php). From what I can tell, none of the other frameworks are doing anything like that; they are reading class files individually as needed.
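
Conceptually, a single require then satisfies dozens of class definitions at once, so the autoloader never has to find and read those files individually. A simplified sketch of what such a preload file looks like (the class names here are illustrative, not Symfony’s actual contents):

<?php
// bootstrap.php (sketch): many foundation classes concatenated into
// one file, read from disk (or the opcode cache) once per request.
class Request  { /* ... */ }
class Response { /* ... */ }
class Router   { /* ... */ }
// ... and so on for the rest of the foundation classes ...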

Let’s see if we can level the playing field between Solar and Symfony 2. In this final benchmarking pass, we get the latest trunk code of Solar, compile the Solar foundation classes into a preload file just like Symfony’s, and use that preload file in the Solar bootstrap. Then we’ll target just Symfony 2, Solar beta 3 (non-preload), and the Solar trunk preload for comparison against each other. The results on the “m1.large” instance are:

framework                |      rel |      avg |        1 |        2 |        3 |        4 |        5
------------------------ | -------- | -------- | -------- | -------- | -------- | -------- | --------
baseline-html            |   1.1884 |  5611.92 |  5428.74 |  5700.62 |  5654.88 |  5622.77 |  5652.61
baseline-php             |   1.0000 |  4722.21 |  4646.16 |  4698.98 |  4780.01 |  4753.59 |  4732.29
solar-1.0.0beta3         |   0.1142 |   539.31 |   542.25 |   543.45 |   538.97 |   537.07 |   534.82
solar-preload            |   0.1780 |   840.38 |   841.38 |   846.63 |   838.32 |   833.13 |   842.44
symfony-2.0.0alpha1      |   0.1384 |   653.52 |   658.03 |   658.61 |   654.93 |   649.51 |   646.50

It appears that when we treat Solar and Symfony 2 the same way, by preloading the foundation classes, we find that Solar is about 28% faster than Symfony 2 (and about 55% faster than the non-preload Solar beta 3, with no code changes at all). Perhaps it would be wise for Solar to provide something like a preload.php file of its own as part of the system distribution.
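
Such a preload.php could be generated by concatenating the foundation class files with their opening tags stripped. Here is a minimal sketch, assuming one class per file; the paths and file list are hypothetical, not Solar’s actual tooling:

<?php
// build_preload.php (hypothetical): concatenate foundation classes
// into a single file so the bootstrap can load them in one require.
$files = array(
    'Solar.php',
    'Solar/Base.php',
    // ... the rest of the foundation class files ...
);
$code = "<?php\n";
foreach ($files as $file) {
    $source = file_get_contents("/path/to/source/$file");
    // strip the per-file opening tag so the result is one valid script
    $code .= preg_replace('/^<\?php\s*/', '', $source) . "\n";
}
file_put_contents('/path/to/preload.php', $code);
// The bootstrap then does: require '/path/to/preload.php';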

Conclusion

  1. Fabien’s benchmark report, as shown at http://symfony-reloaded.org/fast, is inaccurate for the setup he describes. Lithium and Flow3 do not work in Fabien’s benchmark codebase at Github. Also, Symfony 2 is faster than Solar beta 3 by 5%, not 20%, on a “c1.xlarge” instance; to get a relative difference like Fabien describes, one has to use an “m1.large” instance. (It is entirely possible that the process Fabien used for benchmarking is incompletely described, and that the codebase is not fully updated, thus contributing to this disparity in results.)

  2. We should use Siege 2.69, not 2.66, for more accurate benchmarking of baseline responsiveness. If we notice that HTML is slower than PHP, it’s a sign that something is wrong.

  3. Symfony 2 preloads its foundation and application classes, something no other framework does in the benchmarked code. When we treat Solar and Symfony 2 the same way, by preloading the foundation classes for each, we find that Solar is roughly 28% faster than Symfony 2.

28 thoughts on “Running The Symfony 2 Benchmarks”

  1. Pingback: abcphp.com

  2. Fabien

    Hi Paul,

    Thanks for your work. I really appreciate that you took the time to reproduce the benchmarks I have done. I’m also glad that you found roughly the same results for Solar and Symfony 2 that I found under the same conditions I set up for the test.

    Here is some more information I think is important for the reader:

    * symfony 1 also creates a “compiled” version of the base classes, so the approach is not new.

    * Lithium and Flow3 definitely work on my EC2 instance. I will dig further to see the difference between my setup and the code hosted on Github (now that I think about it, I should add that both use some cache directories, and the permissions might be the problem).

    * I’m more than happy that Solar could be made faster. You have probably forced me to work on Symfony 2 performance more than I expected, thanks to your numerous benchmarks in recent years ;)

    * To be fair, you should note that the routing system of Solar is a bit different from the routing systems of all the other frameworks in the test. During my testing, I found it is not a bi-directional routing system like the others (correct me if I’m wrong), and the performance hit of such a system is huge (after some talks with framework lead developers, we found that the biggest performance hit for Lithium, symfony 1, and Zend 1, for instance, is the routing system). That’s why I spent a lot of time on it and implemented a caching system for the routing as well.

    To conclude, I’m really happy if these benchmarks can lead to faster frameworks for the PHP community. Symfony 2 is faster thanks to your benchmarks. If I understand correctly, you might include some optimizations to make Solar 1 final even faster. And Lithium will also work hard to make their version 0.7 faster. That kind of emulation (not competition) is great.

    See you at ConFoo,
    Fabien

  3. Loïc d'Anterroches

    Thank you for this work. Since you have done so much already, would it be possible for you to bundle the resulting instance, with the right Siege and so on, so that other people can easily reproduce your work and possibly add other frameworks? That would be great!

  4. EdwinF.

    Hi Paul.

    Two words: (very) good work.

    One comment: the approach of packing the core classes into a single file is not new. Prado, for instance, has its “pradolite.php”, and Yii has its own “yiilite.php”. I remember a discussion Antii and I had about how relevant this is when using an opcode cache such as APC, or memcache.

    What would be the impact on the frameworks’ performance when using an opcode cache together with the packed core classes?

    That would be very good to see.

    Thank you… See you on “Solar 1.0 final” :-D

  5. joel

    ^^

    I too am interested in the performance impact of using a packed base file with APC compared to without.

    Great read!

  6. Pingback: Paul Jones’ Blog: Running The Symfony 2 Benchmarks | Webs Developer

  7. Harro

    I think the bit about preloading base classes is the whole point.

    If it makes that much of a performance difference, shouldn’t the framework do it by default to speed things up?

    You can make every framework fast with a lot of tweaking; the thing is that you shouldn’t have to do that just to benchmark it ;-)

    I mean, adding page caching will also increase performance, especially in a simple static hello world app. The point is how it performs out of the box, without a lot of know-how.

  8. Bob Jones

    You should implement the core-compiling feature in Solar rather than wasting your time trolling about performance again and again. You started a flamewar years ago, and now that you have found some people playing by your rules, your bad faith is lame to see.

  9. Padraic Brady

    @Bob Jones,

    Quite an unfair statement. If you actually followed the comments from other framework devs, you’d notice they have no problem using Paul’s methodology. As for the flamewar, there’s been nothing but friendly debate among those posting most of the benchmarks. Where are these mysterious flames? Most of us credit Paul with drawing attention to framework performance as a positive thing.

  10. Sam

    The comparison is a bit unfair to Yii:

    1. Data caching is not enabled. This results in parsing the URL rules on every request.

    2. A nested layout is used for Yii (column1->main) in the second comparison, while the other frameworks use only a one-level layout.

  11. pmjones Post author

    @Sam — I believe you, but for good or bad, it’s not my benchmark series. Contact Fabien Potencier and instruct him on how to set up Yii properly for the comparisons.

  12. Pingback: Symfony 2.0 Preview Release testen | davidsCorner.de PHP & Web Entwicklung

  13. Pingback: Symfony 2 benchmarks: more than meets the eye? | php|architect

  14. Ron

    What about CodeIgniter? Kohana? Shoot, even ASP.NET MVC, Ruby on Rails, and Django?

    I would love to see how they stack up performance wise!

  15. Xeno

    Well, you failed to test against PHPulse, which gets benchmark numbers 5-10 times faster than all the others, Solar included. I’m constantly amazed at how everyone just leaves PHPulse out of their benchmarks when trying to show how fast their frameworks are, as PHPulse sets the standard for fast PHP frameworks.

  16. Robert Gonzalez

    Paul,

    Thanks for another great article on benchmarks and performance ratings. I have to say that the PHP community, and more specifically the PHP framework development community, is better off because of it.

    And kudos to the devs/reps of the other frameworks. It’s nice to see how well received this data is among other developers (Fabien, Padraic, etc).

    @Xeno: not to speak ill of PHPulse, but could it be that PHPulse is left out of benchmarks like this because of its implementation? Most of the frameworks in the benchmark seem to be easily layered onto most web servers, can talk to most database servers, and do not necessarily provide as much front-end stuff as PHPulse. It would seem to me that if I wanted to drop a framework onto my IIS installation and talk to my Sybase server, I couldn’t even use PHPulse, because it requires Apache and MySQL. But ZF, Solar, and many of the others would make it easy to do exactly that.

    (And for the record, I have had cause to implement setups similar to that. ;) )

    Again, thanks for the cool write up Paul.

  17. Tormi Talv

    How is it possible that a framework (Lithium) is faster than baseline-html?

    I’ve got to test it myself!

  18. Joh

    This post says that Lithium and Flow3 need PDO. It also declares that these two frameworks do not work in Fabien’s test. I am sorry, but this is not a problem with Fabien’s tests. If you do not have PDO installed, that does not mean it is a problem with his tests.

    Your tests cannot be trusted. They look incompetent.

  19. pmjones Post author

    Joh,

    I followed the instructions as posted by Fabien. He did not include PDO setup as part of those instructions. Thus, the report he shows cannot be reproduced as-is. In turn, that means the problem is with his test as he described it.

    On the other hand, my tests come with working instructions and are repeatable by anyone as-is. Try them yourself to see if they can be trusted.

    Finally, while Flow3 appeared to require PDO, the problem with Lithium appeared to be a mistaken target line. These are two different kinds of errors. If there is incompetence to be found, one can find it in your comprehension of the article.

  20. Harold Bunderstrom

    Great set of benchmarks – must have been a lot of work. Thanks.

    I’d really like to see a head-to-head between Django, Rails, Spring and some of these PHP frameworks. I can’t find any that are up-to-date.

  21. Josh

    Yes! Definitely would be interested in a Django/Rails/Spring comparison!

  22. jan

    Hi!

    This is an old discussion. But did anybody think of APC with regard to the “compilation” of PHP classes? If you have APC, how much does this compilation matter? I think not at all.

  23. Pingback: Quora

  24. Hari K T

    There is a typo in the link. You missed .com ;)

    paul-m-jones.com/public/fabiens-benches.sh

    Thank you for the great article and your time.

  25. Pingback: Symfony2 – retour d’expérience | Chroniques d'un webliver

  26. Pingback: Hacktard » Preloading vs lazy loading
