Long-time readers will recall that I am interested in performance benchmarks as a tool to help discover the outer limits of framework responsiveness. See the blog category along with the most recent measurement report and the GitHub repo for replicating the results on your own. Benchmarking is useful because it helps you decide if it makes more sense to work on improving your application, improving your framework, or improving your… Read More »
My Benchmarking talk has been accepted for OSCON 2011. Looking forward to updating it for the occasion. Woohoo!
Here are the slides from my PHPBenelux 2011 talk about benchmarking, titled “Framework and Application Benchmarking.” This presentation includes updates to previous benchmarks; the graph is on slide 40. The benchmarking project has moved from Google Code to GitHub at https://github.com/pmjones/php-framework-benchmarks.
As I noted last week, I have moved my framework benchmarking project to GitHub. As part of the move, I updated the project to allow benchmarking using any of three tools: Acme http_load, Apache ab, or Joedog siege. (For reference, the old project will remain at Google Code.) I thought it might be interesting to see what each of them reports for the baseline “index.html” and “index.php” cases on the new… Read More »
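For readers unfamiliar with the three tools, here is a rough sketch of how comparable baseline runs might be set up. The URL, concurrency, and request counts are illustrative assumptions, not the project's actual settings; the script only prints the commands it would run, since the exact flags you want will depend on your own server.

```shell
# Hypothetical baseline comparison against a local "index.html" target.
# URL, CONC, and REQS are assumptions for illustration only.
URL="http://localhost/index.html"
CONC=10
REQS=1000

# Apache ab: -c concurrent clients, -n total requests
echo "ab -c $CONC -n $REQS $URL"

# Joedog siege: -c concurrent users, -r repetitions per user,
# -b benchmark mode (no delay between requests)
echo "siege -b -c $CONC -r $((REQS / CONC)) $URL"

# Acme http_load reads its target URLs from a file:
# -parallel concurrent connections, -fetches total requests
echo "$URL" > urls.txt
echo "http_load -parallel $CONC -fetches $REQS urls.txt"
```

Note that the three tools report results differently (requests per second, transaction rate, fetches per second), which is part of why comparing their output for the same baseline case is interesting.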
As part of “trying new things,” I have moved my web frameworks benchmark project over to Git on GitHub and away from Subversion on Google Code. This project is often imitated and occasionally adopted. For all you framework fans who want to compare their preferred systems to the ones officially included in the project, you can now fork the repo and add your favorite. Who knows, some may make their… Read More »
Fabien Potencier released Symfony 2.0.0alpha1 last week, along with some benchmarks showing its performance. I am glad to see that Fabien used my benchmarking system and methodology, and am happy to see that he is paying attention to the performance of his framework. I take this as an acceptance on his part that my methodology is legitimate and valid, and that it has value when comparing framework responsiveness. However, in… Read More »
My regular readers (and perhaps the irregular ones as well) know that I have been obsessed with baseline-responsiveness benchmarking of frameworks for years now. The idea has always been that, in order to know how far you can optimize your framework-based applications, you need to know the limits imposed by the framework itself. Only then can you have an idea of where to spend your limited resources on improvement. For… Read More »
As many of you know, I maintain a series of web framework benchmarks. The project codebase is here and the most recent report is here. It was with some interest, then, that I viewed Rasmus Lerdorf’s slides on the subject of performance benchmarking. I’m beginning to think there’s something unexpected or unexamined in his testing methodology.