The second annual PHP TestFest is currently underway, and this weekend the PHP NorthWest user group met to make their contribution, with Ben, Rowan and myself representing Plusnet. The aim is to increase the amount of PHP source code that is covered by tests, giving developers greater confidence that unintentional behavior changes are caught. These tests are run frequently and the results are published on gcov.php.net.
Having used PHP for a number of years to build a career and pay the bills, it feels particularly satisfying to contribute something back to the project. Also, those people making significant contributions may apply for an @php.net email address, which looks cool on a business card!
With the recent release of PHP5.3RC2, the main focus is on improving code coverage for the 5.3 branch. PHP core developer Scott MacVicar started the day with an introduction to compiling from source and running the existing tests. After pizza, we got down to the business of writing tests, starting with a simple example which I'll run through...
First, download the latest 5.3 snapshot from snaps.php.net and extract it. NorthWestUG were only focusing on testing the Standard PHP Library (SPL), so we disabled all the other extensions to speed up the compilation time. If you're interested in other extensions, use the appropriate switch to enable them; see './configure --help | less' for details. Note: you may need to install packages such as 'build-essential' and 'lcov' before performing the following step.
tar zxvf php5.3*
cd php5.3*
./configure --disable-all --enable-gcov && make
Once complete, you should have your compiled PHP CLI binary in sapi/cli/php. Let's test it:
$ sapi/cli/php -v
PHP 5.3.0RC2 (cli) (built: May 9 2009 19:48:19) (GCOV)
Copyright (c) 1997-2009 The PHP Group
Zend Engine v2.3.0, Copyright (c) 1998-2009 Zend Technologies
To run all the existing tests (approximately 5700), run 'make test'. To generate a code coverage report, run 'make lcov' and open lcov_html/index.html to see the results. Both commands also accept a 'TESTS=' argument, which is useful for running individual directories and files when writing new tests. For example, to run only the SPL tests, use 'make test TESTS=ext/spl/tests/'.

Now let's create our own test. For this example, we'll check that PHP can do basic addition. PHP test files have a .phpt extension and consist of several sections. Every test must contain TEST, FILE and EXPECT sections:
TEST - a meaningful description of the test
FILE - your test code
EXPECT - the expected output of running FILE
There are many other sections; CREDITS is a common one, and completing it will get your name/email address into CVS. Refer to the reference manual and the PHP QA site for guidance. Here is the example test that Scott used (the new line at the bottom of the test is important!):
--TEST--
Check that PHP can count
--CREDITS--
Your name/email address
--FILE--
<?php
echo 10 + 10;
?>
--EXPECT--
20
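A fixed EXPECT section only works when the output is identical on every run and platform. For output that varies, phpt files support an EXPECTF section with placeholders such as %d (an integer) and %s (a string); see the PHP QA site for the full list. As a rough sketch (not one of the tests from the day), a test whose output changes on every run might look like this:

```phpt
--TEST--
time() returns an integer timestamp
--FILE--
<?php
var_dump(time());
?>
--EXPECTF--
int(%d)
```

Here var_dump() prints something like int(1242851299), which a literal EXPECT could never match, but int(%d) accepts any integer value.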
Save the file with a .phpt extension and run it: 'make test TESTS=/path/to/test/filename.phpt'. If for some reason the test fails, a number of output files are created in the same directory as the test. The most useful is the .diff file, which records the difference between the test's actual output and the expected output. Assuming the test works as expected, you get something like:
PHP : /path/php-5.3.0RC2/sapi/cli/php
PHP_SAPI : cli
PHP_VERSION : 5.3.0RC2
PHP_OS : Linux - Linux work-laptop 2.6.28-11-generic #42-Ubuntu SMP Fri Apr 17 01:57:59 UTC 2009 i686
INI actual : /path/php-5.3.0RC2/tmp-php.ini
More .INIs :
CWD : /path/php-5.3.0RC2
Extra dirs :
VALGRIND : Not used
Running selected tests.
PASS Check that PHP can count [/path/to/test/filename.phpt]
Number of tests : 1 1
Tests skipped : 0 ( 0.0%) --------
Tests warned : 0 ( 0.0%) ( 0.0%)
Tests failed : 0 ( 0.0%) ( 0.0%)
Expected fail : 0 ( 0.0%) ( 0.0%)
Tests passed : 1 (100.0%) (100.0%)
Time taken : 0 seconds
Now that we know how to write basic tests, we need to find some real code to test. PHP's gcov site reports all untested code in red, so just pick a section and write a test for it. Once the test passes, re-run the code coverage report to check that the lines you covered are now green. When you're happy with the test, follow the 'what to do next' instructions and submit it to the QA mailing list for review. It's preferable to submit tests in batches rather than drip-feeding them, as the tests will be reviewed by the QA team, run on multiple OSes and hardware, and then committed to the 5.2, 5.3 and HEAD branches.
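To make that concrete, here is an illustrative example (not one of the tests actually submitted on the day) of what a test targeting an SPL method might look like:

```phpt
--TEST--
ArrayObject::count() returns the number of elements
--FILE--
<?php
$ao = new ArrayObject(array(1, 2, 3));
var_dump($ao->count());
?>
--EXPECT--
int(3)
```

The pattern is the same as the counting example: exercise one small, well-defined behavior per test, so that when a test fails the .diff points straight at the regression.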
Thanks to iBuildings for sponsoring the event, Salford University for the use of their facilities, Lorna Mitchell for organising the event, Scott for his mentoring and patience, and to everyone that turned up and wrote some tests. Currently NorthWestUG is in the lead with 95 tests! All the tests created by the NorthWestUG can be found here.
Credit to Lorna Mitchell for the photos. The full set is available here.