Migrating to Jenkins pipelines for my PHP Symfony project

Yes, it's 2022 and I'm still using Jenkins and not the hyped CI tools like GitLab CI or GitHub Actions. Why? Because Jenkins does the job and does it well, I like to self-host my tools, and it seems easier for me to host Jenkins than GitLab.
I have just migrated all my old job definitions from Phing (it's like Ant, but written in PHP) to declarative pipelines and curated the tools I need for my Symfony projects.

I have 2 jobs per project: one running all the checks on every branch pushed to the repository, and another running only on the "stable" branch to deploy the code to production with minimal checks.
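
For the deploy job, restricting a stage to the "stable" branch can be done with a "when" directive. A minimal sketch (the stage body and the bin/deploy.sh script are hypothetical, not my actual deployment):

stage('Deploy') {
    /* only run this stage when the job builds the "stable" branch */
    when {
        branch 'stable'
    }
    steps {
        /* hypothetical deploy command */
        sh './bin/deploy.sh'
    }
}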

A classic declarative pipeline is written in a file named Jenkinsfile and has this kind of structure:

pipeline {
    agent any
 
    stages {
 
    }
    post {
 
    }
}
 


Depending on the project, I'm using 4 or 5 stages. The first one builds the project; nothing fancy, I just run Composer, add some dev requirements for the job and clean some directories.

stage('Build') {
    steps {
        parallel(
            composer: {
                sh 'composer install --prefer-dist --optimize-autoloader'
                sh 'composer require --dev phpmetrics/phpmetrics friendsofphp/php-cs-fixer --no-interaction --prefer-dist --optimize-autoloader'
            },
            'prepare-dir': {
                sh 'rm -Rf ./build/'
                sh 'mkdir -p ./build/coverage'
                sh 'mkdir -p ./build/logs'
                sh 'mkdir -p ./build/phpmetrics'
            }
        )
    }
}


The second stage runs the tests with PHPUnit and/or Behat, depending on the project, but also lints the code.

stage('Linter & Test') {
    steps {
        parallel(
            'cache-clear prod': {
                sh 'APP_ENV=prod ./bin/console cache:clear'
            },
            'php-lint': {
                /* php -l only accepts a single file, so lint every PHP file under src/ */
                sh "find src/ -name '*.php' -print0 | xargs -0 -n1 php -l"
            },
            'symfony-container': {
                sh './bin/console lint:container'
            },
            'symfony-yaml': {
                sh './bin/console lint:yaml config/ src/ --parse-tags'
            },
            'doctrine-mapping': {
                sh './bin/console doctrine:schema:validate --skip-sync'
            },
            phpunit: {
                sh 'vendor/bin/phpunit --configuration ./phpunit.xml --log-junit ./build/logs/phpunit.junit.xml --coverage-html ./build/coverage --coverage-cobertura ./build/logs/coverage.cobertura.xml'
            },
            /*behat: {},*/
            failFast: true
        )
    }
}

Last option "failFast: true" allow to save time when something goes wrong.

The 3rd stage analyzes the code. It's great to have unit/functional tests, but static analysis also gives good feedback on the quality of the code. I won't debate the tools here; these are the ones I found useful.

stage('Analyze') {
    steps {
        parallel(
            phpstan: {
                sh 'vendor/bin/phpstan analyse --configuration=./phpstan.neon --no-progress --error-format=checkstyle > ./build/logs/phpstan.xml'
            },
            phpmetrics: {
                sh 'vendor/bin/phpmetrics --report-html=./build/phpmetrics/ --junit=./build/logs/phpunit.junit.xml --report-violations=./build/logs/phpmetrics.violations.xml --quiet ./src/'
            },
            'php-cs-fixer': {
                /* IMPORTANT: use returnStatus to catch failed exit code and don't mark build FAILED */
                sh returnStatus: true, script: 'vendor/bin/php-cs-fixer fix --dry-run --format=checkstyle --config=.php-cs-fixer.php --using-cache=no > ./build/logs/checkstyle.xml'
            },
            'cpd-back': {
                /* IMPORTANT: use returnStatus to catch failed exit code and don't mark build FAILED */
                sh returnStatus: true, script: 'phpcpd.phar --exclude=Test --exclude=vendor --log-pmd=./build/logs/cpd.back.xml src/'
            },
            'pmd': {
                /* xml output so the PMD parser in the post section can read the report */
                sh 'phpmd.phar src/ xml codesize,cleancode,controversial,design --ignore-errors-on-exit --ignore-violations-on-exit --reportfile ./build/logs/pmd.xml'
            }
        )
    }
}

Note the comment on the php-cs-fixer and cpd-back steps. I use these tools to get feedback on the code, not to enforce strict validation, so I don't want the build to fail if I did something sloppy in the code.
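
If I wanted that feedback to be more visible without failing the build, another option (a sketch, not what I'm doing here) would be to capture the exit code and mark the build UNSTABLE instead:

stage('Coding style') {
    steps {
        script {
            /* returnStatus gives back the exit code instead of throwing on a non-zero result */
            def rc = sh(returnStatus: true, script: 'vendor/bin/php-cs-fixer fix --dry-run --config=.php-cs-fixer.php')
            if (rc != 0) {
                /* mark the build UNSTABLE instead of FAILED */
                unstable('php-cs-fixer found style violations')
            }
        }
    }
}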

Then comes the security stage. I'm using Snyk and the Symfony CLI to check for known CVEs in the dependencies. They are almost redundant, but not always. And Snyk can also be used to check Dockerfiles, JavaScript dependencies and other languages.

stage('Security') {
    environment {
        SNYK_API_TOKEN = credentials('snyk')
    }
    steps {
        parallel(
            'snyk-back': {
                sh 'SNYK_TOKEN=$SNYK_API_TOKEN snyk test --file=composer.lock'
            },
            symfony: {
                sh 'symfony local:check:security'
            }
        )
    }
}

I declared the Snyk API token in Jenkins' global credentials store, so I need to retrieve it in the pipeline. That's what the "environment {}" block does.
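
An alternative, if the token is only needed in a single step, is the withCredentials wrapper from the Credentials Binding plugin (a sketch, assuming the same "snyk" secret text credential; the Snyk CLI reads the SNYK_TOKEN environment variable):

steps {
    /* bind the "snyk" secret text credential to SNYK_TOKEN for this block only */
    withCredentials([string(credentialsId: 'snyk', variable: 'SNYK_TOKEN')]) {
        /* single quotes: the variable is expanded by the shell, not interpolated by Groovy */
        sh 'snyk test --file=composer.lock'
    }
}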

Once everything has run, it's time to collect the data and build reports and graphs. This is the main reason I'm using Jenkins: it's really easy to get feedback. I'm using the "post" section to achieve that; it runs at the end of the pipeline, and its "always" condition lets these steps run even if the build failed.
Thanks to Jenkins plugins, I can publish HTML reports (PhpMetrics, PHPUnit coverage) but also parse XML files in various formats. This is the trickiest part; I spent a lot of time finding the right parser/format combinations.

post {
    always {
        publishHTML([allowMissing: false, alwaysLinkToLastBuild: false, keepAll: false, reportDir: './build/coverage/', reportFiles: 'index.html', reportName: 'Code coverage report', reportTitles: ''])
        publishHTML([allowMissing: false, alwaysLinkToLastBuild: false, keepAll: false, reportDir: './build/phpmetrics/', reportFiles: 'index.html', reportName: 'PhpMetrics report', reportTitles: ''])
        junit skipPublishingChecks: true, testResults: '**/build/logs/phpunit.junit.xml'
        cobertura autoUpdateHealth: false, autoUpdateStability: false, coberturaReportFile: '**/build/logs/coverage.cobertura.xml', conditionalCoverageTargets: '70, 0, 0', failUnhealthy: false, failUnstable: false, lineCoverageTargets: '80, 0, 0', maxNumberOfBuilds: 0, methodCoverageTargets: '80, 0, 0', onlyStable: false, sourceEncoding: 'ASCII', zoomCoverageChart: false
        recordIssues enabledForFailure: true, tools: [phpStan(id: 'phpstan', name: 'PHPStan', pattern: '**/build/logs/phpstan.xml', reportEncoding: 'UTF-8')]
        recordIssues enabledForFailure: true, tools: [junitParser(id: 'phpunit', name: 'PHP Unit', pattern: '**/build/logs/phpunit.junit.xml', reportEncoding: 'UTF-8')]
        recordIssues enabledForFailure: true, tools: [checkStyle(id: 'checkstyle', name: 'PHP CS Fixer', pattern: '**/build/logs/checkstyle.xml', reportEncoding: 'UTF-8')]
        recordIssues enabledForFailure: true, tools: [cpd(id: 'cpdBack', name: 'CPD back', pattern: '**/build/logs/cpd.back.xml', reportEncoding: 'UTF-8')]
        recordIssues enabledForFailure: true, tools: [pmdParser(id: 'pmd', name: 'PMD', pattern: '**/build/logs/pmd.xml', reportEncoding: 'UTF-8')]
        recordIssues enabledForFailure: true, tools: [pmdParser(id: 'pmd-phpmetrics', name: 'PMD PhpMetrics', pattern: '**/build/logs/phpmetrics.violations.xml', reportEncoding: 'UTF-8')]
        script {
            currentBuild.result = currentBuild.result ?: 'SUCCESS'
            notifyBitbucket()
        }
    }
}


At the end, a Jenkins pipeline page looks like this. I can access all the reports I have generated in the left section, the build history in the middle, and time-series graphs on the right to help me see the evolution over time.

Jenkins job dashboard


I'm still missing 2 things in my build. The first is handling the frontend code: building it, testing it (maybe one day) and checking its security. The other is using Docker to run matrix builds against future versions of the stack (PHP, MySQL, Elasticsearch, Redis...) to ease the upgrade path (see the sketch at the end of this post).
But for the moment, I'm really happy with this: thanks to the "parallel" command in the different stages, it's easy to run multiple steps concurrently, and I've saved a lot of time.
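
For the Docker part, the declarative "matrix" directive combined with a Docker agent is probably what I'll try. A rough sketch (the PHP versions and the image name are placeholders, and dependency installation, extensions and services are left out):

stage('Matrix tests') {
    matrix {
        axes {
            axis {
                name 'PHP_VERSION'
                values '8.1', '8.2'
            }
        }
        agent {
            docker {
                /* placeholder image, one cell per PHP_VERSION value */
                image "php:${PHP_VERSION}-cli"
            }
        }
        stages {
            stage('PHPUnit') {
                steps {
                    sh 'vendor/bin/phpunit'
                }
            }
        }
    }
}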
