The ins and outs of backend development - what’s the deal with PHP?

In this article I will discuss various versions of PHP, its frameworks, virtualization, quality assurance and continuous integration. It is addressed mainly to people who are not up to date with PHP and want to know whether they should choose it as a platform for their applications. I hope I can help you understand the current situation and the possibilities of this language.

Why are new releases of a language important for you?

Programmers use a language and various tools to write code. That code is then compiled or interpreted and ultimately used by end users. The quality of these tools and the possibilities they offer determine how easy it will be to create your product and whether it will work properly and safely.

A new release of a language extends its capabilities and improves its security, so it matters to everyone involved in the production process.

Currently (April 2016) the latest stable version of PHP is 7.0.9. Compared to the previous line (PHP 5.6.24) it offers more language features and a significant performance boost. However, many people are still skeptical about migrating their platforms to PHP 7 because they fear bugs and a loss of stability.

However (as seen on the previous chart), PHP 7 is used in more and more projects, especially new ones, because its higher performance means lower server costs.
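To give a taste of what version 7 brings, here is a short, made-up snippet that combines a few PHP 7 additions: scalar type declarations, return types, the null coalescing operator and the spaceship operator.

<?php
// A few PHP 7 additions in one place: scalar type declarations, a return type,
// the null coalescing operator (??) and the spaceship operator (<=>).
declare(strict_types=1);

function priceWithTax(float $price, float $taxRate = 0.23): float
{
    return $price * (1 + $taxRate);
}

$options = [];
$currency = $options['currency'] ?? 'EUR';   // null coalescing instead of isset()

$scores = [3, 1, 2];
usort($scores, function (int $a, int $b): int {
    return $a <=> $b;                         // -1, 0 or 1
});

echo priceWithTax(100.0) . ' ' . $currency;  // prints "123 EUR"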

Frameworks: Symfony, Laravel, Phalcon

Google search popularity, 2000-current

Developers today often use ready-made application components, and the PHP community strives to organize application development and make the whole process easier.

A framework is a tool that helps in building an application. It provides predefined classes that make the developer's work much easier. A framework usually contains elements such as event handling, database access, a cache engine, logging, ACL or user authentication. Of course you can build such elements yourself, but framework code is maintained by many people who work on its quality, its security and its coverage with unit tests.

Let me briefly discuss three of the most popular and interesting PHP frameworks.

Symfony

Symfony is the oldest of the three frameworks I chose to describe - it was released in 2005. It is produced by Sensio Labs, headed by Fabien Potencier. This company (and especially its CEO) is also a major contributor to many other projects, such as Silex or PHP-CS-Fixer. A characteristic feature of this framework is that its components form the basis for many other platforms, such as Drupal or Laravel. It has very good documentation and a large number of bundles (ready-made modules that provide additional functionality). Its architecture encourages good object-oriented programming practices.
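To show what working with Symfony feels like, here is a minimal controller sketch in the Symfony 2.x/3.x style; the Article entity and the route are invented for the example.

<?php
// src/AppBundle/Controller/ArticleController.php - a hypothetical controller,
// shown only to illustrate how a typical Symfony action looks.
namespace AppBundle\Controller;

use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\JsonResponse;

class ArticleController extends Controller
{
    /**
     * @Route("/articles/{id}", name="article_show")
     */
    public function showAction($id)
    {
        // Doctrine (Symfony's default ORM) is fetched from the container.
        $article = $this->getDoctrine()
            ->getRepository('AppBundle:Article')
            ->find($id);

        if (!$article) {
            throw $this->createNotFoundException('Article not found');
        }

        return new JsonResponse([
            'id'    => $article->getId(),
            'title' => $article->getTitle(),
        ]);
    }
}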

Laravel

Laravel's slogan is The PHP Framework For Web Artisans, and people don't need much encouragement to use it. It's hard to compare Laravel with Symfony and decide which is better - in terms of performance and capabilities both platforms are very similar. Laravel uses many components of its older brother, so for someone who previously used Symfony some elements may look very familiar. A big advantage of Laravel is the undeniable pleasure of using it - some solutions are more intuitive than in Symfony, so development often feels easier and faster.
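For comparison, a hypothetical Laravel 5.x route using the Eloquent ORM; the App\Article model is assumed to exist only for the sake of the example.

<?php
// app/Http/routes.php (Laravel 5.2) - a hypothetical route, shown only to
// illustrate Laravel's expressive routing and Eloquent ORM.

use App\Article; // assumed Eloquent model

Route::get('/articles/{id}', function ($id) {
    // findOrFail() returns the model or throws a 404 automatically;
    // returning a model from a route serializes it to JSON.
    return Article::findOrFail($id);
});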

Phalcon

Phalcon was created in response to the growing demand for application performance. It was designed to be the fastest PHP framework and is ideal for simpler platforms and services that require high performance. Remember also that the faster your code runs, the less money you will spend on server infrastructure. The main reason for its great performance is that it is implemented as an extension of the language, so many operations run much faster. However, databases and external APIs remain part of the application and often have a significant impact on its actual speed.
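A sketch of a Phalcon micro application could look like the following; the route and its response are invented, and the phalcon extension is assumed to be installed.

<?php
// index.php - a minimal sketch of a Phalcon micro application; it assumes
// the phalcon C extension is installed and loaded.

use Phalcon\Mvc\Micro;

$app = new Micro();

// A hypothetical route; handlers are plain closures, as in other micro-frameworks.
$app->get('/articles/{id}', function ($id) {
    echo json_encode(['id' => (int) $id, 'title' => 'Example article']);
});

// Resolve the current request URI and dispatch it to the matching handler.
$app->handle();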

Virtualization

The Internet market is constantly developing, and application designs are becoming more and more sophisticated. Microservice-oriented architecture is also gaining popularity - applications are built modularly so that their flow is controllable and easier to understand. The use of multiple services, however, makes the environment more complex.

Developers need to be sure during their work that every feature works well - and the same applies to the customer's production environment. When more than one person works on a project, everyone involved needs to use the same versions of programs and services. It would be difficult to guarantee this on their own operating systems, because they cannot predict how certain elements will behave in a different environment.

Virtualization comes to the rescue: a technique that lets you simulate the hardware resources of another machine or environment. Tools such as Vagrant and Docker provide these capabilities.

Vagrant

Vagrant is a tool that lets you create a system image containing the services you need. You can use a ready-made box prepared by a company or even an individual, and the large selection of such boxes greatly simplifies and speeds up the work.

With the appropriate provisioning scripts this image can be extended and configured. So what is the advantage? We can add the configuration files to the project, so that each new team member can prepare a working environment with a single command. With a virtualization provider such as VirtualBox, developers can work efficiently on files that live on their own computers and are shared with the virtual machine.

Docker

To explain how Docker works I should first introduce the concept of a container. A container can be seen as a stack of uniform layers: a read-only image plus a writable layer on top. Thanks to that writable layer, the processes we want to run inside the container can work properly.

This description sounds pretty abstract, so let's focus on the advantages of using Docker. Working with it we follow the microservice approach and extract a separate container for each service. Thanks to containerization, however, there is no need to create several separate machines: by layering images and sharing resources between containers we get well isolated services that are easy to use. Docker's main advantages are the freedom of configuration as well as the speed and lightness of the entire runtime.

Docker lets you create your own containers and configure each of them according to your needs. It also gives you the ability to use many ready-made images, or to auto-deploy (place containers on a server) to services such as AWS or DigitalOcean.

As with Vagrant, the runtime environment can be created with a single command, which is especially convenient when more people are involved in the project. It's perfect, for example, for Phalcon, which requires PHP with the appropriate extension installed: we can use a ready-made environment, which saves valuable time and ensures that the project works the same on every computer.

Testing

The demand for advanced applications grows constantly, and it is very important that they work flawlessly. But how can we ensure that? Almost every software house has its own QA department, which impersonates users and checks for possible errors. However, this does not mean that you will end up with a mistake-free product - there is always a chance that not everything will be detected.

Different approaches

Nowadays it is the customer who dictates how the final product will look, and those decisions are made during the development process, not at the beginning of cooperation. By prioritizing and influencing the tasks performed by the development team you can modify many elements of the project. Such opportunities are provided by Agile methodologies and working in Scrum. Changing the approach to the project and creating so-called User Stories (stories describing possible user actions and their expected results) also makes it easy to formulate acceptance criteria for every feature.

TDD

Test-driven development is an agile technique of software development that follows a simple cycle: first the programmer writes an automated test for the new functionality. Then the functionality is implemented - just enough to make the test pass. In the last step the developer refactors the code to meet the expected standards, which is also a good opportunity to improve existing code.

The biggest advantage of TDD is that you can be confident about the quality of the code. When the tests are written properly, the developer can be virtually certain that a given piece of code works correctly.
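To give a flavour of the cycle, here is a minimal PHPUnit sketch; the Cart class and its methods are invented for the example, and in real TDD the test is committed first, before the implementation below it.

<?php
// tests/CartTest.php - step 1 of the TDD cycle: the test is written first
// and fails, because Cart does not exist yet.

class CartTest extends PHPUnit_Framework_TestCase
{
    public function testTotalSumsItemPrices()
    {
        $cart = new Cart();
        $cart->addItem('book', 20.00);
        $cart->addItem('pen', 5.50);

        $this->assertEquals(25.50, $cart->total());
    }
}

// Step 2: the simplest implementation that makes the test pass.
// Step 3 would be refactoring while keeping the test green.
class Cart
{
    private $items = [];

    public function addItem($name, $price)
    {
        $this->items[$name] = $price;
    }

    public function total()
    {
        return array_sum($this->items);
    }
}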

BDD

BDD is a second-generation, outside–in, pull-based, multiple-stakeholder, multiple-scale, high-automation, agile methodology. It describes a cycle of interactions with well-defined outputs, resulting in the delivery of working, tested software that matters.

[D. North]

BDD approaches the problem of testing by using and expanding the ideas of TDD. It specifies requirements in a way that is clear, understandable and as accurate as possible. As in TDD, it carries out consecutive cycles with the minimal necessary effort, which keeps the pace up, while quality is maintained thanks to the refactoring phase. A specification of a feature and its tests looks like this:

Feature: Listing command
  In order to change the structure of the folder I am currently in
  As a UNIX user
  I need to be able to see the currently available files and folders there

  Scenario: Listing two files in a directory
    Given I am in a directory "test"
    And I have a file named "foo"
    And I have a file named "bar"
    When I run "ls"
    Then I should get:
      """
      bar
      foo
      """

Formulating requirements this way is an excellent basis for writing tests. With tools such as PHP's Behat we can make the most of this approach.
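Each step of such a scenario is backed by a PHP method in a context class. Below is a sketch of what that class could look like with Behat 3; it closely follows Behat's own quick-start example, and the shell commands are only illustrative.

<?php
// features/bootstrap/FeatureContext.php - a sketch of a Behat 3 context
// that could back the scenario above.

use Behat\Behat\Context\Context;
use Behat\Gherkin\Node\PyStringNode;

class FeatureContext implements Context
{
    private $output;

    /** @Given I am in a directory :dir */
    public function iAmInADirectory($dir)
    {
        @mkdir($dir);
        chdir($dir);
    }

    /** @Given I have a file named :file */
    public function iHaveAFileNamed($file)
    {
        touch($file);
    }

    /** @When I run :command */
    public function iRun($command)
    {
        $this->output = trim(shell_exec($command));
    }

    /** @Then I should get: */
    public function iShouldGet(PyStringNode $expected)
    {
        if ((string) $expected !== $this->output) {
            throw new Exception("Actual output was:\n" . $this->output);
        }
    }
}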

Types of tests

Tests can be divided into several groups. Generalizing slightly: when we test a particular piece of code, we use unit tests. When we want to test a particular feature, for instance logging in, we use functional tests. We can also test the application completely from the outside - then we are running, for instance, UI tests.

But what are these tests for? A change in one place often affects how another piece of code behaves, so when we make small modifications while working on a feature we have no guarantee that they won't cause a problem elsewhere. Running the tests after every minor change ensures that everything still works the way it should.

Unit tests

Unit testing is the fastest form of quality assurance and should be used as often as possible in your code. In TDD it's a natural part of writing the code. By testing small pieces of code we build up coverage, which in turn gives a stronger guarantee that the whole codebase works properly. In a unit test we test a specific function or class.

Functional tests

Functional testing allows you to make sure that a feature works from the user's perspective. A good example is logging in via an API: after sending the form data, the user should receive a response that varies depending on whether the given information is correct. We test not only the success path, but also whether the application reacts properly to incorrect data.
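A sketch of such a functional test, using Symfony's built-in test client; the /api/login endpoint and its expected status codes are hypothetical.

<?php
// tests/LoginTest.php - a functional-test sketch exercising a login endpoint
// through HTTP instead of calling classes directly.

use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;

class LoginTest extends WebTestCase
{
    public function testLoginSucceedsWithValidCredentials()
    {
        $client = static::createClient();
        $client->request('POST', '/api/login', ['email' => 'user@example.com', 'password' => 'secret']);

        $this->assertEquals(200, $client->getResponse()->getStatusCode());
    }

    public function testLoginFailsWithWrongPassword()
    {
        $client = static::createClient();
        $client->request('POST', '/api/login', ['email' => 'user@example.com', 'password' => 'wrong']);

        $this->assertEquals(401, $client->getResponse()->getStatusCode());
    }
}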

Regression tests

When someone reports a bug, developers have to reproduce it - only then can they prove that it has been fixed. That's what regression tests are for. They are written before any element is repaired: first, in accordance with TDD, the developers write a test that fails, confirming the error exists. In the next steps they fix the bug so that the test passes, and then refactor the code.
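A compact, made-up illustration: the Discount class and the reported bug (a discount above 100% producing a negative price) are hypothetical, but the workflow - reproduce first, then fix - is the one described above.

<?php
// tests/DiscountTest.php - a regression-test sketch.

class Discount
{
    private $percent;

    public function __construct($percent)
    {
        $this->percent = $percent;
    }

    public function apply($price)
    {
        // Buggy version as reported: nothing prevents a negative result.
        // The fix would be: return max(0.0, $price - $price * $this->percent / 100);
        return $price - $price * $this->percent / 100;
    }
}

class DiscountTest extends PHPUnit_Framework_TestCase
{
    public function testPriceNeverDropsBelowZero()
    {
        $discount = new Discount(120);

        // Fails against the buggy code above, confirming the report; after the
        // fix it passes and stays in the suite to prevent the bug from returning.
        $this->assertEquals(0.0, $discount->apply(50.0));
    }
}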

Continuous integration

Continuous integration and deployment usually mean using tools that ensure the code on the server is up to date and works properly. These tools let us divide the whole process of changing the code into separate steps, and we can test the code on multiple machines before it is placed in the target environment. Such techniques help to ensure that the code (e.g. the one deployed to the test server) is always up to date. We achieve this by plugging the tools into a code repository (such as Github or Bitbucket) to which developers push their changes.

We can also enforce appropriate standards and guarantee that only code that has passed all the tests is placed on the server. Among the best tools for this purpose are PHPCI, Jenkins, Travis CI and Bamboo.

What can you gain by using modern frameworks?

Nowadays the market is constantly flooded with new approaches and tools that allow you to build projects faster and more professionally while ensuring safe and stable operation. The PHP language is getting more and more advanced, but in my opinion it is the ecosystem of technologies and methodologies around it that matters most in software development. With Docker we can simplify environment configuration and save significant time, and we can gain a lot by making good use of modern frameworks and continuous integration. Security comes up more and more often nowadays, so testing keeps gaining importance.

I hope that I have at least partially introduced you to the current PHP environment and made some of the concepts associated with it more understandable.
 
