In large-scale systems of embodied agents, such as robot swarms, the ability to flock is essential for many tasks. However, the conditions necessary to artificially evolve self-organised flocking behaviours remain unknown. In this paper, we study and demonstrate how evolutionary techniques can be used to synthesise flocking behaviours, in particular, how fitness functions should be designed to evolve high-performing controllers. We start by considering Reynolds' seminal work on flocking, the boids model, and design three components of a fitness function that are directly based on his three local rules to enforce local separation, cohesion and alignment. Results show that embedding Reynolds' rules in the fitness function can lead to the successful evolution of flocking behaviours. However, only local, fragmented flocking behaviours tend to evolve when fitness scores are based on the individuals' conformity to Reynolds' rules. We therefore modify the components of the fitness function so that they consider the entire group of agents simultaneously, and find that the resulting behaviours lead to global flocking. Furthermore, the results show that alignment need not be explicitly rewarded to successfully evolve flocking. Our study thus represents a significant step towards the use of evolutionary techniques to synthesise collective behaviours for tasks in which embodied agents need to move as a single, cohesive group.
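To make the idea of fitness components based on Reynolds' three local rules concrete, the sketch below scores a single agent on separation, cohesion and alignment with respect to its local neighbourhood. This is an illustrative interpretation, not the paper's exact formulation: the agent state layout, the sensing radius, the safe distance, and the scaling of each score into [0, 1] are all assumptions made here for clarity.

```python
import math

# Hypothetical parameters (not from the paper): local sensing range and
# the minimum comfortable distance between agents.
SENSE_RADIUS = 5.0
SAFE_DIST = 1.0

def neighbours(i, agents):
    """Agents within SENSE_RADIUS of agent i; state is (x, y, vx, vy)."""
    xi, yi, _, _ = agents[i]
    return [a for j, a in enumerate(agents)
            if j != i and math.hypot(a[0] - xi, a[1] - yi) <= SENSE_RADIUS]

def fitness_components(i, agents):
    """Return (separation, cohesion, alignment) scores in [0, 1] for agent i."""
    xi, yi, vxi, vyi = agents[i]
    near = neighbours(i, agents)
    if not near:
        return 0.0, 0.0, 0.0  # an isolated agent earns no reward
    # Separation: full score if no neighbour is closer than SAFE_DIST.
    dmin = min(math.hypot(x - xi, y - yi) for x, y, _, _ in near)
    sep = min(dmin / SAFE_DIST, 1.0)
    # Cohesion: closeness to the centroid of the local neighbourhood.
    cx = sum(a[0] for a in near) / len(near)
    cy = sum(a[1] for a in near) / len(near)
    coh = 1.0 - min(math.hypot(cx - xi, cy - yi) / SENSE_RADIUS, 1.0)
    # Alignment: agreement between own heading and mean neighbour heading.
    mvx = sum(a[2] for a in near) / len(near)
    mvy = sum(a[3] for a in near) / len(near)
    ang = abs(math.atan2(vyi, vxi) - math.atan2(mvy, mvx)) % (2 * math.pi)
    ang = min(ang, 2 * math.pi - ang)
    ali = 1.0 - ang / math.pi
    return sep, coh, ali
```

Summing or averaging such per-agent scores yields an individual-conformity fitness of the kind the abstract describes; the group-level variant discussed later would instead evaluate quantities over the whole swarm (e.g. a single global centroid) rather than per-agent neighbourhoods.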