We present an analysis of generative diffusion models using methods and concepts from statistical physics. We focus on the regime in which the dimension of the data and the number of training samples are large, and the score function has been trained optimally. We characterise two main phenomena that emerge during the backward generative diffusion process. Starting from pure noise, the generative dynamics first encounters a ‘speciation’ transition, where the gross structure of the data is unravelled through a mechanism analogous to symmetry breaking in phase transitions. The second phenomenon is the generalisation/memorisation transition: depending on the number of training samples, the generative dynamics can display a ‘collapse’ in which the backward trajectories become attracted to one of the memorised data points. For realistic datasets, the speciation time can be obtained from a spectral analysis of the correlation matrix of the data, and the collapse time from an estimate of the ‘excess entropy’ of the data. Analytical solutions of simple models, such as high-dimensional Gaussian mixtures, substantiate these findings and provide a theoretical framework; extensions to more complex settings and numerical experiments on real datasets confirm the predictions.
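As an illustration of the spectral criterion for the speciation time, the following minimal sketch may help. It assumes an Ornstein–Uhlenbeck forward process $x_t = x_0\,e^{-t} + \sqrt{1-e^{-2t}}\,\xi$, under which a principal data direction with covariance eigenvalue $\Lambda$ is drowned by noise once $e^{-2t}\Lambda \sim 1$, suggesting the estimate $t_s \approx \tfrac{1}{2}\ln\Lambda$. The function name, the toy dataset, and the specific normalisations are illustrative choices, not taken from the text.

```python
import numpy as np

def speciation_time(data):
    """Estimate the speciation time from the top eigenvalue of the
    empirical covariance matrix of the data.

    Assumes the forward process x_t = x_0 * exp(-t) + sqrt(1 - exp(-2t)) * xi,
    so the leading data direction emerges from the noise around
    t_s ~ 0.5 * ln(Lambda).  A sketch, not the paper's exact derivation.
    """
    X = data - data.mean(axis=0)            # centre the samples
    C = X.T @ X / X.shape[0]                # empirical covariance, d x d
    lam_max = np.linalg.eigvalsh(C)[-1]     # largest eigenvalue Lambda
    return 0.5 * np.log(lam_max)

# Toy check: two well-separated Gaussian clusters in d dimensions.
rng = np.random.default_rng(0)
d, n = 100, 2000
mu = 2.0 * np.ones(d)
labels = rng.integers(0, 2, size=n)
data = rng.standard_normal((n, d)) + np.where(labels[:, None] == 1, mu, -mu)
print(f"estimated speciation time t_s ~ {speciation_time(data):.2f}")
```

For this toy mixture the top covariance eigenvalue is close to $1 + \|\mu\|^2$, so the printed estimate grows logarithmically with the cluster separation, as the spectral criterion would predict.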
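The memorisation collapse can likewise be probed numerically. The sketch below is a hypothetical diagnostic, not the excess-entropy computation described above: it assumes the optimally trained score on a finite training set equals the exact score of the empirical measure (a Gaussian mixture centred on the shrunken training points under the same Ornstein–Uhlenbeck forward process), integrates the time-reversed SDE with Euler–Maruyama, and reports how often trajectories end near a training point. All names, the horizon `T`, and the distance threshold are illustrative.

```python
import numpy as np

def empirical_score(x, train, t):
    """Exact score of the empirical measure under the forward process
    dx = -x dt + sqrt(2) dB: a Gaussian mixture centred on the shrunken
    training points a_i * exp(-t), with variance 1 - exp(-2t)."""
    m = train * np.exp(-t)                       # shrunken training points, n x d
    delta = 1.0 - np.exp(-2.0 * t)               # noise variance at time t
    d2 = ((x[None, :] - m) ** 2).sum(axis=1)     # squared distance to each centre
    logw = -d2 / (2.0 * delta)
    w = np.exp(logw - logw.max())
    w /= w.sum()                                 # softmax responsibilities
    return (w[:, None] * (m - x[None, :])).sum(axis=0) / delta

def backward_sample(train, T=5.0, steps=500, rng=None):
    """Euler-Maruyama integration of the reverse-time SDE
    dx = [-x - 2 * score(x, t)] dt + sqrt(2) dB, from t = T down to t ~ 0."""
    if rng is None:
        rng = np.random.default_rng()
    x = rng.standard_normal(train.shape[1])      # start from stationary noise
    dt = T / steps
    for k in range(steps):
        t = T - k * dt
        x = x + dt * (x + 2.0 * empirical_score(x, train, t))
        x = x + np.sqrt(2.0 * dt) * rng.standard_normal(train.shape[1])
    return x

rng = np.random.default_rng(1)
d, n = 20, 10                                    # few data points in high dimension
train = 3.0 * rng.standard_normal((n, d))
ends = np.array([backward_sample(train, rng=rng) for _ in range(50)])
dists = np.linalg.norm(ends[:, None, :] - train[None, :, :], axis=2).min(axis=1)
print(f"fraction collapsed onto a training point: {(dists < 1.0).mean():.2f}")
```

With so few training points relative to the dimension, most backward trajectories should end within the threshold of a memorised point; increasing `n` at fixed `d` is the regime in which the collapse is expected to recede.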