To increase the visual fidelity of my planet, I decided to improve the lighting. The lighting was initially based on the spherical normal to make the planet stand out more. However, it made the land look flat and unrealistic.
The problem I had was that I didn't know where the next vertex would be, as the vertices are created by the tessellation stage. However, I realised that what I did have was the fBm value at neighbouring points.
To calculate the vertex normal, I had to create two offset points that would sit on the surface of the sphere. This meant I had to create the points at the same time as the main vertex and then transform them in the same way, since once a vertex has been transformed I can no longer tell where its neighbouring vertices will be. After positioning the two points next to the main vertex on the sphere, I ran them through the fBm function to find their value and then displaced them accordingly. Finally, I calculated the cross product between the two edge vectors to get the vertex normal, and voilà: improved lighting.
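The steps above can be sketched CPU-side in C++ (the real version would live in the shader). The `fbm` function here is a simple stand-in for the actual noise, and all the helper names are illustrative, not the actual implementation:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Basic vector helpers.
Vec3 add(Vec3 a, Vec3 b)   { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
Vec3 sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
Vec3 scale(Vec3 v, float s){ return { v.x * s, v.y * s, v.z * s }; }
Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}
Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Stand-in for the real fBm noise: any smooth height function works here.
float fbm(Vec3 p) { return 0.05f * std::sin(10.0f * p.x) * std::cos(10.0f * p.z); }

// Push a unit-sphere point outwards along its own direction by the fBm height.
Vec3 displace(Vec3 onSphere) { return scale(onSphere, 1.0f + fbm(onSphere)); }

// Build the vertex normal from two offset points created alongside the main vertex.
Vec3 surfaceNormal(Vec3 p, float eps) {
    // Two tangent directions on the sphere at p (p itself is the sphere normal
    // on a unit sphere); 'up' is any axis not parallel to p.
    Vec3 up = (std::fabs(p.y) < 0.99f) ? Vec3{0.0f, 1.0f, 0.0f}
                                       : Vec3{1.0f, 0.0f, 0.0f};
    Vec3 t1 = normalize(cross(up, p));
    Vec3 t2 = normalize(cross(p, t1));
    // Offset points re-projected onto the sphere, then displaced exactly like
    // the main vertex.
    Vec3 pa = displace(normalize(add(p, scale(t1, eps))));
    Vec3 pb = displace(normalize(add(p, scale(t2, eps))));
    Vec3 p0 = displace(p);
    // Cross product of the two edge vectors gives the outward-facing normal.
    return normalize(cross(sub(pa, p0), sub(pb, p0)));
}
```

The key point, as in the post, is that the offset points must go through the same projection and displacement as the main vertex; only then do the two edges lie on the displaced surface and their cross product give a usable normal.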
Unfortunately, the lighting does highlight the issues with my merging technique.
I could alleviate this, however, by moving the lighting into the pixel shader. This isn't a simple task, though: I'd need to set up another constant buffer (to avoid passing an unchanging 4x4 matrix into my pixel shader) as well as make sure the memory is correctly aligned on the application side.