Here is an example of green-screen compositing I did as a test, using footage shot on the stage at the London Film School. The shot uses a moving camera (a forward dolly on a fixed lens), so it is a bit more involved than many of the green-screen examples found online. Below is the original footage.
I tracked the camera using Nuke's CameraTracker node, which produces a 3D scene in which I could place billboards for the background and a tree to give some depth and parallax. I also added a snow particle system to give the background a bit more life. The biggest difficulty in tracking the camera was that we were on a practice set, which was not the sturdiest construction: the walls actually move when the actor gets up, which throws off the camera solve. I had to be quite thorough in discarding bad tracks and masking off areas of movement to get a good camera from the sequence. There were some markers on the screen itself, but in the end these caused more problems than they solved, and were difficult to matte out of the video.
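In Nuke this clean-up is done interactively in the CameraTracker panel, but the underlying idea is simple enough to sketch: reject any track whose solve (reprojection) error is too high, and any track that falls inside a region you know is moving, such as the unstable wall. The snippet below is a standalone illustration of that filtering logic, not Nuke's API; the track format, threshold value, and mask boxes are all hypothetical.

```python
def reject_bad_tracks(tracks, max_rms_error=1.0, masked_regions=()):
    """Filter 2D feature tracks before (re)solving the camera.

    tracks: list of dicts with 'pos' (x, y) in pixels and 'error'
            (RMS reprojection error in pixels) -- hypothetical format.
    masked_regions: (xmin, ymin, xmax, ymax) boxes covering areas of
            scene movement, e.g. the wall that shifts when the actor stands.
    """
    def in_mask(x, y):
        return any(x0 <= x <= x1 and y0 <= y <= y1
                   for x0, y0, x1, y1 in masked_regions)

    return [t for t in tracks
            if t['error'] <= max_rms_error and not in_mask(*t['pos'])]


# Example: one good track, one with a high solve error, one on the moving wall.
tracks = [
    {'pos': (120, 300), 'error': 0.4},   # solid track on static set
    {'pos': (800, 200), 'error': 3.2},   # high error: discard
    {'pos': (500, 450), 'error': 0.5},   # low error but inside the wall mask
]
good = reject_bad_tracks(tracks, max_rms_error=1.0,
                         masked_regions=[(450, 400, 600, 550)])
```

Only the first track survives; the other two would otherwise bias the camera solve, which is exactly the patchwork of rejections described above.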
The green-screen matte itself was also quite difficult to extract, mainly because the lighting was inconsistent across the backdrop (which was a bit too close to the set due to space restrictions). The camera (an Arri Alexa) was also set to 200 ISO at the tutor's instruction. I think this was a mistake: in my experience the Alexa works best at 800 ISO, where it delivers good dynamic range both above and below the exposure point. The footage was quite grainy, which led to some patchiness in the generated matte. In the end I used a combination of different keying techniques to extract the matte, along with some roto to clean up the patchiness on the wall.
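One of the simplest keying techniques in that combination is a screen-difference matte: wherever green dominates red and blue, the pixel is treated as screen and made transparent. The NumPy sketch below shows the core arithmetic; the function name and `gain` parameter are my own, and a real key (as in Nuke's keyers) layers edge treatment, despill, and roto on top of this, as described above.

```python
import numpy as np

def green_difference_matte(rgb, gain=2.0):
    """Basic green-screen difference matte.

    rgb: float array of shape (..., 3) with values in [0, 1].
    Returns an alpha array: 0 where the pixel is pure screen,
    1 where it is clearly foreground. `gain` hardens the matte;
    grainy footage needs a gentler gain to avoid patchiness.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    screen_amount = g - np.maximum(r, b)   # how "green-screen" the pixel is
    return np.clip(1.0 - gain * screen_amount, 0.0, 1.0)


# A saturated screen pixel keys out; a skin-tone pixel stays solid.
pixels = np.array([[0.1, 0.9, 0.1],    # green screen
                   [0.8, 0.5, 0.4]])   # foreground
alpha = green_difference_matte(pixels)
```

With uneven screen lighting, `screen_amount` varies across the backdrop, so a single gain leaves semi-transparent patches; that is why the grainy, unevenly lit plate here needed several keys plus roto rather than one pass of this formula.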
Below is the Nuke graph for the composite, and the final composite result.