I am currently investigating the behaviour of a distributed generator following an islanding event (loss of mains). My model consists of an ideal source representing the generation from the rest of the power system, a simple distribution network with some loads, and a single synchronous generator.

I apply the islanding event at t = 15 s. Because the load exceeds the local generation, I expect the island frequency to fall. I measure the frequency in two ways: from the rotor speed in the 'm' measurement output of the Simplified Synchronous Machine block, and from a PLL block. However, the two measurements show different behaviour. Furthermore, following the islanding event, the power output of the synchronous machine increases.

I don't understand what is going on with my model. Could someone please shed some light on this?
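For context, my expectation of a falling frequency comes from the swing equation, with the powers expressed in per unit on the machine base:

\frac{df}{dt} = \frac{f_0}{2H}\left(P_m - P_e\right)

so with P_e > P_m after islanding, df/dt is negative. As a rough illustrative check (H = 3 s, f_0 = 50 Hz and a 0.2 pu deficit are assumed numbers, not taken from my model): df/dt = 50/(2·3) × (−0.2) ≈ −1.7 Hz/s.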
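In case it matters, this is a sketch of how I compare the two frequency signals after the run. It assumes the per-unit rotor speed from the machine's 'm' bus and the PLL output are logged to the workspace as timeseries via To Workspace blocks; the names w_pu and f_pll and the 50 Hz base are placeholders for my actual signals:

% Compare machine rotor speed with the PLL frequency measurement.
% w_pu  : per-unit rotor speed logged from the 'm' bus (timeseries, assumed name)
% f_pll : PLL frequency output in Hz (timeseries, assumed name)
f_nom = 50;                      % assumed nominal system frequency, Hz
f_machine = w_pu.Data * f_nom;   % convert per-unit rotor speed to Hz

plot(w_pu.Time, f_machine, 'b', f_pll.Time, f_pll.Data, 'r--');
xlabel('Time (s)');
ylabel('Frequency (Hz)');
legend('Rotor speed (machine m output)', 'PLL measurement');
grid on;

Plotted this way, the rotor-speed trace and the PLL trace diverge after t = 15 s, which is the discrepancy I am asking about.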