In an attempt to make this thread a little less vague:
There are two types of inverters. One is used off-grid, turning battery DC into AC for a local grid. The other, the topic of this thread, is a grid-tie inverter that takes DC (from PV, wind, or hydro) and turns it into grid AC. The two may seem similar, but there is a very fundamental difference between them.
The first, the off-grid inverter that runs off batteries, creates the Voltage and frequency; it controls both. It tries very hard to keep the Voltage of the little grid it produces constant, much like a battery would. It functions as a Voltage source.
The second, the grid-tie inverter, has its micro-controller set up to follow the grid Voltage and frequency. It does not create them, it merely follows what's already there. It tries very hard to keep its output current constant (as dictated by its MPPT algorithm, which tries to get the most out of the DC source). As such it doesn't 'care' what the Voltage is (within limits), and if it needs to raise its output Voltage a little to keep that current flowing, it will do so. It functions as a current source.
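To make the Voltage-source vs. current-source distinction a bit more concrete, here is a very rough Python sketch. It is not real inverter firmware: the PV curve, the setpoints, and every number in it are invented for illustration. The off-grid unit simply holds a Voltage setpoint; the grid-tie unit measures the grid and only decides how much current to push, with a toy perturb-and-observe MPPT loop setting that current reference.

```python
# Very rough sketch of the difference described above -- not real inverter
# firmware; the PV curve and every number below are invented for illustration.

def pv_current(v_pv):
    """Toy PV panel: roughly constant current that collapses near 400 V (made up)."""
    return max(0.0, 9.0 * (1.0 - (v_pv / 400.0) ** 3))

def mppt_step(v_pv, step, last_power):
    """One perturb-and-observe step: nudge the DC operating point, keep the
    direction if power went up, reverse it if power went down."""
    power = v_pv * pv_current(v_pv)
    if power < last_power:
        step = -step
    return v_pv + step, step, power

# Off-grid inverter: it *creates* the Voltage and frequency (a Voltage source).
v_setpoint = 230.0       # the off-grid unit holds this, whatever current the loads draw

# Grid-tie inverter: it measures whatever Voltage the grid already has and only
# chooses how much current to inject; the MPPT loop decides that reference
# (a current source).
v_grid_measured = 230.0  # read from the grid, not created by this inverter
v_pv, step, last_p = 200.0, 5.0, 0.0
for _ in range(60):
    v_pv, step, last_p = mppt_step(v_pv, step, last_p)
i_ref = last_p / v_grid_measured

print(f"off-grid: hold {v_setpoint:.0f} V; "
      f"grid-tie: MPPT settled near {v_pv:.0f} V DC (~{last_p:.0f} W), "
      f"so inject ~{i_ref:.1f} A into the {v_grid_measured:.0f} V it measures")
```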
The grid-tie inverter normally raises the grid Voltage (to be able to pump that current into the grid). Usually just a little, sometimes a lot; I've seen sites where pushing 10 kW into the grid raised it by 15 Volts. It all depends on the resistance/impedance that inverter 'sees' looking into the grid.
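You can work that 10 kW / 15 Volt example backwards to see what kind of grid impedance it implies. Assuming a 230 V single-phase connection and treating the grid as purely resistive (both assumptions on my part, not stated above):

```python
# Back-of-the-envelope for the 10 kW / 15 V example above.
# Assumes a 230 V single-phase connection and a purely resistive grid path.
p_injected = 10_000.0            # W pushed into the grid
v_nominal  = 230.0               # V at the point of connection (assumed)
i = p_injected / v_nominal       # ~43.5 A of injected current
delta_v = 15.0                   # observed Voltage rise
z_grid = delta_v / i             # impedance the inverter 'sees' looking into the grid
print(f"I = {i:.1f} A, grid impedance ~ {z_grid:.2f} ohm")
# -> roughly 0.35 ohm; a stiffer (lower-impedance) connection would rise far less.
```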
Where that current (and power) goes is of no concern to the inverter; much is likely to be used locally, since that is simply the lowest resistance path. How it's distributed between your house, the neighbours, or the rest of the grid is entirely a matter of Ohm's and Kirchhoff's laws (much like several water pumps supplying a set of hoses; where the water goes depends on the resistance to flow in each direction).
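To put one concrete (and entirely invented) number on that: at the connection point Kirchhoff's current law just says the injected current equals what the local loads draw plus what flows out to the grid, so if the house happens to be using 3 kW while the inverter is pushing 10 kW, the remaining 7 kW is exported.

```python
# Kirchhoff's current law at the house connection point (all numbers invented).
# The inverter injects what its MPPT dictates, the house loads draw what they
# draw at grid Voltage, and the difference simply flows out to (or in from) the grid.
v_grid     = 230.0                 # V at the connection point (assumed)
p_inverter = 10_000.0              # W the grid-tie inverter is pushing
p_house    = 3_000.0               # W the house loads happen to be using

i_inverter = p_inverter / v_grid   # current injected by the inverter
i_house    = p_house / v_grid      # current drawn by local loads
i_export   = i_inverter - i_house  # KCL: the rest flows to the neighbours / grid
print(f"injected {i_inverter:.1f} A, used locally {i_house:.1f} A, exported {i_export:.1f} A")
```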
The hardware for off-grid (Voltage source) and grid-tie (current source) inverters is pretty much the same. The difference is almost entirely in the programming of the micro-controller that runs the show.