Real hydro uses dams and either builds up a high-head, low-flow reservoir or simply backfloods a massive area to establish a small potential energy difference across a -lot- of area. Sticking out a paddle and hoping to get work from it is about as effective as water-milling (actually, it's exactly that effective: OK for slow grinding, sucks for actual power production).
Given that the physics are over-simplified (and yet still lag-inducing enough), geological engineering to maximize hydro or wind power is already a lost cause. Making those generators more powerful probably isn't realistic anyway.
As for solar, there's a finite amount of energy per square meter on any earth-like planet.
"Deserts, with very dry air and little cloud cover, receive the most sun—more than six kilowatt-hours per day per square meter. Northern climates, such as Boston, get closer to 3.6 kilowatt-hours. Sunlight varies by season as well, with some areas receiving very little sunshine in the winter. Seattle in December, for example, gets only about 0.7 kilowatt-hours per day. It should also be noted that these figures represent the maximum available solar energy that can be captured and used, but solar collectors capture only a portion of this, depending on their efficiency. For example, a one square meter solar electric panel with an efficiency of 15 percent would produce about one kilowatt-hour of electricity per day in Arizona. "
http://www.ucsusa.org/clean_en…w-solar-energy-works.html
I'm not sure how an eU relates to a kilowatt, but let's compare some machines. A macerator costs 2 eU/t to run, and it's effectively a giant industrial grinder/blender. Such a thing would probably draw around 20 amps at 120 V (roughly 2.4 kW), though more often it would run on a 3-phase motor at higher voltages.
But wait, when I try to convert that I get eU per tick; how does that relate to watt-hours? A Minecraft day is 24000 game ticks (12000 redstone ticks) standing in for 24 hours, which makes each tick worth 3.6 simulated seconds and a simulated hour worth 1000 ticks. At 2 eU/t, the macerator draws about 2 KeU per simulated hour, which sounds close enough to my guesstimate. It also means eU/t behaves like a per-second rate (watts), not a per-hour amount (watt-hours).
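Here's that tick math written out as a quick Python scratch-pad (the grinder wattage is just my 20 A at 120 V guess from above, not a measured figure):

```python
# Scratch math for the tick-to-time conversion above.
TICKS_PER_DAY = 24000                    # one full Minecraft day/night cycle
SIM_SECONDS_PER_DAY = 24 * 3600          # the cycle stands in for 24 hours

sim_seconds_per_tick = SIM_SECONDS_PER_DAY / TICKS_PER_DAY   # 3.6
ticks_per_sim_hour = TICKS_PER_DAY / 24                      # 1000

macerator_eu_per_tick = 2
macerator_eu_per_sim_hour = macerator_eu_per_tick * ticks_per_sim_hour   # 2000 eU

real_grinder_watts = 20 * 120            # my rough guesstimate, ~2.4 kW

print(sim_seconds_per_tick, ticks_per_sim_hour)
print(macerator_eu_per_sim_hour, real_grinder_watts)
```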
So the conversion from that per-second rate to a per-hour figure is a factor of 3600.
Let's see if that makes sense:
Coal: 5 eU/t = 18 kWh per hour
Nuke: 35 eU/t = 126 kWh per hour
Solar: 1 eU/t (while active) = 3.6 kWh per hour
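Here's that conversion as a little Python sketch; reading each eU as roughly one watt-hour is purely my own guess, not anything official from the mod:

```python
# My rough eU/t -> kWh-per-hour conversion: treat each tick like a second,
# multiply the per-tick rate by 3600 seconds per hour, and read the result
# in watt-hours (the eU ~= Wh reading is my assumption).

def eu_per_tick_to_kwh_per_hour(eu_t):
    return eu_t * 3600 / 1000.0

for name, eu_t in [("Coal", 5), ("Nuke", 35), ("Solar (active)", 1)]:
    print(f"{name}: {eu_t} eU/t = {eu_per_tick_to_kwh_per_hour(eu_t):.1f} kWh per hour")
```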
Actually, yes, for real solar that 3.6 figure is supposed to be the output per day, not per hour. Dividing the in-game solar rate by about 10 (or by 5, if the panels are meant to represent the most effective real ones) would match reality.
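A quick check of that factor-of-10 claim, using only the raw-sunlight figures from the quote and assuming an IC2 panel is active for about half of the 24-hour simulated day (the 12-hour daylight window is my assumption):

```python
# Compare IC2 solar output per simulated day against the real-world
# daily sunlight figures quoted above.

ic2_solar_kwh_per_hour = 3.6             # 1 eU/t via the conversion above
active_hours_per_sim_day = 12            # assumed daylight fraction
ic2_kwh_per_sim_day = ic2_solar_kwh_per_hour * active_hours_per_sim_day   # ~43

real_sunlight_kwh_per_day = {
    "desert": 6.0,                       # from the UCS quote
    "Boston": 3.6,                       # from the UCS quote
}

for place, real in real_sunlight_kwh_per_day.items():
    print(f"{place}: IC2 solar is ~{ic2_kwh_per_sim_day / real:.0f}x stronger per day")
```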
However, the upper end of the scale is also distorted; without resorting to CARUC (a.k.a. CASUC) methods, which keep the reactor alive by automatically feeding in renewable single-use coolants, IC2 reactors can't even begin to approach Gen-I nukes.
The most efficient basic stable design produces 35 eU/t, about 7 million eU over the lifetime of its three fuel cells. That should be roughly comparable to an early-to-mid Gen-I nuke from real life. The very first commercial fission reactors ( http://en.wikipedia.org/wiki/Nuclear_Reactor ) all seem to be in the neighborhood of 50-200 MW, i.e. 50,000-200,000 kWh per hour. Our reactors produce about 1/1000th of that.
Even the crazy CARUC reactors only come within spitting distance, at about 2.3 MWh per hour (for a 640 eU/t design).
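The reactor arithmetic, for anyone who wants to poke at it; the 10,000-second (200,000-tick) uranium cell lifetime is from memory, so treat it as an assumption:

```python
# Reactor output check, reusing the eU/t -> kWh-per-hour conversion above.
CELL_LIFETIME_TICKS = 200_000            # assumed 10,000-second uranium cell lifetime

def kwh_per_hour(eu_t):
    return eu_t * 3600 / 1000.0

print("Stable 35 eU/t design, total:", 35 * CELL_LIFETIME_TICKS, "eU")           # 7,000,000
print("Stable 35 eU/t design, rate:", kwh_per_hour(35), "kWh per hour")          # 126
print("CASUC 640 eU/t design, rate:", kwh_per_hour(640) / 1000, "MWh per hour")  # ~2.3

for gen1_mw in (50, 200):                # first commercial reactors, per the Wikipedia link
    print(f"IC2 vs a {gen1_mw} MW Gen-I plant: about 1/{gen1_mw * 1000 / kwh_per_hour(35):.0f}")
```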
If solar's nerfed, nuke should be boosted further. If it's just about game mechanics, then change the way reactors behave.
Also, it would be far more realistic if generators (the burning/lava ones) and nukes produced heat that had to be exchanged through chain-attached, windmill-like turbines (possibly water-wheel-like ones), and if nukes needed to be hooked up to an external heat sink. The reactor chambers could be fold-ins that modify the damage value and change which cells get processed (the remaining cells would not be processed and could be filled with random junk that would not be ejected).