The evolution of electric power generation

Electricity generation is the process of producing electric power from sources of primary energy.

In 1831, the British physicist and chemist Michael Faraday, widely acknowledged as the ‘father of electricity’, discovered electromagnetic induction: moving a wire through a stationary magnetic field induces an electrical current in the wire. This discovery led him to build the first electric generator[1], whose principles still underpin virtually all electric power generation today, whatever the primary source of energy used, be it coal, oil, gas, hydro or nuclear.
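For readers who want the underlying relationship, Faraday’s observation is captured, in modern notation that Faraday himself did not use, by his law of induction: the electromotive force induced in a circuit equals the rate of change of the magnetic flux through it.

```latex
% Faraday's law of induction (modern notation)
\mathcal{E} = -\frac{d\Phi_B}{dt}
% \mathcal{E}: induced electromotive force (volts)
% \Phi_B: magnetic flux through the circuit (webers)
```

The minus sign (Lenz’s law) indicates that the induced current opposes the change in flux that produced it.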

The world’s first power plants nonetheless used either hydro power or coal. In 1882, American inventor Thomas Edison realized his dream of generating electricity on a wide scale with the opening of the first central power station, at Pearl Street in Manhattan, which was powered by coal-fired steam engines and used direct current (DC) to supply public lighting in New York[2].

The advent of alternating current (AC), which, unlike the DC systems of the time, allowed electric power to be converted to different voltages, made central power stations more commercially viable and accelerated advances in both hydro and coal generation. The evolution of coal power in particular was fast, driven by ever-increasing demand and a growing, dynamic mining sector[3].
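The voltage conversion that gave AC its commercial edge is performed by transformers, which work only with alternating current. As a simple sketch, an ideal (lossless) transformer relates the primary and secondary voltages to the number of turns in its two windings:

```latex
% Ideal transformer: voltage ratio equals turns ratio
\frac{V_s}{V_p} = \frac{N_s}{N_p},
\qquad V_p I_p = V_s I_s \quad \text{(power conserved, losses ignored)}
```

Stepping the voltage up for transmission lowers the current, and with it the resistive losses in the lines, which is what allowed a central station to serve customers over a much wider area.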

Gas came next, but gas-generated power did not really take off until the early 20th century, when “innovations in aircraft technology, and engineering and manufacturing advancements during both World Wars propelled gas power technology to new heights”[4].

Then, with the inauguration of the Experimental Breeder Reactor I in Idaho in 1951, the world welcomed the first nuclear reactor to produce electricity[5]. Many more nuclear reactors were built in the 1960s and 1970s, not least by France, which today still generates a higher share of its electricity from nuclear energy than any other country, around 70%[6].

However, despite producing emission-free electricity and having a low mining and transportation footprint, nuclear energy’s popularity has waned, due in part to accidents such as Chernobyl in Ukraine in 1986 and the disaster at Fukushima Daiichi in Japan in 2011, which followed an earthquake and tsunami.

Accidents such as these have increased calls to replace nuclear with safer and cleaner sources of energy. Yet many believe that nuclear continues to have an important role to play in the transition to a low-carbon future and in meeting the United Nations Sustainable Development Goals and the targets set out in the Paris Agreement. In fact, according to the International Energy Agency (IEA), of the 163 Gt of CO2 emissions avoided through the use of low-carbon energy between 1970 and 2013, 41% can be attributed to nuclear power and only 6% to solar and wind[7].
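As a rough check on the scale of those cited percentages, they correspond to roughly:

```latex
0.41 \times 163\ \text{Gt} \approx 67\ \text{Gt (nuclear)},
\qquad
0.06 \times 163\ \text{Gt} \approx 10\ \text{Gt (solar and wind)}
```

In other words, on the IEA’s figures, nuclear accounted for several times more avoided CO2 over that period than solar and wind combined.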

The use of solar, wind and other renewable energy is nevertheless on the rise. In addition to avoiding or significantly reducing CO2 emissions, renewable energy sources are seen as providing enhanced energy security, with a lower risk of fuel spills, greater price stability and greater potential for job creation[8]. Renewable energy, however, is not a new invention: humans have harnessed the power of the sun, wind and water for millennia. Modern renewable generation, which began with hydropower in the late 19th century and was followed by wind and solar, is constantly being enhanced and improved. Thanks in large part to advances in battery storage, it now delivers greater, more stable power output and is more cost-competitive than ever before.