When you think about the devices and gadgets that anchor our daily lives, rechargeable batteries power nearly all of them. Yet as the array of devices, and the new features available to them, has expanded, batteries are increasingly falling short on capacity, charging time and lifespan.
Even before the rise of today's silicon batteries for AI-enabled smartphones, EVs, and eVTOLs, it took the industry more than a century to refine rechargeable batteries. In this first installment of a two-part series, we explore the history of the rechargeable battery, from its beginnings in the 1800s to how it came to underpin modern technology and energy storage across applications.
Rechargeable batteries in the age of steam trains and pocket watches
The earliest rechargeable battery was the lead-acid battery, invented in 1859 by French physicist Gaston Planté. It emerged at a time when horse-drawn carriages and steam trains were still the primary modes of transportation, and it proved a capable power source for the lights on railroad carriages while they sat at stations. Excellent at delivering short, strong bursts of energy, the lead-acid battery is still widely used today in gas-powered and electric vehicles, cell towers, data centers and more. It remains a particularly appealing option in emerging markets, where many governments are looking for a cost-effective way to electrify mobility.
While it has withstood the test of time, the technology also comes with a host of disadvantages. One of the primary reasons the industry has tried to move away from lead-acid is the hazard posed by its chemical components, toxic lead and sulfuric acid, which can cause chemical burns and lead poisoning if mishandled. Two other reasons lead-acid has lost its shine are its limitations in size and charging time. A lead-acid battery tends to be large and bulky, making it incompatible with today's personal electronic devices, and only about 30 to 50 percent of its rated capacity is usable in practice (roughly speaking, a nominal 100 Ah battery may deliver only 30 to 50 Ah before deep discharge begins to shorten its life). Charging is slow, too: the last 20 percent of capacity can take up to 80 percent of the charging time, and routinely unplugging before a full charge prematurely ages the cell.
Needless to say, it is only a matter of time before lead-acid batteries are replaced by chemistries better suited to today's innovations, even if the batteries that followed lead-acid faced challenges of their own.
Powering the birth of consumer electronics
Four decades after Planté's invention, Swedish inventor and engineer Waldemar Jungner invented the nickel-cadmium (NiCd) battery in 1899. By the 1960s, nickel-cadmium batteries had been commercialized in Japan by companies including Panasonic to enable the electrification of everyday items such as razors and toothbrushes, before eventually powering generational icons like the Walkman. Today, NiCd batteries are often used for portable, high-power applications such as power tools, emergency lighting and backup systems, and other industrial use cases.
While versatile enough to keep pace with the new technology emerging around this time, NiCd batteries shared many of lead-acid's shortcomings. Where lead-acid struggled when discharged beyond a portion of its total capacity, NiCd suffered heavily from the "memory effect": when repeatedly recharged after only partial discharge, the cells appear to "remember" the shallower cycle and gradually lose usable capacity, compromising performance in devices. Not to mention, NiCd costs significantly more than lead-acid due to its manufacturing process, and it has since been banned in many regions given cadmium's high toxicity.
NiCd batteries did the trick at the time for everyday devices that could be replaced easily, but they could not deliver the energy required for the next big leap in electric applications.
The dawn of mass-market hybrid and electric vehicles
Soon after nickel-cadmium came the nickel-metal hydride (NiMH) battery, whose development was sponsored by two major European automakers, Daimler-Benz and Volkswagen AG. After entering the market in 1989, NiMH began to displace NiCd as the go-to battery for portable electronics. In 1997, Toyota chose NiMH as the battery for its new Prius model, the world's first mass-produced hybrid EV. By 2008, NiMH batteries powered more than two million hybrid cars globally.
Compared to earlier chemistries, NiMH offered roughly twice the energy density of lead-acid and three times that of conventional NiCd batteries. However, its energy density was still no match for the technologies to come, including lithium-ion (Li-ion) and silicon: NiMH cells took up more space in compact applications and remained prone to self-discharge, losing charge even when not in use.
Charging toward the future
Since 1859, each evolution of the rechargeable battery has delivered notable improvements in cathode chemistry, cell design and active materials. But those gains came when our phones plugged into wall outlets and had mechanical rotary dials. Those phones didn't run AI-driven features at the edge or lean on cloud services powered by massive data centers around the world. We didn't have the caliber of technology we do today, so we had little reason to question the pace of improvement.
Before the turn of the 20th century, we had no idea how reliant we would become on batteries. Now, with high-performing smartwatches, smartphones, tablets, EVs and more woven into everyday life, most people depend on reliable, rechargeable batteries to navigate their daily lives.
Now that we’ve explored the origins of the rechargeable battery, stay tuned for Part II of our series to learn more about the two chemistries prevalent today: lithium-ion and silicon.