Batteries are at the core of modern life. Sure, they've been a big deal for a long time – the Energizer bunny is iconic for a good reason – but now more than ever batteries dictate our lives. They power our laptops, watches, and phones, and they're starting to power our cars and homes. It wouldn't be a wild idea to say that battery technology is the next big jump we're waiting for. Advancements in smartphones and computers are slowing down, and battery weight and size are holding electric cars back. But when it comes to those small devices, one of the biggest problems is also one of the oldest, and machine learning might be coming to the rescue.
With hardware and batteries, there's a certain impetus to improve things. All the bezel-less phones we're seeing this year look downright magical, and they're often sporting screens that are brighter and more efficient than ever. It's easy to sell that stuff, and it's easy to justify buying it, assuming you don't have to sell your organs to afford one of those phones. But one of the biggest problems so far has been one of the hardest to solve.
Over the last 20 years, the code that runs our favorite applications has become immensely more complex and incredibly difficult to optimize. When people were developing Atari games and the like, they had so little storage and processing power to work with that optimization was a necessity. Now we have processing power to spare, so that discipline is no longer required, but all that unoptimized code is still a huge drag on our batteries.