There are over 2 billion smartphones in use worldwide, with an average of over 60 apps installed on each device. When an app is running, it uses the phone’s CPU, antennas, etc., all of which require power to operate. This raises the question: How much power is being wasted due to mistakes made by developers that cause apps to use more battery power than they should?

We stumbled upon this issue ourselves while developing our own Android app. Due to an error in the code, the Windscribe Android app ended up running in the background even while disconnected, making pointless API calls to our backend, and preventing Android from shutting down the app to save battery life. This resulted in about 15% of battery power being completely wasted. Totally our bad, and we’re really sorry about this. Yesterday’s update (1.14) fixed the issue, but let’s do some napkin math and put 15% into perspective.

Before the issue was fixed, there were about 100,000 DAU (daily active users) using the app. The average phone’s battery holds about 6 watt-hours of energy. If you multiply 6 Wh x 100,000 users x 15% wastage x 30 days, you get 2700 kWh of energy wasted in a 30 day period, or about 32,400 kWh per year. According to the EPA, each kWh of electricity has an emission factor of 0.000744 metric tons of CO2, which translates to roughly 24 metric tons of CO2 per year. All from a single app, and there are millions of apps out there.
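If you want to check the napkin math yourself, here’s a quick sketch in Python. All figures are the rough estimates from the paragraph above, not measurements:

```python
# Napkin math: energy wasted by the bug and its rough CO2 equivalent.
# Every number here is an estimate from the post, not a measurement.

daily_users = 100_000       # DAU before the fix
battery_wh = 6              # average phone battery capacity, in watt-hours
waste_fraction = 0.15       # ~15% of battery wasted by the bug
days = 30

# Total energy wasted across all users over 30 days, converted to kWh
wasted_kwh_per_month = battery_wh * daily_users * waste_fraction * days / 1000
wasted_kwh_per_year = wasted_kwh_per_month * 12

# EPA emission factor: metric tons of CO2 per kWh of electricity
epa_tons_co2_per_kwh = 0.000744
tons_co2_per_year = wasted_kwh_per_year * epa_tons_co2_per_kwh

print(f"{wasted_kwh_per_month:.0f} kWh wasted per 30 days")   # 2700 kWh
print(f"{tons_co2_per_year:.1f} metric tons of CO2 per year") # 24.1 tons
```

Swap in your own numbers (battery size, user count, waste percentage) to see how fast small per-device inefficiencies add up at scale.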

Granted, most apps PROBABLY don’t waste 15% of your battery power, but even tiny inefficiencies compound given the number of apps installed on an average device.