Data, Energy, People & Planet
  • 24 Oct 2024
  • 5 Minutes to read



Some SLM Balance in the Polarized Digital Age

We need to start taking data privacy and data broking seriously. If companies adopted private SLMs rather than the huge open source LLMs offered by the big tech organizations, it would cut energy consumption and keep energy bills down for all of us long-suffering consumers, who are just about to see another big increase this winter. It's simple supply and demand on a planet of finite resources.

  • International Energy Agency (IEA): The IEA estimates that global data center electricity use (excluding cryptocurrency mining) increased by 60% between 2015 and 2022, reaching 240-340 terawatt-hours (TWh). While this doesn't isolate the past four years precisely, it shows a clear upward trend, even though the stagnating Eurozone's total energy requirements rose only 4% - rather worrying, as it suggests that shrinking industrial and economic activity figures are being massaged more than we realize.

  • Goldman Sachs research: Their analysis suggests that data center power demand could grow by another 160% by 2030, with AI being a major contributor. This indicates a rapid acceleration in energy consumption, and we know it is not for the benefit of people and planet. As key players in the VC and consulting circus, they put a positive spin on things.

  • Hyperscale Data Centers: Amazon, Microsoft, Google, and Meta saw their combined electricity use more than double between 2017 and 2021, reaching around 72 TWh. It has all gone quiet on the western front for 2021-23. This highlights the worrying and significant energy demands of large-scale data centers.

Factors Driving Increased Energy Consumption:

  • Rising demand for data: Our reliance on data-driven services, cloud computing, and internet usage continues to grow exponentially, leading to increased demand for data centers. Why do we need it all, who does it serve, and why are the huge financial losses being sustained by big tech?

  • Power-hungry AI workloads: AI applications, particularly those involving machine learning and deep learning, require substantial processing power, significantly increasing energy consumption. With SLMs it would be a different kettle of fish!

  • Slowdown in efficiency gains: While data centers have made strides in improving energy efficiency, the pace of these gains has slowed in recent years, contributing to higher energy usage. As for the greenwashing tactics being employed, Net Zero has become a farce!

The Impact on Costs:

  • Increased supply chain electricity bills: Data centers face rising electricity costs due to their growing power consumption, which can translate into higher operating expenses for companies. Renewable energy is largely owned by the fossil fuel cartel, so they are price gouging and still laughing all the way to the bank.

  • Price increases for consumers: Companies always pass these increased costs on to their customers further down the chain, and the same goes for services that rely on data centers. But that is only half of it, because the continued pressure on supply and the CFD (Contract for Difference) daily strike price system is leading to deindustrialisation and increasingly insolvent households in Europe.

  • Carbon footprint: Despite the greenwashing, common sense tells us the increased energy consumption of data centers is a carbon footprint too far, raising alarm bells about their environmental impact on a planet of finite resources.

  • Water usage: Data centers require significant amounts of water for cooling, further impacting the environment.

The solution?

Taking data privacy and responsible data brokerage seriously would contribute massively to reducing energy consumption in data centers. In doing so, it would also curb the brain (brand) washing and data snooping that lets big business kill small business, which in turn stifles the innovation needed to replace all the jobs that are going as industries deindustrialise and die.

Western economies are consumer societies, so we either earn money ourselves or rely on UBI, which will inevitably come with strings attached. A Central Bank Digital Currency-fuelled income stream would dictate what to buy, and when to buy, from the supply-chain side of the cartel. They know where you are and what you are doing, so speak out and you might just get your money docked down to basic rations, or worse!

  • Less Data Collected = Less Data to Process: Data centers use massive amounts of energy to store, process, and analyze data. If companies were more selective about the data they collect and share, there would be less data to manage, leading to lower energy demands.

  • Reduced Data Transfers: Data brokers often transfer large volumes of data between different parties. Minimizing these transfers through stricter privacy controls and data minimization practices would reduce the energy required for data transmission and storage.

  • Efficient Data Storage: Stronger data privacy regulations could encourage the development and adoption of more efficient data storage technologies, such as de-identification techniques and differential privacy, which can reduce the amount of data that needs to be stored and processed (a minimal differential-privacy sketch follows this list).

  • Empowering Individuals: Giving individuals more control over their data can lead to more conscious decisions about data sharing. This could result in less unnecessary data collection and processing, ultimately contributing to energy savings.
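
To make the differential-privacy point above concrete, here is a minimal sketch of the classic Laplace mechanism: calibrated noise is added to an aggregate statistic before it is released, so no individual record can be singled out. The count, sensitivity, and epsilon values are illustrative assumptions, not figures from this article.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return a differentially private version of an aggregate statistic.

    Noise is drawn from a Laplace distribution with scale sensitivity/epsilon,
    which masks the contribution of any single individual in the dataset.
    """
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Hypothetical example: release a user count without exposing any one user.
raw_count = 1_283  # assumed aggregate held by a data broker
private_count = laplace_mechanism(raw_count, sensitivity=1.0, epsilon=0.5)
print(f"Released count: {private_count:.0f}")
```

The smaller the epsilon, the more noise is added and the stronger the privacy guarantee; the trade-off is accuracy of the released figure.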

Private SLMs in the face of the Open LLM cartel

Small and specialized language models (SLMs) are emerging as key players in the quest for independent, privacy-led and, YES, genuinely green and energy-efficient AI.

Why SLMs are Energy-Efficient:

  • Reduced Model Size: SLMs have significantly fewer parameters than large language models (LLMs), requiring less computational power and memory for training and inference. This translates directly into lower energy consumption (a back-of-the-envelope comparison follows this list).

  • Targeted Training: SLMs are often trained on smaller, more specific datasets, reducing the energy needed for data processing and model training.

  • Optimized Architectures: Researchers are developing SLM architectures specifically designed for efficiency, further minimizing their energy footprint.

  • Edge Deployment: SLMs can be deployed on edge devices like smartphones or local servers, reducing the need to rely on energy-intensive cloud data centers.
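
As a back-of-the-envelope illustration of the reduced-model-size point, the sketch below compares approximate compute per generated token for a small and a large model, using the common rule of thumb of roughly 2 × parameters FLOPs per token. The parameter counts are assumptions chosen only to show the scale of the gap, and compute is used here as a first-order proxy for energy on comparable hardware.

```python
# Rough, illustrative comparison of inference compute for an SLM vs an LLM.
# Rule of thumb: a decoder-only transformer needs ~2 * N FLOPs per generated
# token, where N is the parameter count. The figures below are assumptions.

def flops_per_token(num_params: float) -> float:
    return 2 * num_params

slm_params = 3e9    # e.g. a ~3B-parameter specialised model (assumed)
llm_params = 175e9  # e.g. a ~175B-parameter general-purpose model (assumed)

ratio = flops_per_token(llm_params) / flops_per_token(slm_params)
print(f"Compute per generated token, LLM vs SLM: ~{ratio:.0f}x")
# On comparable hardware, that ratio is also a first-order proxy for the
# difference in energy drawn per token served.
```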

Benefits Beyond Energy Savings:

  • Reduced Latency: SLMs can provide faster responses due to their smaller size and reduced computational demands.

  • Enhanced Privacy: SLMs can be trained on private data and deployed locally, minimizing data transfer and potential privacy risks (a minimal local-inference sketch follows this list).

  • Cost-Effectiveness: SLMs are generally less expensive to develop, deploy, and maintain than LLMs.

  • Improved Accuracy: For specific tasks, SLMs can achieve comparable or even superior accuracy to LLMs due to their focused training.
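
As a minimal sketch of the enhanced-privacy point above, the snippet below runs a small model entirely on local hardware using the Hugging Face transformers pipeline, so prompts and outputs never leave the machine. The specific model and prompt are illustrative assumptions, not a recommendation from this article.

```python
# Minimal local-inference sketch: the model runs on the user's own hardware,
# so no prompt or response is sent to a cloud data center.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/phi-2",  # an example ~2.7B-parameter model; any local SLM would do
    device_map="auto",        # use a local GPU if available, otherwise CPU
)

prompt = "Summarise this customer complaint in one sentence: ..."
result = generator(prompt, max_new_tokens=60)
print(result[0]["generated_text"])
```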

Examples of SLMs in Action:

  • Grammarly: Uses SLMs for grammar and style checking.

  • SCOTi® by smartR AI: Built on a suite of SLMs that can operate within the existing infrastructure of an enterprise.

  • Customer service chatbots: Many companies use SLM-powered chatbots for efficient and personalized customer interactions.

  • Voice assistants: SLMs are used for tasks like speech recognition and natural language understanding in voice assistants.

  • Medical diagnosis: SLMs can assist in medical diagnosis by analyzing patient data and providing insights.

SLMs offer a promising pathway towards more energy-efficient and sustainable AI. By reducing computational demands, enabling edge deployment, and providing comparable performance for specific tasks, SLMs can help mitigate the environmental impact of AI while still delivering valuable benefits. Taking data privacy and data brokerage seriously also has the potential to significantly contribute to reducing energy consumption in data centers. By promoting responsible data practices, empowering individuals, and incentivizing energy efficiency through SLMs, we can move towards a more sustainable and privacy-conscious digital future.

Written by Neil Gentleman-Hobbs, smartR AI

