- British startup wants to bring computing even closer by embedding micro data centers in lampposts
- Self-destructing Nvidia chips would help keep sensitive information private
- Some challenges must be resolved before this becomes a reality
British startup Conflow Power Group Limited (CPG) has proposed a major overhaul of the world’s data centers by embedding micro data centers directly into urban street infrastructure, such as lampposts, instead of concentrating computing in the hyperscale facilities we know today.
In addition to spreading data processing across cheaper and more manageable micro-locations, the plan also centers on local solar power generation and battery backup systems to tackle one of the biggest criticisms leveled at data center campuses: sustainability and environmental impact.
Under the new proposals, CPG aims to bring AI computing closer to users and devices, which would reduce latency and ease pressure on national telecommunications infrastructure.
Future data centers may be located right outside your front door
The BBC reports that $2,000 Nvidia AI accelerators could be used in place of flagship GPUs like the H100 and B200, which cost tens of thousands of dollars per unit.
The use of self-destructing chips is also a notable addition to the scheme, with Nvidia including firmware locking, encryption and other anti-tampering protections that can effectively disable hardware if it is compromised, moved or accessed without authorization.
Such anti-tampering technologies already exist for export compliance and are generally seen when AI accelerators are sold for limited or edge deployments to ensure maximum security.
Tens of thousands of micro data centers could spread computing across cities, each handling localized AI workloads spanning applications such as traffic monitoring, CCTV, autonomous vehicle coordination, telecommunications, environmental measurement and more.
While embedding data centers in lampposts may be novel, bringing AI to the edge is a trend that is accelerating as AI workloads become more sensitive to bandwidth and latency. Placing computing physically closer to where it is used is also likely to reduce the costs associated with data transfers.
The energy implications are also compelling, with hyperscale data centers facing constraints from delayed grid connections and strained power supplies. Spreading power consumption geographically could ease current bottlenecks, where some of the largest campuses consume as much electricity as small towns.
Lampposts in particular are attractive due to their existing electricity connections, their dense distribution in urban environments such as towns and cities, and the fact that many are also already connected to fiber networks.
Many have already become multifunctional for 5G small cells, traffic camera mounts, Wi-Fi access points and electric vehicle chargers.
Interestingly, there are also significant geopolitical advantages to locating micro data centers across existing street infrastructure networks. Growing data sovereignty concerns in Europe and the UK are prompting governments to promote schemes that support more local processing.
However, for these to work at scale, CPG must address the many challenges that come with embedding data processing into lampposts. Existing infrastructure may need to be upgraded to ensure protection against weather and vandalism. There are also concerns about thermal management, with large data centers already criticized for their excessive water consumption for cooling.
While it may not be economically viable to upgrade existing infrastructure, building multifunctional networks in the future may allow a single element, such as a lamppost, to serve far more functions than we previously imagined.