Multi-year datacentre planning is more constrained than ever, with equipment and construction lead times lengthening against a background of rising requirements around storage, power and compute. With hyperscalers sitting on resources in a global game of musical chairs, how can players plan to get ahead before the music stops?
Jinender Jain, UK and Ireland sales head at IT consultancy Tech Mahindra, notes that power, cooling and space parameters should be assessed as a whole – and says it’s a “no-brainer” to adjust datacentre capacity based on business needs, demands and dynamics and allow for spare capacity that can quickly come on-stream.
However, many datacentres designed for 200W per square foot are still operating at half that wattage or less, with rack power effectively stranded. “As any datacentre manager knows, capacity planning is as much art as it is science,” says Jain.
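As a rough illustration of the stranded-capacity problem Jain describes, the gap between design density and actual draw is simple arithmetic. This sketch is hypothetical (the function and the 10,000 sq ft figure are invented for illustration; only the 200W and 100W densities come from the article):

```python
# Hypothetical sketch: how much provisioned power is stranded when a hall
# designed for 200W per square foot runs at half that draw or less.
def stranded_power_kw(floor_area_sqft: float,
                      design_w_per_sqft: float,
                      actual_w_per_sqft: float) -> float:
    """Provisioned-but-unused power for a data hall, in kW."""
    return floor_area_sqft * (design_w_per_sqft - actual_w_per_sqft) / 1000.0

# A 10,000 sq ft hall designed for 200W/sq ft but drawing only 100W/sq ft
# strands a full megawatt of paid-for capacity.
print(stranded_power_kw(10_000, 200, 100))  # 1000.0 (kW)
```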
There is little chance of a return on investment on material costs when outages strike: Uptime Institute’s 2022 Outage Analysis highlights that power and networking issues still cause multiple outages globally, with nearly 30% of major public outages in 2021 lasting more than 24 hours, compared with 8% in 2017.
“Operators still struggle to meet high standards that customers expect and service-level agreements demand – despite improving technologies and strong resiliency and downtime prevention investments,” says Andy Lawrence, Uptime Institute Intelligence executive director.
Steve Wright, chief operating officer at colocation and cloud provider 4D Data Centres, says concerns and risk factors should be linked to any multi-year plan – from data sovereignty, skills and systems to the quantity and type of cloud or datacentre environment required.
Newer cloud-first deployments can require big data analytics or artificial intelligence (AI) testing, not least because multi-MW ramp-ups can cause “astronomical” cost blow-outs. Those that deal in “bigger” data may also need to replace 1,000 servers every two or three years. Some customers might be shrinking and others growing – and it is easier to expand than shrink capacity.
Yet for many customers, beyond about 12 months out or with a budget cycle ahead, things are “quite fluffy”, says Wright. “Six months before inception, they then say ‘we need to get this nailed down’,” he adds.
4D plans 15-20 years ahead around the lifespans of mechanical and electrical equipment in its own datacentres, matching requirements against the age and state of a location. The right size of land is needed, ideally near a high-voltage connection point with capacity available, alongside dense fibre connectivity, access to a suitable workforce, and flexibility “designed in” to accommodate technological change, says Wright.
“With our Gatwick facility we thought about high-density cooling, tweaking the cooling system to enable that to happen,” he says. “Last year, we deployed immersion cooling for a customer; the year before we went with high-density, rear-door cooling on racks to support high-performance computing-type environments where a standard 7kW rack just won’t cut it.”
Supply chain constraints
Wright says large facilities may aim to plan as far ahead as 2050, but customers may have a relatively short-term view. That is on top of supply chain constraints, particularly on networking equipment, with lead times of 275 days from Cisco or Juniper.
“And if you put in for a power connection request for a datacentre right now and it’s in London, you’re probably looking at 2025 before you get your power allocated,” he adds. “Redesigns and networking are having to happen a bit more on the fly.”
Lewis White, enterprise infrastructure vice-president for Europe at CommScope, agrees that there is more pressure in today’s capacity conversations, which centre on power and network access.
“Lane speeds have risen from 40Gbps to 100Gbps, even 400Gbps in larger enterprise and cloud datacentres,” he says. “Operators are now deploying optical fibre infrastructures that can support 800Gbps and beyond – going all-in on fibre investment.”
Simon Riggs, Postgres fellow at EDB, points out that squaring monthly performance or annual recurring revenue with a demand for multi-year plans might not sit comfortably beside an agility mantra. Also, accountants rarely tie their calculations to the actual costs of various specific IT solutions and how they are managed.
“I think it’s a little bit cheeky to talk in terms of long durations,” says Riggs. “The original USP in the cloud was that you had flexibility. If you really can predict it years in advance, then why not simply go back to the old datacentre? And it’s happening when people are questioning huge cloud costs.”
Capacity requirements depend on actual volumes of business – and in the past, no one was as worried about the cost of energy. That is why technical problems often occur, such as when a burst in demand arrives sooner than expected and people are left running to keep up, says Riggs, who suggests taking another look at consumption and technology efficiency.
“Really, too much inventory is out there and people aren’t properly tracking what they’re actually doing,” he adds.
Mark Pestridge, senior director – customer experience at colo provider Telehouse, points out that acquiring or building new sites takes years – even just to secure planning permission.
“You have to really build almost floor by floor, suite by suite,” he says. “You’ve just got to continue evaluating what your clients are trying to do and piece it together. It’s like building a jigsaw without all the pieces to start with.”
It can be about ensuring secure interconnection with service providers, telcos, internet peering exchanges and cloud services providers to deliver customer choice, he says. Then it’s about every customer’s different requirements and having the ability to fulfil them more flexibly.
“Yet how do you predict what applications are going to drive [datacentre] adoption?” says Pestridge. “With the way the world is evolving, how can we predict what type of power each rack is going to need? That’s really difficult.”
Adam Bradshaw, commercial director at colo provider ServerChoice, notes that, to an extent, we have all been here before, with the 2008 financial crisis causing a similar move away from on-premises, and again during the pandemic.
“We are seeing a similar thing with this huge exponential increasing of energy costs as well,” he says. “Stuff in AI, with autonomous vehicles, is very power-hungry. They will quite happily take in excess of 20kW of rack, no problem. For more traditional lower-powered hardware, customers just want it somewhere safe.”
Datacentre operators tend to do well in times of major crisis, says Bradshaw. Because they are seen as “safe” places, crises can encourage hikes in power densities and pushes to reduce prices and power consumption. Yet so much is dependent on the customer.
Bradshaw recommends a better view of customer requirements in the discovery process. “How old is X piece of kit, what do you want to do with it and how long do you expect to run it for? What does the customer business expect to look like in 12, 24 or 36 months’ time?” he says.
“Drill down into those bits and work out what’s best. But that requires the prospect to really kind of play ball and work with us and be open to discussing these things.”
Jonathan Bridges, chief innovation officer at cloud services provider Exponential-e, suggests pay-as-you-go, consumption-based wholesale cloud models do not necessarily require much capacity planning, barring trending analysis of customers and the analysis of private, hybrid, or multi-cloud capabilities to keep costs down and boost sustainability.
Get ‘really smart’
He agrees that service providers and datacentres both need to get “really smart” about discovering the patterns in data, in storage and beyond to profile specific customer requirements in more detail.
“We need to be more predictive, looking further around at estate, infrastructure, what’s running in the datacentre, how that will evolve over time and affect capacity,” says Bridges.
That also means continually monitoring utilisation, feeding that more into historical trending, and making more use of descriptive and diagnostic analytics to make decisions, he points out.
“As we advance that, maybe more predictive analytics to try and model what will happen,” says Bridges. “The third thing is take stock of the contracts that you have, and try and do some analysis for when they refresh those contracts. Ask: what is that footprint going to look like?”
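The historical trending Bridges describes can be sketched very simply: fit a straight line to monthly utilisation history and extrapolate to see where a hall is heading. This is a hypothetical illustration, not any provider’s tooling; the utilisation figures are invented:

```python
# Hypothetical sketch: least-squares linear trend over monthly utilisation
# (as a fraction of capacity), extrapolated a given number of months ahead.
def linear_forecast(history: list[float], months_ahead: int) -> float:
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history)) \
            / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return intercept + slope * (n - 1 + months_ahead)

# Six months of (invented) utilisation history, projected a year out:
utilisation = [0.52, 0.55, 0.57, 0.60, 0.62, 0.66]
print(round(linear_forecast(utilisation, 12), 2))  # 0.98 — nearly full
```

In practice an operator would layer seasonality, contract refresh dates and confidence intervals on top of a trend this naive, which is exactly the shift from descriptive to predictive analytics that Bridges outlines.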
Erich Sanchack, chief operating officer at Digital Realty, reveals a focus on supplier-managed inventory agreements with tier-one and tier-two suppliers.
“However, it’s not an easy job, requiring commitment to standards that many providers are not able to establish,” says Sanchack. “Moreover, multi-year planning doesn’t insulate providers from regulatory and external governing factors, which can evolve at the drop of a hat.”
Simon Bennett, chief technology officer at cloud provider Rackspace, notes that hyperconverged infrastructures are “packing things in” already. Also, the rise of liquid and immersive cooling to manage power densities brings further considerations.
Will building structures cope with the weight and concentration of the racks, and what will that more robust footprint cost?
“You may have to use new facilities and less physical space,” says Bennett. “Then you need a lot of power to go into that small space. If there’s 100kW per rack, suddenly 20 racks is 2MW.”
Several physically smaller facilities might work better than a “massive” datacentre with empty space where all the power has been used up – though that can negatively impact sustainability and power consumption while increasing reliance on networking and interconnect, he says.
“Whatever capacity you’ve got, you want to drive it hard,” says Bennett. “You don’t want to leave it idle just burning electricity and incurring costs. It’s essential to do your own analytics on your demand profile.
“A lot of people still rely on spreadsheets. You need your own business intelligence around datacentre capacity. Overall, it’s probably about flexibility.”
True to form, Nothing has just announced the full reveal date for its upcoming audio product, Ear (stick).
So, an announcement about an announcement. You’ve got to hand it to Carl Pei’s marketing department, they never miss a trick.
What we’re saying is that although we still have ‘nothing’ conclusive about the features, pricing or release date for the Ear (stick) except an image of another model holding them (and we’ve seen plenty of those traipsing down the catwalk recently), we do have a date – the day when we’ll be granted official access to this information.
That day is October 26. Nothing assures us that on this day we’ll be able to find out everything, including pricing and product specifications, during the online Ear (stick) Reveal, at 3PM BST (which is 10AM ET, or 1AM on Wednesday if you’re in Sydney, Australia) on nothing.tech.
Any further information? A little. Nothing calls the Ear (stick), which is now the product’s official name, “the next generation of Nothing sound technology”, and its “most advanced audio product yet”.
But that’s not all! Apparently, Ear (stick) are “half in-ear true wireless earbuds that balance supreme comfort with exceptional sound, made not to be felt when in use. They’re feather-light with an ergonomic design that’s moulded to your ears. Delivered in a unique charging case, inspired by classic cosmetic silhouettes, and compactly formed to simply glide into pockets.”
Opinion: I need more than a lipstick-style case
Nothing Ear (stick) – official leaked renders pic.twitter.com/FrhKmRttmi – October 1, 2022
Aside from this official ‘news’ from Nothing, leaked images and videos of the Ear (stick) have been springing up all over the internet (thank you, developer Kuba Wojciechowski) and they depict earbuds that look largely unchanged, which is a shame.
For me, the focus needs to shift from gimmicks such as a cylindrical case with a red section at the end which twists up like a lipstick. Don’t get me wrong, I love a bit of theater, but only if the sound coming from the earbuds themselves is top dog.
See, that lipstick case shape likely will not support wireless charging. That and the rumored lack of ANC mean the Ear (stick) is probably arriving as the more affordable option in Nothing’s oeuvre.
For now, we sit tight until October 26.
Becky is a senior staff writer at TechRadar (which she has been assured refers to expertise rather than age) focusing on all things audio. Before joining the team, she spent three years at What Hi-Fi? testing and reviewing everything from wallet-friendly wireless earbuds to huge high-end sound systems. Prior to gaining her MA in Journalism in 2018, Becky freelanced as an arts critic alongside a 22-year career as a professional dancer and aerialist – any love of dance starts with a love of music. Becky has previously contributed to Stuff, FourFourTwo and The Stage. When not writing, she can still be found throwing shapes in a dance studio, these days with varying degrees of success.
You might soon have to buy YouTube Premium to watch 4K YouTube videos, a new user test suggests.
According to a Reddit thread highlighted on Twitter by leaker Alvin, several non-Premium YouTube users have reported seeing 4K resolution (and higher) video options limited to YouTube Premium subscribers on their iOS devices. For these individuals, videos are currently only available to stream in up to 1440p (QHD) resolution.
The apparent experiment only seems to be affecting a handful of YouTube users for now, but it suggests owner Google is toying with the idea of implementing a site-wide paywall for access to high-quality video in the future.
So, after testing up to 12 ads on YouTube for non-Premium users, now some users reported that they also have to get a Premium account just to watch videos in 4K. pic.twitter.com/jJodoAxeDp – October 1, 2022
It’s no secret that Google has been searching for new ways to monetize its YouTube platform in recent months. In September, the company introduced five unskippable ads for some YouTube users as part of a separate test – an unexpected development that, naturally, didn’t go down well with much of the YouTube community.
A resolution paywall seems a more palatable approach from Google. While annoying, the change isn’t likely to provoke the same level of ire from non-paying YouTube users as excessive ads, given that many smartphones still max out at QHD resolution anyway.
Of course, if it encourages those who do care about high-resolution viewing to invest in the platform’s Premium subscription package, it may also be more lucrative for Google. After all, YouTube Premium, which offers ad-free viewing, background playback and the ability to download videos for offline use, currently costs $11.99 / £11.99 / AU$14.99 per month.
Suffice to say, the subscription service hasn’t taken off in quite the way Google would’ve hoped since its launch in 2014. Only around 50 million users are currently signed up to YouTube Premium, while something close to 2 billion people actively use YouTube on a monthly basis.
Might the addition of 4K video into Premium’s perk package bump up that number? Only time will tell. We’ll be keeping an eye on our own YouTube account to see whether this resolution paywall becomes permanent in the coming months.
Axel is a London-based staff writer at TechRadar, reporting on everything from the newest movies to latest Apple developments as part of the site’s daily news output. Having previously written for publications including Esquire and FourFourTwo, Axel is well-versed in the applications of technology beyond the desktop, and his coverage extends from general reporting and analysis to in-depth interviews and opinion.
Axel studied for a degree in English Literature at the University of Warwick before joining TechRadar in 2020, where he then earned a gold standard NCTJ qualification as part of the company’s inaugural digital training scheme.
USB-C has come a long way since its debut in 2014, now becoming the standard for charging and basic data transfer (on everything except the iPhone, of course!) as well as audio and video for more and more devices. The European Parliament, long enamored with the idea of a consumer- and environmentally-friendly standard for charging devices, is pushing it forward even further. A newly-passed law says that almost all portable electronics will need to charge via USB-C by 2026.
At this point, most new laptops already use USB-C charging, taking advantage of the standard’s flexibility to deliver a range of wattages up to 100 watts. There are two exceptions: the top of the market and the bottom. Cheap budget laptops are still sometimes equipped with less expensive, semi-proprietary barrel charging cables or something like Lenovo’s rectangular charger.
On the other hand, power-hungry laptops that need more than 100 watts still use proprietary connections for their massive adapters. The USB Implementers Forum is working on expanding that limit and some of these laptops can still charge slowly over USB-C. These are the only laptops that Europe will allow to be sold with proprietary chargers after the spring of 2026. While nothing forces manufacturers to follow this new law worldwide, streamlined manufacturing and economy of scale will effectively force the rest of the world to follow in practice if not in legislation.
Parliament posted its reasoning online (spotted by Windows Central), saying that this move will encourage technological innovation and give consumers access to more interoperability with a bonus that more easily-reusable cables and chargers means less electronic waste. The post estimates that it will help consumers save up to 250 million euro a year on new charger purchases.
The bigger news is that this move is likely to finally force Apple to abandon the Lightning connector for the iPhone, cheaper iPads, and a few lingering accessories. (Apple already uses USB-C charging on most iPads and all MacBooks.) The switch for smaller mobile devices will happen by the end of 2024. This includes “all new mobile phones, tablets, digital cameras, headphones and headsets, handheld videogame consoles and portable speakers, e-readers, keyboards, mice, portable navigation systems, earbuds and laptops that are rechargeable via a wired cable.” (Note: This technically creates a loophole for any device that recharges wirelessly only.) That should give laptop manufacturers plenty of time to flush the remaining old-fashioned chargers out of their assembly lines.
Michael is a former graphic designer who’s been building and tweaking desktop computers for longer than he cares to admit. His interests include folk music, football, science fiction, and salsa verde, in no particular order.