Feature Stories | 10:00 AM
This story features NEXTDC LIMITED and other companies. For more info SHARE ANALYSIS: NXT
The company is included in ASX100, ASX200, ASX300, ALL-ORDS and ALL-TECH
FNArena received a bird’s eye view of what it takes to develop critical infrastructure to serve Australia’s growing demand for cloud computing and AI.
-Size, scale, design and location all matter for data centres
-S3 one of most technologically advanced data centres in Australia
-Accommodating the latest technology from Nvidia for AI compute
-Defining NextDC’s moat and future
By Danielle Ecuyer and Rudi Filapek-Vandyck
First impressions
Approaching NextDC’s ((NXT)) S3 data centre in the Sydney suburb of Artarmon, the building dominates the landscape; the first thing that strikes is the absolute scale; the second is just how smart the building looks.
While the design aesthetic might not be the initial thought that comes to mind when visiting a major data centre, it is one of the defining features of the building in front of us.
Nothing we encountered during our almost 60-minute site visit skipped a beat when it came to the all-encompassing service offering and experience for customers.
Explained simply, data centres are hotels for data storage and processing across a range of customers, from SMEs to critical telco and government services, as well as cloud service providers, the BigTech hyperscalers: Amazon, Microsoft and Google, and increasingly other players like Oracle and Meta.
Like all hotels, data centres come with different levels of specs, standards, and sizes. If S3 had been built to accommodate humans, it would probably equal a six-star hotel.
In data centre terminology, S3 equates to a Tier IV Uptime Institute facility with Gold Operational Sustainability certification, including 80MW of capacity across 20,000 m² of data halls and a rack capacity of 10,800, which is considerably larger than NextDC’s existing S1 at 16MW, S2 at 30MW, and S6 at 13.5MW.
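For a rough sense of what those headline numbers imply, the quoted capacity and rack count can be combined into an average power budget per rack. The sketch below is a back-of-envelope calculation using only the figures in this story; actual hall designs and customer densities vary.

```python
# Back-of-envelope power density at S3, using only the figures quoted above.
capacity_mw = 80        # quoted total capacity for S3
rack_capacity = 10_800  # quoted rack capacity

avg_kw_per_rack = capacity_mw * 1_000 / rack_capacity
print(f"Average power budget per rack: {avg_kw_per_rack:.1f} kW")
# ~7.4 kW per rack on average; dense AI racks draw far more, so the customer
# mix determines whether space or power is exhausted first.
```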
“Gold Operational Sustainability certified is best of breed in terms of its people, procedures, and operations being able to accommodate an industry-leading range of adverse situations that we have already anticipated and planned for, so we remain fully operational,” explained Simon Guzowski, Vice President Investor Relations.
Located inside Artarmon’s technology hub, S3’s position is as strategic as the design of the building.
Data centres are categorised as critical pieces of infrastructure due to the potential sensitivity and significance of the data housed. Tier IV translates as the “pinnacle of data centre construction worldwide”, with complete fault tolerance built into the site infrastructure, which means a failure in any individual piece of equipment or distribution pathway has no impact on the overall facility.
That also extends to the building’s location, which cannot be under flight paths or in the vicinity of potentially dangerous industrial assets and manufacturing sites representing fire risk. High industry standards take into account any potential extraneous impacts that could put the building in harm’s way.
Equally important, the data centre must be located to deliver the optimum level of latency between the digital infrastructure and the customer, ensuring seamless and secure access to cloud platforms, carriers, and digital services. Latency refers to the time taken for data to travel from a client (user or device) to the data centre and back.
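As a rough illustration of why proximity matters, round-trip time over fibre can be approximated from distance alone. The sketch below assumes light travels at roughly 200,000 km per second in optical fibre (about two-thirds of its speed in a vacuum) and ignores switching, routing and processing delays, which add to the totals in practice.

```python
# Rough fibre round-trip time as a function of distance.
# Assumes ~200,000 km/s propagation in optical fibre and no equipment delays.
FIBRE_SPEED_KM_PER_MS = 200.0  # 200,000 km/s equals 200 km per millisecond

def round_trip_ms(distance_km: float) -> float:
    return 2 * distance_km / FIBRE_SPEED_KM_PER_MS

for km in (10, 100, 1_000):
    print(f"{km:>5} km away -> ~{round_trip_ms(km):.2f} ms round trip")
# 10 km: ~0.10 ms, 100 km: ~1.00 ms, 1,000 km: ~10.00 ms
```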
This is why, for example, Northern Virginia in the USA has become a critical internet and cloud services hub, given the proximity to the Pentagon and other US government buildings.
Everything at and inside S3 is “mission critical”, encompassing the data halls and 1,500 m² of mission-critical space, all backed up by UPS (Uninterruptible Power Supply) and generator power in the event of an electricity grid failure. The 1,500 m² includes critical office space that can be used by customer teams 24/7 or, alternatively, as a disaster recovery site.
The generator power is supplied by a Rolls-Royce MTU 20V4000 DS4000, a high-capacity diesel generator set designed for mission-critical applications, including hospitals and industrial facilities. This true monster of a machine (see first image below) delivers up to 4,000 kVA (3.2 MW) of standby power and is engineered for exceptional reliability and efficiency.
All of the above combined allows NextDC to host sensitive applications such as government systems, banking infrastructure, and financial trading platforms. The adjacent S6 facility is Australia’s first data centre designed exclusively for AI factories and sovereign AI.
Pricing for each customer is unique depending on requirements but, put simply, the customer pays for space and power. As the use of power or space rises, so does the price.
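To make the space-plus-power idea concrete, here is a purely hypothetical sketch. The rack rate and power tariff below are invented for illustration only; actual NextDC pricing is negotiated per customer and is not public.

```python
# Hypothetical illustration of the "space plus power" pricing model.
# All rates below are assumptions, not NextDC figures.
def monthly_colocation_cost(racks: int, kw_drawn: float,
                            rack_rate_aud: float = 1_000.0,        # assumed per rack, per month
                            power_rate_aud_per_kwh: float = 0.30): # assumed tariff
    hours_per_month = 730
    space_cost = racks * rack_rate_aud
    power_cost = kw_drawn * hours_per_month * power_rate_aud_per_kwh
    return space_cost + power_cost

# e.g. four racks drawing 20 kW continuously
print(f"${monthly_colocation_cost(4, 20):,.0f} per month")
```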
Mission Impossible style high security
Security is very tight on entering S3. A photo ID is taken along with a digital fingerprint, referred to as two-factor biometric fingerprint security.
Access past reception is only permitted using a special card accompanied by digital fingerprint recognition via a dual-door process. The card swipe and digital ID open door number one, whereupon one enters an enclosed structure. Once the first door closes, the next door opens to allow entry into the data centre.
Known as a “Bullet-Resistant Biometric Mantrap Portal with Interlocking Doors and Digital ID Authentication”, these structures come with a suite of high security measures, including bulletproof/blast-resistant construction, to prevent unapproved personnel from entering.
Inside the inner sanctum
Having read extensively about data centre racks, being able to see and hear the infrastructure offers a level of understanding that words fail to convey.
Racks of servers and storage units, each typically 19 inches wide and accommodating 42 rack units (RU), show the Dell brand on prominent display. The racks are grouped into columns, which are leased by customers (who largely prefer to remain anonymous).
NextDC provides the containers inside the data halls, with customers filling the containers with IT hardware, such as compute, storage, and security functions.
Specific cooling systems are employed to manage heat transfer and room temperature, balancing energy consumption against efficiency.
NextDC operates the co-location halls at 23 degrees Celsius. For context, co-location refers to the practice of housing privately owned servers and networking equipment in a third-party data centre facility, rather than within a company’s own on-premises infrastructure.
NextDC is in the middle of a program called ‘Project Rise’, which will increase the temperature inside data halls to 25 degrees Celsius. A higher temperature allows cooling systems to run at a more efficient and environmentally friendly setting. As IT equipment becomes more heat tolerant, NextDC aims to capture the improved energy (power usage) efficiency.
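That efficiency gain is usually discussed in terms of Power Usage Effectiveness (PUE): total facility power divided by the power delivered to the IT equipment. The numbers below are hypothetical and simply illustrate the direction of the effect (a warmer hall needs less cooling energy); they are not NextDC figures.

```python
# Power Usage Effectiveness (PUE) = total facility power / IT power.
# Cooling loads below are assumed, purely to show why a warmer hall lowers the ratio;
# a PUE of 1.0 would mean every watt goes to IT equipment.
def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    return (it_kw + cooling_kw + other_kw) / it_kw

print(f"Hall at 23C: PUE ~ {pue(1_000, 350, 80):.2f}")  # assumed cooling load
print(f"Hall at 25C: PUE ~ {pue(1_000, 280, 80):.2f}")  # assumed lower cooling load
```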
Cooling is provided by Vertiv indirect free-cooling units, a name some investors may have come across if investing in the US, paired with modular cooling tower infrastructure.
NextDC has also gone a step further and installed in-house, isolated metal security areas, featuring special access and a higher security spec, to service more sensitive customers, such as government agencies or cloud service providers.
The IT equipment is incredibly dense and heavy, meaning floors and walls are both reinforced to accommodate the weight, and to ensure Tier IV status.
Digging a little deeper, we established the density and weight of IT systems keep increasing. Nvidia is shipping rack-scale GB200 NVL72 systems. The current flagship system weighs more than 1.3 metric tonnes per rack.
For context, the Nvidia GB200 NVL72 system integrates the latest Blackwell architecture, combining 36 Grace CPUs and 72 Blackwell GPUs within a single rack. It is designed for large-scale AI and high-performance computing workloads. One can only imagine the strength and load-bearing capacity of the construction when racks upon racks of these systems are installed.
Although NextDC did not quantify the load-bearing capacity of each floor, we were informed every level of S3 could be filled with 1.5 metres of water and the building would remain structurally sound.
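Taking the two load figures at face value, the comparison below is purely illustrative. The rack footprint is our own assumption (a typical 600mm x 1,200mm cabinet), and real floor loading depends on how weight is spread and how racks are spaced across a hall.

```python
# Comparing the two load figures mentioned above, as illustration only.
water_depth_m = 1.5
water_load_kg_per_m2 = water_depth_m * 1_000   # 1 m of water ~ 1,000 kg per m2

rack_mass_kg = 1_300                           # quoted GB200 NVL72 rack weight
rack_footprint_m2 = 0.6 * 1.2                  # assumed cabinet footprint

rack_load_kg_per_m2 = rack_mass_kg / rack_footprint_m2
print(f"1.5 m of water: {water_load_kg_per_m2:,.0f} kg/m2")
print(f"Loaded NVL72 rack over its footprint: {rack_load_kg_per_m2:,.0f} kg/m2")
# Racks are spaced out with aisles, so the average floor load across a hall
# is much lower than the point load under a single cabinet.
```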
To accommodate clients’ IT equipment, NextDC operates two 3.5-tonne lifts that take customer deliveries from the secure underground loading docks, over purpose-built pathways, to the data hall for quick installation and activation.
While the size, heat, and scale of the equipment are one factor, most notable was the noise. Such are the decibels generated in some areas that special soundproofing headphones must be worn.
Some customers like to run their data halls at higher temperatures, such as 28 degrees Celsius, which reduces the amount of power consumed by the cooling systems, but it does require higher levels of air circulation inside the data hall; hence the noise from the fans operating continuously.
The Extra Touch
Before the tour finishes, a few extra touches for customers are unveiled. S3 houses a suite of services including bookable rooms, dedicated offices, an auditorium, and chill-out and break-out suites with kitchen facilities, coffee, TV, lounge, massage chairs, and Foxtel entertainment, with a pinball machine, no less.
Snacks and beverages are dispensed from a vending machine, and so are pre-packaged cables in different specs, just in case a customer runs out during installation. The Bunnings around the corner is not open 24/7 for data centre customer fitouts.
The point of all these details is to illustrate the extent of the customer experience and the scale, magnitude, and investment required for a structure like S3.
Interiors are curated in NextDC’s signature red and black. The colour red was inspired by founder Bevan Slattery’s fondness for The Hunt for Red October. Slattery first co-founded Pipe Networks. After that business was sold to TPG Telecom in 2010, Slattery used the proceeds to start up NextDC, which listed on the ASX in December 2010.
Slattery departed from the company’s board in 2013 to focus on two of his other investments: Megaport ((MP1)) and Superloop ((SLC)). As suggested by NextDC management, the colour and design really stand out in an industry famous for “tin sheds”.
The building is a perfect example of extreme risk management and quality service offering, ensuring safety, power, and connectivity around the clock with no external inputs, if required.
When the lights go out in a hospital and backup generators kick in, NextDC will still be facilitating communication services, hence internet connectivity and smartphones will keep running uninterrupted.
The company’s construction pipeline currently includes S4, with a 300MW IT load, and S7, with a 550MW IT load, both to be located west of Sydney at Horsley Park and Eastern Creek.
Western Sydney is now the industry’s primary expansion zone, with NextDC, DigiCo Infrastructure REIT ((DGT)), Macquarie Technology ((MAQ)) and unlisted AirTrunk all adding capacity in the region.
Industry Context
To put S3 and further expansion plans by NextDC in context, we line up all major ASX-listed operators with current assets and future plans (in bold).
In addition, DXN Ltd ((DXN)) is a specialist in modular, edge data centres, focusing on small to mid-scale deployments in industrial, remote, and government settings. It operates Sydney colocation sites and modular pods for mining sites, Hobart, and government customers, with a flexible, rapid-build model.
Global Data Centre Group’s ((GDC)) investment portfolio includes a Perth facility, a passive equity stake in AirTrunk, and European and Asian investments via Etix. It is moving towards asset realisation and returning funds to investors.
Our take-aways
Data centres are becoming larger and require ever more technological input and know-how, but this doesn’t make them a bread-and-butter tech enterprise.
Over in the US, Equinix Inc, the world’s largest owner-operator of data centres and interconnections, is officially labelled a REIT and thus, according to GICS sector denominations, part of the North American real estate sector.
NextDC is included in the local All-Tech Index, alongside ‘peers’ such as Life360, Catapult Group, Pro Medicus and Seek, but maybe a more appropriate comparison would be Transurban, Sydney Airport, the NBN, or APA Group, without the regular dividend payments (as the company is still very much in expansion phase).
Two other regularly voiced drawbacks about investing in NextDC, and the industry at large, are the high capital intensity as well as the lack of any moat around the core business.
The capital requirements are undeniable, but they can also be seen as a natural moat for existing operators, in combination with all the other requirements, such as finding a suitable location and access to power. Add elongated development lead times and increasing complexity, and maybe investors need to update their concept of what makes a ‘moat’ in the modern-day context?
NextDC has its own dedicated in-house team responsible for the design of all new data centres.
Another factor that stood out in conversations during our visit is that once clients have established themselves inside a data centre, a significant hurdle exists to moving elsewhere, quite apart from the fact that contracts are typically of longer-term duration. A typical co-location agreement runs for five years. Hyperscale cloud deals typically span 10-20 years.
High switching barriers mean the industry enjoys “sticky” customers and annual recurring revenues. In some cases data centres develop a whole ecosystem with network effects, further adding to the stickiness of customers. As is all too apparent at S3, NextDC is adding reputation, quality, and customer service to further its moat.
Ultimately, the worst case scenario is one that sees more data centres built than required to meet demand, but just about every expert forecast available to date suggests Australia’s two major hubs, Sydney and Melbourne, look more at risk of under-supply than over-supply for at least the next five years.
Contributing to such forecasts are practical bottlenecks and long development lead times, while demand is still growing exponentially (it’s called a Megatrend for good reason).
This implies data centres with secure power in Tier-1 locations should continue to enjoy pricing resilience.
The authors are shareholders in NextDC. Danielle through her own SMSF, Rudi via the FNArena-Vested Equities All-Weather Model Portfolio. Images supplied by NextDC. ChatGPT assisted with collating data centres information.
****
For more reading on GenAI and data centres check out FNArena’s dedicated GenAI section
https://fnarena.com/index.php/tag/gen-ai/
and
https://fnarena.com/index.php/2025/05/22/ai-investments-fuel-australias-data-centre-future/
Technical limitations
If you are reading this story through a third party distribution channel and you cannot see images included, we apologise, but technical limitations are to blame.
Find out why FNArena subscribers like the service so much: “Your Feedback (Thank You)” – Warning this story contains unashamedly positive feedback on the service provided.
FNArena is proud about its track record and past achievements: Ten Years On
Click to view our Glossary of Financial Terms