Thoughts on working inside a data center suite
I have a few more thoughts on the whole topic of colocation. First of all, Joel wrote in with a couple of tips beyond the basic "screwdriver and flashlight" that I mentioned. He says you should bring hearing protection, a step-stool or small ladder, and a jacket if you get cold. I like this thinking, and figured I'd expand on this for the benefit of those wondering what this all means.
First up, these places are LOUD. Everything you can imagine has fans on it. Obviously there are massive air handlers in the suites, but the (proper server class) computers and switches and everything else are also rocking a ton of fans. Some of them throttle back when the CPU load isn't too high, but a fair number of these things actually have rather high CPU load and so they never throttle back.
I mean, it's 2024, and people are writing CPU-bound computational stuff in languages that are interpreted, single-threaded, and slow as shit. OF COURSE they're running their CPUs as hard as they possibly can. But I digress.
So yes, it's loud as hell, and you could benefit from some kind of active protection. Just don't do what I did one time by cranking up the music in regular earbuds to cover the noise. Yeah, you might be able to hear your tunes, but you won't be hearing much else afterward. Stupid move, I know... now.
The step-stool or small ladder is not always a given. My particular cabinet isn't super tall and I'm able to reach all of it, but that might not be the case for everyone else. Alternatively, you might need to use it to support something from below while doing an install, especially if you're flying solo.
The jacket is another one of those things that you might not appreciate until you've been on the inside. It's not like the 90s when everyone just had a giant room and did their very best to cool the entire thing down as far as possible. These days, there are "hot aisles" and "cold aisles": two rows of machines face each other across a walkway, and cold air is blown into that walkway. Then the air goes through the machines thanks to all of those fans and ends up on the back side, where it joins with hot air from yet another row that's also backed up to it. Finally, it's drawn into a chiller to start the process over again.
If it's a solid concrete floor type of setup, then the air will have to come from above, but if you're on a raised floor, it'll probably just emerge from beneath. If you're in that kind of setup, make note of what you wear before making a trip lest you turn into Marilyn Monroe.
(Yeah, I just gave some advice that only applies to a subset of the people who will ever read this and need it. Yep. That happened.)
The nature of things is that you will inevitably need to access your rack from both sides to reach certain parts of the equipment, so it will be really hot sometimes, and it'll be really cold other times. This is why I will amend that advice to "bring a jacket that zips".
Now, if you're in the Bay Area like me, you have hopefully long ago internalized the wisdom that "you will never be far from a light sweater", and so you *already* have one of those that goes everywhere with you. If so, you're set. Be ready to adjust it as appropriate.
Other stuff? Once you're past the "drop a Raspberry Pi in there" stage, you should not just jump on ebay and buy the first rack-mounted server that looks like it'll work because you hate hardware and want it over with. This is because not all racks and not all servers are created equal, and you might find out the hard way that it won't work. I came super close to screwing up *hard* due to this.
I would recommend first going in with a tape measure to figure out exactly what you're working with. What are the posts like in the rack? Are they fixed in place or can they be adjusted? Are they round holes or square holes? How long can a server be while still fitting into the rack? Will the door(s) still be able to close and latch?
Bear in mind that it's not just the length of the machine: you also have to include the loops of cables which emerge from it, power and Ethernet at the very least. You really don't want to force a regular power or network cable into a "hard 90" type of situation to make it all fit into the space. With a little research, you can get power cords with a built-in 90 degree turn on the end that plugs into the machine, letting you claw back a bit of space, and likewise for the network stuff.
Or, you know, you can just buy a shorter machine and use normal cables.
Measure twice, buy once.
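If it helps, the whole fit question is just addition and one comparison. Here's a tiny sketch of that arithmetic; every number in it is an invented example (your rack and your server will differ), and the names are made up for illustration:

```python
# Hypothetical fit check. All measurements are example values, not real specs:
# measure your own rack and read your own server's spec sheet.

RACK_DEPTH_MM = 1070         # post-to-door depth you measured with the tape
SERVER_DEPTH_MM = 980        # chassis depth from the vendor's spec sheet
CABLE_LOOP_MM = 120          # rear clearance eaten by straight power/Ethernet loops
RIGHT_ANGLE_SAVINGS_MM = 60  # rough space clawed back by 90-degree cords

def fits(rack_mm, server_mm, cable_mm):
    """True if the chassis plus its rear cable loops still clears the door."""
    return server_mm + cable_mm <= rack_mm

# Straight cables: 980 + 120 = 1100 mm, deeper than the rack.
print(fits(RACK_DEPTH_MM, SERVER_DEPTH_MM, CABLE_LOOP_MM))  # False

# Right-angle cords: 980 + 60 = 1040 mm, and the door closes.
print(fits(RACK_DEPTH_MM, SERVER_DEPTH_MM,
           CABLE_LOOP_MM - RIGHT_ANGLE_SAVINGS_MM))  # True
```

Same machine, same rack; the only variable that changed was the cable clearance.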
Another complication: if your provider gives you power by way of a "zero U" PDU (basically a big power strip that stands up vertically), that's both a plus and a minus. It's a plus in that you're not burning any rack units by definition: it's "zero U". But, it can be a problem because it still takes up space, and if you have anything really long in there, it'll probably bump into it. This constrains you to only using spots in the rack which are not blocked by the sneaky little PDU. It's just another 3-D Tetris problem for you to solve.
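The Tetris part can also be written down. This sketch assumes (purely for illustration; the spans and depths are invented) that the PDU hangs alongside a particular range of rack units, and that anything shallower than the PDU's standoff clears it everywhere:

```python
# Hypothetical rack-layout check. The PDU span and clearance depth below are
# invented examples, not measurements from any real suite.

RACK_UNITS = 42
PDU_BLOCKED_US = set(range(10, 31))  # U positions the zero-U PDU hangs beside
PDU_CLEARANCE_MM = 900               # deepest chassis that still clears the PDU

def usable_positions(server_depth_mm):
    """Return the rack units where a server of this depth can actually go."""
    everywhere = set(range(1, RACK_UNITS + 1))
    if server_depth_mm <= PDU_CLEARANCE_MM:
        return everywhere                 # short box: the PDU never matters
    return everywhere - PDU_BLOCKED_US    # long box: only the unblocked spots

print(len(usable_positions(700)))  # shallow server: every U is fair game
print(len(usable_positions(980)))  # deep server: U10 through U30 are out
```

A shallow machine gets the whole rack; the long one is confined to the top and bottom, which is exactly the constraint the sneaky little PDU imposes.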