Black Box Explains...What to consider when choosing a rack.
There are several things you should consider when choosing a rack.
What kind of equipment will you be putting in it? If you need frequent access to all sides of the equipment, an open rack is more convenient than a cabinet. If your equipment needs ventilation, a rack poses no air circulation limitations. And don’t neglect aesthetics. Will customers or clients see your installation? A rack with cable management looks much neater.
Finally, consider security. Because a rack is open, you need to take steps to secure your equipment. Set up your rack in a locked room so prying fingers can’t access your network equipment.
Racks come in various sizes and installation styles. Some are freestanding; some are designed to be wall-mounted. Others combine both styles, sitting on the floor but attaching to the wall for extra stability.
Understanding rack measurements.
The main component of a rack is a set of vertical rails with mounting holes to which you attach your equipment or shelves.
The first measurement you need to know is the width between the two rails. The most common size is 19"; 23" rails and racks are also available. (The nominal 19" actually refers to the width of the equipment’s front panel; the mounting holes themselves are spaced slightly closer together.) Most rackmount equipment is designed to fit 19" rails but can be adapted for wider racks.
The next important specification is the number of rack units, abbreviated “U.” This is a measurement of the vertical space available on the rails. Cabinets, racks, and rackmount equipment are all measured in rack units. One rack unit (1U) equals 1.75" of usable vertical space. So, for example, a device that’s 2U high takes up 3.5" of rack space, and a rack that’s 20U high has 35" of usable space.
Because the widths are standard, the amount of vertical space is what determines how much equipment you can actually install. Remember this measurement of usable vertical space is smaller than the external height of the rack.
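Rack-unit arithmetic is simple enough to script. The sketch below (the function names are ours, not an industry convention) converts units to inches and checks whether a set of devices fits in a rack:

```python
RACK_UNIT_INCHES = 1.75  # 1U = 1.75" of usable vertical space

def units_to_inches(u):
    """Convert rack units to inches of vertical rail space."""
    return u * RACK_UNIT_INCHES

def fits(rack_u, *equipment_u):
    """True if the listed equipment heights (in U) fit the rack's usable space."""
    return sum(equipment_u) <= rack_u

print(units_to_inches(2))    # 3.5  -- a 2U device takes up 3.5"
print(units_to_inches(20))   # 35.0 -- a 20U rack has 35" of usable space
print(fits(20, 4, 4, 10))    # True -- 18U of gear in a 20U rack
```

Remember that the usable space is what matters: compare equipment heights against the rack’s U rating, not its external height.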
Getting power to your equipment.
Unless you want to have a tangle of extension cords, you’ll need to get one or more power strips for your rack. Consider which kind would be best for your installation. Rackmount power strips come in versions that mount either vertically or horizontally. Some have outlets that are spaced widely to accommodate transformer blocks—a useful feature if most of your equipment uses bulky power transformers.
Surge protection is another important issue. Some power strips have built-in surge protection; some don’t. With the money you have invested in rackmount equipment, you’ll certainly want to make sure it’s protected.
Any mission-critical equipment should also be connected to an uninterruptible power supply (UPS). A UPS prevents your equipment from crashing during a brief blackout or brownout and allows enough time to shut everything down properly in the event of an extended power outage. Choose a rackmount UPS for the most critical equipment or plug the whole rack into a standalone UPS.
Managing your cables.
Your equipment may look very tidy when it’s all mounted. But unless you’re very careful with your cables, you can create a tangle you’ll never be able to unravel.
Plotting your connections in advance helps you to decide the most efficient way to organize the cables. Knowing where the connections are tells you whether it’s better to run cables horizontally or vertically. Most network problems are in the cabling, so if you let your cables get away from you now, you’re sure to pay for it down the road.
There are many cable management accessories that can help keep your racks organized.
Black Box Explains…How to keep cabinets cool.
Networking equipment—especially servers—generates a lot of heat in a relatively small area. Today’s servers are smaller and have faster CPUs than ever. Because most of the power used by these devices is dissipated into the air as heat, they can really strain the cooling capacity of your data center. The components housed in a medium-sized data center can easily generate enough heat to warm a house in the dead of winter!
So cooling is a must: when network components run hot, they’re prone to failure and shortened lifespans.
Damage caused by heat is not always immediately evident as a catastrophic meltdown—signs of heat damage include node crashes and hardware failures that can happen over a period of weeks or even months, leading to chronic downtime.
Computer rooms generally have special equipment such as high-capacity air conditioning and raised-floor cooling systems to meet their high cooling requirements. However, it's also important to ensure that individual cabinets used for network equipment provide adequate ventilation. Even if your data center is cool, the inside of a cabinet may overheat if air distribution is inadequate. Just cranking up the air conditioning is not the solution.
The temperature inside a cabinet is affected by many variables, including door perforations, cabinet size, and the types of components housed within the cabinet.
The most direct way to cool network equipment is to ensure adequate airflow. The goal is to ensure that every server, every router, every switch has the necessary amount of air no matter how high or low it is in the cabinet.
It takes a certain volume of air to cool a device to within its ideal temperature range. Equipment manufacturers provide very little guidance about how much airflow that is; however, there are some very basic methods you can use to maximize the ventilation within your cabinets.
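In the absence of manufacturer guidance, a common rule of thumb relates heat load to airflow: CFM ≈ 3.16 × watts ÷ ΔT(°F), for air near sea-level density. A quick sketch of that estimate (the 3.16 factor is the standard approximation, but treat the result as a ballpark, not a specification):

```python
def required_cfm(watts, delta_t_f):
    """Rule-of-thumb airflow (in CFM) needed to carry away a heat load.

    Uses the common approximation CFM = 3.16 * watts / delta_T(degF),
    which assumes air at roughly sea-level density.
    """
    return 3.16 * watts / delta_t_f

# A 400 W server allowed a 20 degF temperature rise across its chassis:
print(round(required_cfm(400, 20), 1))  # 63.2
```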
Open it up.
Most major server manufacturers recommend that the front and back cabinet doors have at least 63% open area for airflow. You can achieve this by either removing cabinet doors altogether or by buying cabinets that have perforated doors.
Because most servers, as well as other network devices, are equipped with internal fans, open or perforated doors may be the only ventilation you need as long as your data center has enough air conditioning to dissipate the heat load.
You may also want to choose cabinets with side panels to keep the air within each cabinet from mixing with hot air from an adjacent cabinet.
Don't overload the cabinet by trying to fit in too many servers—75% to 80% of capacity is about right. Leave at least 1U of space between rows of servers for front-to-back ventilation. Maintain at least a 1.5" clearance between equipment and the front and back of the cabinet. And finally, ensure all unused rack space is closed off with blank panels to prevent recirculation of warm air.
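The loading guidelines above are easy to check programmatically. This is a sketch, with the 80% fill ceiling taken from the rule of thumb above (the function names are ours):

```python
def cabinet_load_ok(rack_u, used_u, max_fill=0.80):
    """Flag overloading per the ~75-80% fill guideline."""
    return used_u <= rack_u * max_fill

def blank_panel_u_needed(rack_u, used_u):
    """Unused vertical space (in U) that should be closed off with
    blank panels to prevent recirculation of warm air."""
    return rack_u - used_u

print(cabinet_load_ok(42, 32))        # True  (about 76% full)
print(cabinet_load_ok(42, 38))        # False (about 90% full)
print(blank_panel_u_needed(42, 32))   # 10
```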
Fans and fan placement.
You can increase ventilation even more by installing fans to actively circulate air through cabinets. The most common cabinet fans are top-mounted fan panels that pull air from the bottom of the cabinet or through the doors. For spot cooling, use a fan or fan panel that mounts inside the cabinet.
For very tightly packed cabinets, choose an enclosure blower—a specialized high-speed fan that mounts in the bottom of the cabinet to pull a column of cool air from the floor across the front of your servers or other equipment. An enclosure blower requires a solid or partially vented front door with adequate space—usually at least 4"—between the front of your equipment and the cabinet door for air movement.
When using fans to cool a cabinet, keep in mind that cooling the outside of a component doesn’t necessarily cool its inside. The idea is to make sure air circulates where your equipment’s air intakes are. Also, beware of installing cabinet fans that work against, and overwhelm, the small fans inside your equipment.
Monitoring cabinet temperature.
To ensure that your components are operating within their approved temperature range, it’s important to monitor conditions within your cabinets.
The most direct method to monitor cabinet temperature is to put a thermometer into your cabinet and check it regularly. This simple and inexpensive method can work well for small installations, but it does have its drawbacks—a cabinet thermometer can’t tell you the temperature inside individual components, it can’t raise an alarm if the temperature goes out of range, and it must be checked manually.
Another simple and inexpensive addition to a cabinet is a thermostat that automatically turns on a fan when the cabinet's temperature exceeds a predetermined limit.
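Such a thermostat is simple on/off control with hysteresis: the fan switches on above the limit and stays on until the cabinet has cooled a few degrees below it, which avoids rapid on/off cycling right at the threshold. A sketch of the logic (the 85 °F limit and 5 °F band are example values, not a standard):

```python
def fan_should_run(temp_f, fan_on, limit_f=85.0, hysteresis_f=5.0):
    """Thermostat logic with hysteresis.

    Turn the fan on at or above limit_f; once running, keep it on
    until the temperature drops below (limit_f - hysteresis_f).
    """
    if temp_f >= limit_f:
        return True                          # over the limit: fan on
    if fan_on and temp_f > limit_f - hysteresis_f:
        return True                          # still cooling down: keep running
    return False                             # cool enough: fan off

print(fan_should_run(90.0, fan_on=False))  # True  (over the limit)
print(fan_should_run(83.0, fan_on=True))   # True  (within hysteresis band)
print(fan_should_run(79.0, fan_on=True))   # False (cooled below the band)
```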
Many network devices come with SNMP or IP-addressable internal temperature sensors to tell you what the internal temperature of the component is. This is the preferred temperature monitoring method because these sensors are inside your components where the temperature really counts. Plus you can monitor them from your desktop—they’ll send you an alert if there’s a problem.
There are also cabinet temperature sensors that can alert you over your network. These sensors are often built into another device, such as a PDU, but monitor only cabinet temperature, not the temperature inside individual devices. However, these sensors can be a valuable addition to your cooling plan, especially for older devices that don't have internal sensors.
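Whatever the sensor source, the alerting logic reduces to comparing each reading against a limit. This sketch assumes the readings have already been collected—the SNMP or HTTP query layer that would populate the dictionary is omitted, and the 95 °F default is only an example:

```python
def overtemp_alerts(readings, max_f=95.0):
    """Return the names of sensors reporting above the limit.

    `readings` maps a sensor name to its temperature in degF; in
    practice these values would come from SNMP or HTTP queries to
    each device or cabinet sensor (query layer not shown).
    """
    return [name for name, temp in sorted(readings.items()) if temp > max_f]

alerts = overtemp_alerts({"server-1": 88.0, "switch-2": 101.5, "server-3": 96.0})
print(alerts)  # ['server-3', 'switch-2']
```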
The future of cabinet cooling.
Very high-density data centers filled with blade servers present an extreme cooling challenge, leading some IT managers to turn to liquid-cooled cabinets. Liquid cooling is still fairly new, and the prospect of liquids near electronics makes many IT managers nervous, but its high efficiency makes it likely that liquid-cooled systems will become more prevalent.
It’s easy, really.
Keeping your data and server cabinets cool doesn't have to be complicated. Just remember not to overcrowd the cabinets, be sure to provide adequate ventilation, and always monitor conditions within your cabinets.