Black Box Explains...Power over Ethernet (PoE).

What is PoE?
The seemingly universal network connection, twisted-pair Ethernet cable, has another role to play: providing electrical power to low-wattage electrical devices. Power over Ethernet (PoE) was ratified by the Institute of Electrical and Electronics Engineers (IEEE) in June 2003 as the 802.3af standard. It defines the specifications for low-level power delivery—roughly 13 watts at 48 VDC—over twisted-pair Ethernet cable to PoE-enabled devices such as IP telephones, wireless access points, Web cameras, and audio speakers.

Recently, the basic 802.3af standard was joined by the IEEE 802.3at PoE standard (also called PoE+ or PoE plus), ratified on September 11, 2009, which supplies up to 25 watts to larger, more power-hungry devices. 802.3at is backwards compatible with 802.3af.
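For the curious, the roughly 13- and 25-watt figures fall out of simple arithmetic: the PSE's output budget minus worst-case resistive loss in the cable. The short Python sketch below uses the commonly cited figures (15.4 W at 350 mA for 802.3af, 30 W at 600 mA for 802.3at, and worst-case loop resistances of 20 and 12.5 ohms respectively); it is an illustration, not text from the standards.

    # A rough sketch (not text from the standards) of where the ~13 W and
    # ~25 W figures come from: the PSE's budget minus worst-case I^2*R loss
    # in the cable. Loop resistances are the commonly cited worst-case
    # values: 20 ohms for 802.3af (Cat3-class cable), 12.5 ohms for
    # 802.3at (CAT5e or better).

    def pd_power_budget(pse_watts: float, amps: float, loop_ohms: float) -> float:
        """Watts left for the powered device after resistive cable loss."""
        return pse_watts - amps ** 2 * loop_ohms

    print(pd_power_budget(15.4, 0.350, 20.0))   # 802.3af -> 12.95 W ("roughly 13 W")
    print(pd_power_budget(30.0, 0.600, 12.5))   # 802.3at -> 25.5 W ("up to 25 W")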

How does PoE work?
The way it works is simple. Ethernet cable that meets CAT5 (or better) standards consists of four twisted pairs of cable, and PoE sends power over these pairs to PoE-enabled devices. In one method, two wire pairs are used to transmit data, and the remaining two pairs are used for power. In the other method, power and data are sent over the same pair.

When the same pair carries both power and data, the two don’t interfere with each other because they occupy opposite ends of the frequency spectrum: PoE power is delivered as direct current (essentially 0 Hz), while Ethernet data signals occupy frequencies from roughly 10 million to 100 million Hz and beyond. In this method, power is applied to the center taps of the Ethernet isolation transformers (so-called phantom power), so the DC offset is invisible to the differential data signal.

Basic structure.
There are two types of devices involved in PoE configurations: Power Sourcing Equipment (PSE) and Powered Devices (PD).

PSEs, which include end-span and mid-span devices, provide power to PDs over the Ethernet cable. An end-span device is often a PoE-enabled network switch that’s designed to supply power directly to the cable from each port. The setup would look something like this:

End-span device → Ethernet with power

A mid-span device is inserted between a non-PoE device and the network, and it supplies power from that juncture. Here is a rough schematic of that setup:

Non-PoE switch → Ethernet without PoE → Mid-span device → Ethernet with power

Power injectors, a third type of PSE, supply power to a specific point on the network while the other network segments remain without power.

PDs are pieces of equipment like surveillance cameras, sensors, wireless access points, and any other devices that operate on PoE.
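Before a PSE applies full power to a port, it first checks that a compliant PD is attached by probing for a signature resistance, then measures a classification current to learn the device's power needs. The Python sketch below is a simplified illustration of that handshake; the thresholds are approximate paraphrases of the 802.3af/at figures, not quotations from the standards.

    # Simplified, illustrative sketch of the handshake a PSE performs before
    # applying full power (thresholds are approximate paraphrases of the
    # 802.3af/at figures, not quotations from the standards).

    def detect_pd(signature_ohms: float) -> bool:
        """Probe at low voltage and look for the ~25 kOhm PD signature."""
        return 19_000 <= signature_ohms <= 26_500

    def classify_pd(class_current_ma: float) -> int:
        """Map the PD's classification current to a power class (0-4)."""
        ranges = [(0, 4, 0), (9, 12, 1), (17, 20, 2), (26, 30, 3), (36, 44, 4)]
        for low, high, cls in ranges:
            if low <= class_current_ma <= high:
                return cls
        return 0  # currents outside the defined bands default to class 0 here

    if detect_pd(25_000):                    # a compliant PD presents ~25 kOhm
        print("power class:", classify_pd(28))   # 28 mA falls in class 3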

PoE applications and benefits.
• Use one set of twisted-pair wires to carry both data and power for low-wattage devices.
• In addition to the applications noted above, PoE also works well for video surveillance, building management, retail video kiosks, smart signs, vending machines, and retail point-of-information systems.
• Save money by eliminating the need to run electrical wiring.
• Easily move an appliance with minimal disruption.
• If your LAN is protected from power failure by a UPS, the PoE devices connected to your LAN are also protected from power failure.


Black Box Explains…Liquid cooling.

The trend toward high-density installations with higher-powered CPUs has made heat a critical issue in data centers. Blade servers present a special challenge—a rack of blade servers can dissipate more than 25 kW, generating more heat than an electric oven.

Heat-generated problems
The heat generated in today’s high-density data centers can shorten equipment lifespan, negatively affect equipment performance, and cause downtime. Traditional air-cooling methods such as hot/cold aisle arrangements simply can’t keep up with these heat-generating installations. Data center managers often try to compensate for the inefficiency of air cooling by under-populating racks, but this wastes space—an often scarce commodity in modern data centers.

Why liquid
Because of the inherent inefficiencies of air cooling, many data centers have turned to liquid cooling with water or other coolants. Liquids have far greater heat-transfer capacity than air—by volume, water can carry roughly 3,400 times more heat—so they can cool far greater equipment densities.
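A back-of-the-envelope comparison makes the point concrete. The sketch below uses textbook heat-capacity figures (water at about 4186 J/(kg·K); air at about 1005 J/(kg·K) and 1.2 kg/m³) to estimate how much water versus air it takes to carry away the 25-kW blade-rack load mentioned above.

    # Back-of-the-envelope comparison (standard physics, not vendor data):
    # how much water vs. air it takes to carry away 25 kW at a 10 degree C
    # coolant temperature rise, using Q = m_dot * c_p * delta_T.

    HEAT_LOAD_W = 25_000          # one fully loaded blade rack
    DELTA_T_K = 10.0              # allowed coolant temperature rise

    # Water: c_p ~= 4186 J/(kg*K), and 1 kg of water ~= 1 L
    water_kg_per_s = HEAT_LOAD_W / (4186 * DELTA_T_K)
    print(f"water: {water_kg_per_s * 60:.0f} L/min")   # ~36 L/min

    # Air: c_p ~= 1005 J/(kg*K), density ~= 1.2 kg/m^3
    air_m3_per_s = HEAT_LOAD_W / (1005 * 1.2 * DELTA_T_K)
    print(f"air:   {air_m3_per_s * 2119:.0f} CFM")     # ~4400 CFM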

Liquid cooling is usually done at the rack level: airflow from the servers carries heat to a cooling unit, where liquid removes it, neutralizing heat at the source before it enters the room. Liquid cooling can also be done at the component level, with cooling liquid delivered directly to individual components, and portable liquid-cooling units are available for treating hot spots.

Liquid cooling options
Types of liquid cooling commonly used in data centers include:

  • Cabinet-door liquid cooling: With this method, cooling units are special cabinet doors that contain sealed tubes filled with chilled liquid. The liquid is circulated through the door to remove heat vented by equipment fans. Because liquid-cooled doors can replace standard cabinet doors, they’re the favored method for retrofitting liquid cooling into existing data centers.
  • Integrated liquid cooling: This consists of a specialized sealed cabinet that has channels for liquid cooling built into it to act as heat exchangers. Fans move hot air past the heat exchangers before sending the cooled air back to the servers. These cabinets are closed systems that release very little heat into the room.
  • Component-based liquid cooling: Some servers are preconfigured with integrated liquid-based cooling modules. After the servers are installed, liquid is circulated through the cooling modules.
  • Immersion cooling: This rather counterintuitive cooling method immerses servers in a non-conductive liquid, which is circulated to cool the servers.
  • Portable liquid cooling: These are small units that operate by blowing air across water-cooled coils. They can usually accept water from any source—including a nearby faucet. They’re generally plumbed with ordinary garden hoses and require no special skills to use. Portable cooling units are intended for emergency cooling rather than as a permanent solution.


Liquid cooling requires a shift in the way you think about cooling. Installation may require that you acquire a new skill set or hire a professional installer. However, the space savings and cost savings gained through liquid cooling more than make up for the inconvenience of installing a new cooling technology.

Not only does liquid cooling enable data centers to operate at far greater densities than conventional air cooling, it also does away with the infrastructure air cooling requires, letting you eliminate hot/cold aisles and raised floors. Liquid cooling can support 25% to 80% more equipment in the same footprint, resulting in significantly lower infrastructure costs.

Add to this the fact that cooling is often the majority of a data center’s operating cost, and it’s plain to see why an investment in the efficiency of liquid cooling goes right to the bottom line.


Black Box Explains...NEMA ratings for enclosures.

The National Electrical Manufacturers Association (NEMA) issues guidelines and ratings for an enclosure’s level of protection against contaminants that might come in contact with its enclosed equipment.

There are many numerical NEMA designations; we’ll discuss the NEMA enclosures relevant to our online catalog: NEMA 3, NEMA 3R, NEMA 4, NEMA 4X, and NEMA 12.

NEMA 3 enclosures, designed for both indoor and outdoor use, provide protection against falling dirt, windblown dust, rain, sleet, and snow, as well as ice formation.

The NEMA 3R rating is identical to NEMA 3 except that it doesn’t specify protection against windblown dust.

NEMA 4 and 4X enclosures, also designed for indoor and outdoor use, protect against windblown dust and rain, splashing and hose-directed water, and ice formation. NEMA 4X goes further than NEMA 4, specifying that the enclosure will also protect against corrosion caused by the elements.

NEMA 12 enclosures are constructed for indoor use only and are designed to provide protection against falling dirt, circulating dust, lint, fibers, and dripping or splashing noncorrosive liquids. Protection against oil and coolant seepage is also a prerequisite for NEMA 12 designation.


Black Box Explains...Cold aisle containment.

Cold aisle containment (CAC) is a cooling method that increases cooling efficiency and reduces energy costs in data centers.

This cooling method relies on the fact that most network equipment and servers are designed to cool themselves by drawing air in through the front and exhausting it out the rear. To implement cold aisle containment, rows of cabinets or racks are arranged facing each other to form aisles, and cool air is routed between the rows. Equipment takes the cool air in at the front of the cabinet and exhausts it out the back into the room.

To keep cool air from mixing with warm air, row ends are closed off with an air-flow barrier. This barrier can range from makeshift arrangements of plastic strips to doors made expressly for this purpose.

Because cold aisle containment concentrates cool air at the front of equipment where it’s most needed, it’s an exceptionally effective cooling method. Cold aisle containment significantly reduces energy costs, lowering power bills as well as reducing data centers’ carbon footprints.


Black Box Explains…How to keep cabinets cool.

Networking equipment—especially servers—generates a lot of heat in a relatively small area. Today’s servers are smaller and have faster CPUs than ever. Because most of the power used by these devices is dissipated into the air as heat, they can really strain the cooling capacity of your data center. The components housed in a medium-sized data center can easily generate enough heat to warm a house in the dead of winter!

So you must keep them cool: when network components run hot, they're prone to failure and shortened lifespans.

Damage caused by heat isn't always immediately evident as a catastrophic meltdown. Signs of heat damage include node crashes and hardware failures that occur over a period of weeks or even months, leading to chronic downtime.

Computer rooms generally have special equipment such as high-capacity air conditioning and raised-floor cooling systems to meet their high cooling requirements. However, it's also important to ensure that individual cabinets used for network equipment provide adequate ventilation. Even if your data center is cool, the inside of a cabinet may overheat if air distribution is inadequate. Just cranking up the air conditioning is not the solution.

The temperature inside a cabinet is affected by many variables, including door perforations, cabinet size, and the types of components housed within the cabinet.

The most direct way to cool network equipment is to ensure adequate airflow. The goal is to ensure that every server, every router, every switch has the necessary amount of air no matter how high or low it is in the cabinet.

It takes a certain volume of air to keep a device within its ideal temperature range. Equipment manufacturers provide very little guidance on how much, but there are some basic methods you can use to maximize ventilation within your cabinets.
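One widely used rule of thumb estimates required airflow as roughly 3.16 × watts of heat load ÷ allowable temperature rise in degrees F. The sketch below applies it; treat the result as a starting point, not a substitute for the manufacturer's specifications.

    # Minimal sketch of the common rule-of-thumb airflow estimate:
    # CFM ~= 3.16 * watts / delta-T (degrees F). Illustrative only.

    def required_cfm(heat_load_watts: float, delta_t_f: float = 20.0) -> float:
        """Approximate airflow (CFM) needed to hold a given temperature rise."""
        return 3.16 * heat_load_watts / delta_t_f

    # e.g. a cabinet dissipating 3 kW, allowing a 20 degree F rise:
    print(f"{required_cfm(3000):.0f} CFM")   # ~474 CFM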

Open it up.
Most major server manufacturers recommend that the front and back cabinet doors have at least 63% open area for airflow. You can achieve this either by removing cabinet doors altogether or by choosing cabinets with perforated doors.

Because most servers, as well as other network devices, are equipped with internal fans, open or perforated doors may be the only ventilation you need as long as your data center has enough air conditioning to dissipate the heat load.

You may also want to choose cabinets with side panels to keep the air within each cabinet from mixing with hot air from an adjacent cabinet.

Equipment placement.
Don't overload the cabinet by trying to fit in too many servers—75% to 80% of capacity is about right. Leave at least 1U of space between rows of servers for front-to-back ventilation. Maintain at least a 1.5" clearance between equipment and the front and back of the cabinet. And finally, ensure all unused rack space is closed off with blank panels to prevent recirculation of warm air.

Fans and fan placement.
You can increase ventilation even more by installing fans to actively circulate air through cabinets. The most common cabinet fans are top-mounted fan panels that pull air from the bottom of the cabinet or through the doors. For spot cooling, use a fan or fan panel that mounts inside the cabinet.

For very tightly-packed cabinets, choose an enclosure blower—a specialized high-speed fan that mounts in the bottom of the cabinet to pull a column of cool air from the floor across the front of your servers or other equipment. An enclosure blower requires a solid or partially vented front door with adequate space—usually at least 4 inches—between the front of your equipment and the cabinet door for air movement.

When using fans to cool a cabinet, keep in mind that cooling the outside of a component doesn't necessarily cool its inside. The idea is to make sure air circulates where your equipment's air intakes are. Also, beware of installing cabinet fans that work against, and overwhelm, the small fans in your equipment.

Temperature monitoring.
To ensure that your components are operating within their approved temperature range, it’s important to monitor conditions within your cabinets.

The most direct method is to put a thermometer in your cabinet and check it regularly. This simple and inexpensive method can work well for small installations, but it has drawbacks: a cabinet thermometer can't tell you the temperature inside individual components, it can't raise an alarm if the temperature goes out of range, and it must be checked manually.

Another simple and inexpensive addition to a cabinet is a thermostat that automatically turns on a fan when the cabinet's temperature exceeds a predetermined limit.

Many network devices come with SNMP or IP-addressable internal temperature sensors to tell you what the internal temperature of the component is. This is the preferred temperature monitoring method because these sensors are inside your components where the temperature really counts. Plus you can monitor them from your desktop—they’ll send you an alert if there’s a problem.
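As an illustration, here is how such a sensor might be polled with the open-source pysnmp library. The host address, community string, and OID below are placeholders; real temperature OIDs are vendor-specific, so look yours up in the device's MIB.

    # Polling a temperature sensor with pysnmp (4.x-style synchronous API).
    # Host, community string, and OID are placeholders, not real values.
    from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                              ContextData, ObjectType, ObjectIdentity, getCmd)

    error_indication, error_status, _, var_binds = next(getCmd(
        SnmpEngine(),
        CommunityData('public', mpModel=1),             # SNMPv2c, placeholder community
        UdpTransportTarget(('192.0.2.10', 161)),        # placeholder sensor address
        ContextData(),
        ObjectType(ObjectIdentity('1.3.6.1.4.1.0.0')),  # placeholder temperature OID
    ))

    if error_indication or error_status:
        print("poll failed:", error_indication or error_status.prettyPrint())
    else:
        for oid, value in var_binds:
            print(oid.prettyPrint(), "=", value.prettyPrint())  # alert if out of range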

There are also cabinet temperature sensors that can alert you over your network. These sensors are often built into another device, such as a PDU (power distribution unit), and monitor only cabinet temperature, not the temperature inside individual devices. However, they can be a valuable addition to your cooling plan, especially for older devices that don't have internal sensors.

The future of cabinet cooling.
Very high-density data centers filled with blade servers present an extreme cooling challenge, leading some IT managers to turn to liquid-cooled cabinets. These systems are still fairly new, and the prospect of liquid near electronics makes some IT managers nervous, but their high efficiency makes it likely that liquid-cooled cabinets will become more prevalent.

It’s easy, really.
Keeping your data and server cabinets cool doesn't have to be complicated. Just remember not to overcrowd the cabinets, be sure to provide adequate ventilation, and always monitor conditions within your cabinets.


Black Box Explains...Choosing a cabinet.

Understanding cabinet and rack measurements.
The main component of a cabinet is a set of vertical rails with mounting holes to which you attach your equipment or shelves. When you consider the width or height of a cabinet, clarify whether the dimensions are inside or outside.

The first measurement you need to know is the width of the rails. The most common size is 19 inches with hole-to-hole centers measuring 18.3 inches. There are also 23-inch and 24-inch cabinets and racks. Most rackmount equipment is made to fit 19-inch rails but can be adapted for wider rails.

After width, the most important specification is the number of rack units, abbreviated as “U”: a measure of the vertical space available for mounting equipment. Because rail width is standardized, the number of rack units determines how much equipment you can actually install. Remember, this is an internal measurement of usable space and is smaller than the external dimensions of the cabinet or rack.

One rack unit (1U) is 1.75 inches of usable space and is usually, but not always, measured vertically. So, for example, a rackmount device that’s 2U high takes up 3.5 inches of rack space. A rack that’s 20U high has 35 inches of usable space.
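The arithmetic is simple enough to script. This small sketch (illustrative only) converts rack units to usable inches and checks whether a list of devices will fit:

    # A tiny sketch of the "U" arithmetic described above (1U = 1.75 inches).

    RACK_UNIT_INCHES = 1.75

    def usable_inches(rack_units: int) -> float:
        return rack_units * RACK_UNIT_INCHES

    def fits(rack_u: int, device_heights_u: list[int]) -> bool:
        """Will devices with the given heights (in U) fit in a rack_u rack?"""
        return sum(device_heights_u) <= rack_u

    print(usable_inches(20))            # 35.0 inches of usable space in 20U
    print(fits(20, [2, 2, 4, 1, 10]))   # 19U of gear in 20U of space -> True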

Choosing the right cabinet.
Here’s a quick checklist of features to keep in mind before you choose a cabinet for servers or other network devices:
• High-volume airflow.
• Adjustable rails.
• Rails with M6 square holes.
• Moisture and dust resistance.
• Air filters.
• Front and/or rear accessibility.
• Locking doors.
• Left- or right-hinging doors.
• Power strips and cable organizers.
• Interior lighting.
• Preassembly.
• Availability of optional shelves, fans, and casters.
• Cable management rails, space, and knockouts.
• Extra depth to accommodate newer, deeper servers.

Don’t forget to accessorize.
Even if your cabinet is in a climate-controlled room, you may need to add a fan panel to help keep your equipment from overheating. It’s especially important to have ventilation in an enclosed cabinet.

Rackmount power strips mount either vertically or horizontally. Some have widely spaced outlets to accommodate transformer blocks. Some power strips include surge protection.

Mission-critical equipment should be connected to an uninterruptible power supply (UPS). A UPS keeps your equipment from crashing during a brief blackout or brownout and provides you with enough time to shut down everything properly in a more extended power outage.
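As a rough illustration, runtime can be estimated from battery energy and load. The sketch below uses simple energy arithmetic with an assumed 90% inverter efficiency; real discharge curves are nonlinear, so always check the manufacturer's runtime tables.

    # Rough, illustrative UPS runtime estimate. The 90% inverter efficiency
    # is an assumption; real runtime curves are nonlinear.

    def ups_runtime_minutes(battery_wh: float, load_watts: float,
                            inverter_efficiency: float = 0.9) -> float:
        return battery_wh * inverter_efficiency / load_watts * 60

    # e.g. a hypothetical 864 Wh battery bank feeding a 400 W load:
    print(f"~{ups_runtime_minutes(864, 400):.0f} minutes")   # ~117 minutes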

For accessories that make cabling easier, just take a look at our many cable management products. We have cable management guides, rackmount raceways, horizontal and vertical organizers, cable managers, cable hangers, and much more.
