
Going Green with IT and applying a tier to the application and infrastructure

Moore's Law states that the number of transistors on a microprocessor doubles roughly every two years. The law has largely held since Gordon Moore, one of the founders of Intel, first published his paper on the subject in 1965. With the increase in chip capacity came a rise in speed and therefore processing power. We have powerful computers today because of the technology and innovation that drive chip design and production. It is sometimes said that when the first men landed on the moon there was less computing power on the spacecraft than there is in a modern mobile phone. So we've come a long way in 40 years or so.


For a stand-alone home or office computer or laptop, the amount of energy consumed is significant in itself. Now take an individual computer with all its associated processor technology, minus peripherals such as the screen and mouse, and multiply it by hundreds, maybe thousands, of similar devices in the same room. These rooms are commonly termed 'server farms', where rows upon rows of individual servers are stacked together to process information. This scenario presents processor designers at the front end, and building services engineers at the back end, with the same problem: how to dissipate heat and minimize power requirements. Server farms exist in the first place because our world is becoming more data-driven, and in a world of 24/7 data requirements the server farm is indeed a practical solution.


Grouped servers, or server farms, generate huge amounts of heat, and because the servers must be kept within their operating temperature limits, a huge amount of energy must be expended on ventilation and air conditioning. As the energy demand goes up, so too does the cost. And because more and more companies are using these server farms to process and warehouse data, demand for faster technology and for energy is rising in parallel. As the world becomes more speed- and data-driven, growing data requirements are driving demand for more server capacity and therefore larger and more complex storage locations.


An interesting read, but a few things come to mind around the green IT space. Firstly, we need to move applications to what I call the BIG THREE: web, Citrix/application streaming, and grid (DataSynapse, Platform, etc.). If an application doesn't run on one of these three media, the possibilities around it are going to limit what we can achieve (excluding the database, of course). By that I mean: if I have a proprietary application which cannot, for whatever reason, be upgraded to a web platform, be streamed as a Citrix or online Java-type application, or have its workload converted into a grid-type application, then we will need to maintain the server, the switch and the storage for its individual application server nodes.


What we need to do is:


Tier the application – how available does the application need to be, and to what performance levels? If it's a train service status in a developing economy that runs three times a day and is accessed by mobile phones, does that need the same level of service as the same application in New York accessed by thousands of iPhone users in rich media?
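To make the tiering idea concrete, here is a minimal sketch in Python. The tier names, availability targets, latency numbers and classification rules are all invented for illustration; a real scheme would come from the business's own SLAs.

```python
from dataclasses import dataclass

# Hypothetical service tiers - the availability and latency targets
# below are illustrative, not taken from any real SLA.
@dataclass(frozen=True)
class Tier:
    name: str
    availability_pct: float   # target uptime
    max_latency_ms: int       # acceptable response time

TIERS = {
    1: Tier("Tier 1 - brand critical", 99.99, 200),
    2: Tier("Tier 2 - business important", 99.5, 1000),
    3: Tier("Tier 3 - best effort", 98.0, 5000),
}

def classify(peak_users: int, revenue_impacting: bool) -> Tier:
    """Crude illustrative rule: heavy, revenue-impacting apps land in Tier 1."""
    if revenue_impacting and peak_users > 10_000:
        return TIERS[1]
    if revenue_impacting or peak_users > 1_000:
        return TIERS[2]
    return TIERS[3]

# The New York iPhone app vs. the three-times-a-day train status service:
print(classify(peak_users=50_000, revenue_impacting=True).name)   # Tier 1 - brand critical
print(classify(peak_users=200, revenue_impacting=False).name)     # Tier 3 - best effort
```

The point is not the thresholds themselves but that the decision is written down once and applied consistently, rather than every application defaulting to Tier 1.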

Tier the data center – do we need that many data centers all acting like Tier 1: super cool, super available, load-balanced? Can we not run each data center at the temperature appropriate to the tier of the applications it hosts? If it's Tier 2 or Tier 3 (by which I mean service-affecting if lost, but not brand-damaging or catastrophic), can we not run those data centers slightly hotter, on the basis that it might save millions of pounds on availability that isn't needed? By running Tier 3 at 30 degrees I might save a few million pounds a year in power and cooling with only a marginal effect on availability, and any extra support cost would be offset by the power saving. In this case, for example, I could power data center 7 (which is 9-5 only) down to low availability on minimal servers at the weekend, and then bring all nodes back online in a controlled fashion on Monday at 7 am.
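The weekend power-down idea can be sketched as a simple schedule rule. The node counts and the 07:00–18:00 window are made-up examples standing in for "data center 7"; a real implementation would drive the actual power management tooling from a rule like this.

```python
from datetime import datetime

# Illustrative schedule for a 9-to-5, weekday-only data centre.
# Node counts are invented examples.
FULL_NODES = 40      # all nodes online during business hours
SKELETON_NODES = 4   # minimal availability overnight and at weekends

def nodes_required(now: datetime) -> int:
    """Return how many server nodes should be online at a given time."""
    weekday = now.weekday() < 5          # Mon-Fri
    business_hours = 7 <= now.hour < 18  # spin up at 07:00, wind down after 18:00
    return FULL_NODES if (weekday and business_hours) else SKELETON_NODES

print(nodes_required(datetime(2024, 6, 10, 9)))   # Monday 09:00 -> 40
print(nodes_required(datetime(2024, 6, 8, 12)))   # Saturday     -> 4
```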

Virtualize the application – abstract it into its component parts of workloads, data feeds and user inputs, so that we can move it around the relevant load-balanced platforms on a shared-infrastructure basis.

Virtualize the infrastructure – move towards shared-infrastructure models where I buy the workload, capacity, performance or reliability I need, and pay only for what I use, on the basis of application availability and reliability.
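A back-of-envelope calculation shows why pay-on-use matters for a 9-to-5 application. All prices here are invented purely for illustration:

```python
# Hypothetical comparison of a dedicated server vs. pay-per-use shared
# infrastructure. Both figures are made up for the sake of the example.
DEDICATED_MONTHLY = 2_000.0   # flat cost of owning and powering the node
PER_HOUR = 1.50               # shared-infrastructure rate per busy hour

def shared_cost(busy_hours_per_month: float) -> float:
    """Cost on the shared model: pay only for hours actually used."""
    return busy_hours_per_month * PER_HOUR

# A 9-to-5 weekday application is busy roughly 9 h x 22 days = 198 h/month:
print(shared_cost(198))    # 297.0 - a fraction of the dedicated cost
print(DEDICATED_MONTHLY)   # 2000.0
```

The dedicated box burns power (and money) for the other 500-odd hours a month whether or not anyone is using it; the shared model makes that waste visible.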

Virtualize the storage – set standards to take more data offline as it becomes less needed online. By that I mean: we need to keep, say, 30 days of trade data on the server disks; trade data going back three years is wonderful to have online, but it is an inefficient use of power and storage. At the same time, this means we need a backup and recovery process that:



Runs on time

Is scalable and enables recovery in hours, not the "Michael's at lunch, we've ordered the tapes, sometime next Thursday" kind of timescale

With a working backup, I could move more data offline to cheaper and more energy-efficient storage. It might simply mean tape, or it might mean cheaper disk backups for your last six months of data with everything else on tape – more efficient storage on a per-application basis.
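The 30-day and six-month boundaries mentioned above can be expressed as a simple age-based lifecycle rule. The tier names are assumptions for illustration; a real policy would map to whatever storage products are actually in place.

```python
from datetime import date, timedelta

# Illustrative data-lifecycle rule: recent trade data stays on fast disk,
# older data moves to cheaper disk, and the rest is archived to tape.
# Tier names are made up; the 30-day / 6-month cut-offs come from the text.
def storage_tier(record_date: date, today: date) -> str:
    age_days = (today - record_date).days
    if age_days <= 30:
        return "online-disk"     # last 30 days of trade data
    if age_days <= 182:          # roughly six months
        return "nearline-disk"   # cheaper disk backup
    return "tape"                # everything older goes offline

today = date(2024, 6, 1)
print(storage_tier(today - timedelta(days=10), today))    # online-disk
print(storage_tier(today - timedelta(days=90), today))    # nearline-disk
print(storage_tier(today - timedelta(days=400), today))   # tape
```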

Work with deduplication of data – how much space is taken up on shared storage by user profiles, by static application data, or by copies of Office and other applications kept for user access? It might be operationally convenient, but is this again because it takes too long to rebuild a PC? Without limits on user profiles, we could be copying gigabytes of history, temporary files and user data around the network, data which might get backed up several times along the way.
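Deduplication is usually done by content hashing: identical blocks of data are stored once and referenced many times, however many profiles contain them. A toy sketch of the idea, not any particular product's implementation:

```python
import hashlib

# Minimal content-addressed store: identical data blocks are kept once
# and shared by reference. Purely illustrative.
class DedupStore:
    def __init__(self) -> None:
        self._blocks: dict[str, bytes] = {}

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self._blocks.setdefault(digest, data)  # keep only the first copy
        return digest                          # callers hold the reference

    def stored_bytes(self) -> int:
        return sum(len(b) for b in self._blocks.values())

store = DedupStore()
office_copy = b"x" * 1_000_000        # stand-in for a static application image
for _ in range(50):                   # fifty user profiles, identical content
    store.put(office_copy)
print(store.stored_bytes())           # 1000000, not 50000000
```

Fifty logical copies, one physical copy on disk, and one copy through the backup pipeline instead of fifty.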
