What's in the box Rodos? You can guess, but I am not gonna tell you.
Everyone loves new toys to play with.
A day does not go by where we don't learn something new. I love the "What I learned today" posts that Mike Laverick has been doing.
So here is what I learnt today, the right order for DC tiers.
Some details to show the differences:
Tier 1 - Basic data center with no redundancy
- Susceptible to disruptions from both planned and unplanned activity
- May or may not have a raised floor, UPS or generator
- Likely to be shut down to perform preventive maintenance
- Annual downtime of 28.8 hours
Tier 2 - Single distribution path with redundant components
- Less susceptible to disruption from both planned and unplanned activity
- Single path for power and cooling distribution, includes redundant components (N+1)
- Includes raised floor, UPS and generator
- Annual downtime of 22 hours
Tier 3 - Concurrently maintainable
- Enables planned activity without disrupting computer hardware operation, but unplanned events will still cause disruption
- Multiple power and cooling distribution paths but with only one path active, includes redundant components (N+1)
- Includes raised floor and sufficient capacity and distribution to carry the load on one path while performing maintenance on the other
- Annual downtime of 1.6 hours
Tier 4 - Fault tolerant
- Planned activity does not disrupt the critical load, and the data center can sustain at least one worst-case unplanned event with no critical load impact
- Multiple active power and cooling distribution paths, includes redundant components (2(N+1)), such as two UPS systems each with N+1 redundancy
- Annual downtime of 0.4 hours
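For context, those downtime figures correspond to the availability percentages usually quoted alongside the tier definitions. A quick sketch (my own arithmetic, assuming a 365-day year of 8,760 hours) converts between the two:

```python
HOURS_PER_YEAR = 365 * 24  # 8760 hours

def availability(downtime_hours):
    """Availability percentage implied by a given annual downtime."""
    return 100.0 * (1 - downtime_hours / HOURS_PER_YEAR)

# Annual downtime hours for tiers 1 through 4
for tier, downtime in [(1, 28.8), (2, 22.0), (3, 1.6), (4, 0.4)]:
    print("Tier %d: %.1f hours down = %.3f%% available"
          % (tier, downtime, availability(downtime)))
```

Running it gives roughly 99.671% for Tier 1 up to 99.995% for Tier 4.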
Did you think VMware and the Cloud could help you meet your Data Center Security certifications? Well they can.
I have been reading through the ISO 27001 / ISO 27002 standards on data security. Okay, so some of us don't really have a life. Am I allowed to say it's really interesting reading?
So I am reading along and get to section A.10.3 of ISO 27002, which is all about minimising the risk of systems failure through capacity planning and system acceptance.
Control A.10.3.1 is all about capacity management:
The standard requires the organisation to monitor its capacity demands and then to make projections of future capacity requirements so that it can ensure that it has adequate power and data storage facilities available. The utilization of key system resources (file servers, domain servers, e-mail server, printers and other output devices) should be monitored so that additional capacity can be brought on-stream when it is needed. The projections should obviously take account of predictions of levels of business activity, and there should therefore be an overt link between this activity and the annual business planning cycle. The trends that should be considered are the increase in business activity, and therefore in transaction processing; [...] E-commerce businesses should also consider the expected increase in website activity and plan sufficient capacity to ensure that the site remains operational, particularly at times of peak activity.
Interesting. One can see how the elastic and scalable aspects of VMware and vCloud could go a long way towards organisations being able to show a capacity to mitigate this risk and therefore achieve compliance.
All of this should enable network managers and webmasters to identify and avoid potential bottlenecks that could threaten system security or the availability of network or system resources or data.
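The kind of projection the control calls for can be as simple as a trend line over recent utilisation. As a toy illustration (my own sketch, not anything from the standard or the book), a least-squares fit over monthly storage usage estimates when a fixed capacity will be exhausted:

```python
def forecast_exhaustion(usage, capacity):
    """Fit a least-squares line to equally spaced usage samples and
    return the (fractional) period index at which the fitted trend
    reaches capacity, or None if usage is flat or shrinking."""
    n = len(usage)
    mean_x = (n - 1) / 2.0
    mean_y = sum(usage) / float(n)
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(usage))
    den = sum((x - mean_x) ** 2 for x in range(n))
    slope = num / den
    if slope <= 0:
        return None  # no growth trend, so no projected exhaustion
    intercept = mean_y - slope * mean_x
    return (capacity - intercept) / slope

# Six months of storage use (TB) against a hypothetical 100 TB array;
# the fitted trend crosses 100 TB between month 13 and 14.
months = forecast_exhaustion([40, 44, 49, 53, 58, 62], capacity=100)
```

Real capacity planning would of course fold in the business-activity predictions the standard talks about, but even this much gives the "overt link" to a planning cycle something to work with.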
The Cloud delivering security compliance, now that's an idea!
[Quote from : IT Governance : A Manager's Guide to Data Security and ISO27001/ISO27002 by Alan Calder & Steve Watkins, 4th edition, p175-176]
Over the last six months I have been investing a bit of time every now and again in improving my working methods. If I can do things better and faster then I can start doing better things with my time. Today I wanted to write up one of them: a way to save me much time and frustration each day processing email.
The solution was to write a macro to go off and file emails I have read into my mail folder tree. The key is that it's dynamic and works out where to put messages based on the sender. Sweet.
Some background. Everyone has a different method of handling their email. Some people think they have no method; that just means they have a bad one. My method is that an email stays in my inbox until it's processed and can go to archive. I try to use a GTD method, so most don't stay there for long. The exception to this rule is a folder under my Inbox called Process, where I move messages that just need some mindless activity that I must do at some stage; I trawl this folder every few days when I have a short period of time to kill.
Now I also never delete emails unless they are really spam, advertising or wrongly addressed; it's amazing how that email you never thought important becomes critical after six months. Many people are like me and file mail away in folders that categorise them. A folder tree works great and enables you to filter mail and perform searches. I have folders for each of the companies that I deal with and for colleagues. This method was great back in the day when the search function in Outlook was slow and painful; it really helped to speed things up.
However I find I spend a lot of time filing messages, there are probably 80 or so folders and even though there are a few speed tricks for moving messages they never really make it truly effective. As you get more and more messages a day that need a scan and file, this time really adds up.
I figured there are three possible solutions to the problem.
- First, you could pre-filter messages using Outlook rules. Many people do this, but to find messages you then need to walk the tree looking for unread ones, or use the new search function to list all unread messages. That's all too messy and does not work well when moving between my BlackBerry and Outlook.
- The second option is to just dump everything into a very small set of folders, then use search to categorise. This is close but I am not ready to give up my folder structure just yet.
- The last option is to automate it. I figured it was a common task and should be simple.
There are quite a few example macros on the Internet for moving an email to another folder. Most don't work and none were dynamic. So here is what the macro I wrote does.
Take the selected message and determine a destination folder name based on the sender. Walk the Mail folder tree to find a folder that matches the destination folder name, then mark the message as read and move it there. To determine the destination folder name I use two methods. If it's an Exchange message I use the first and last name combination. If it's an SMTP message, that is external, I pull the domain name. Some external messages I actually want to file by name, so for those specific domains I have it use the first and last name instead.
Now all I need to do is make sure there is a folder somewhere within my tree that matches the domain name or the user name. If it can't find a match the message just stays in the Inbox. It works great!
I created a button with a keyboard shortcut, so now as I read my Inbox, if I don't need to do anything further on a message I just hit Alt-X and it disappears into the folder tree. When the first email from a company or internal person arrives, I just create a folder wherever is appropriate in my folder tree. For the messages filed by domain name I create parent folders that group them, for example Customers and Suppliers. I can move the folders around and it all still works.
Here is the code as it stands today.
Const olFolderInbox = 6

' File the selected message(s) into a matching folder in the mailbox tree.
' (The Sub name is arbitrary - bind it to whatever button or shortcut you use.)
Sub FileEmail()
    Dim objOutlook As Outlook.Application
    Dim objNamespace As Outlook.NameSpace
    Dim objInbox As Outlook.MAPIFolder
    Dim objMailbox As Outlook.MAPIFolder
    Dim objItem As Outlook.MailItem
    Dim FolderToSendTo As String
    Dim strFolderName As String
    Dim atLoc As Long
    Dim DomainName As String

    Set objOutlook = CreateObject("Outlook.Application")
    Set objNamespace = objOutlook.GetNamespace("MAPI")
    Set objInbox = objNamespace.GetDefaultFolder(olFolderInbox)
    strFolderName = objInbox.Parent
    Set objMailbox = objNamespace.Folders(strFolderName)

    ' Require that this procedure be called only when a message is selected
    If objOutlook.ActiveExplorer.Selection.Count = 0 Then Exit Sub

    For Each objItem In objOutlook.ActiveExplorer.Selection
        If objItem.Class = olMail Then
            FolderToSendTo = ""
            ' SenderEmailType is "EX" or "SMTP"
            If objItem.SenderEmailType = "SMTP" Then
                ' External message - determine the domain name to file it under
                atLoc = InStr(objItem.SenderEmailAddress, "@")
                DomainName = Right(objItem.SenderEmailAddress, Len(objItem.SenderEmailAddress) - atLoc)
                ' Specific domains are filed by sender name rather than domain
                If LCase(DomainName) = "vmware.com" Then
                    DomainName = Left(objItem.SenderEmailAddress, atLoc - 1)
                    DomainName = Replace(DomainName, ".", " ") ' Replace dots with spaces
                    DomainName = Replace(DomainName, "'", "")  ' Strip quotes
                End If
                FolderToSendTo = DomainName
            ElseIf objItem.SenderEmailType = "EX" Then
                ' Internal Exchange message - file by sender name
                FolderToSendTo = objItem.SenderName
            Else
                MsgBox "Don't know what to do, unknown SenderEmailType (not EX or SMTP)", vbOKOnly + vbExclamation, "OOPS"
            End If

            If FolderToSendTo <> "" Then
                ' MsgBox "Will look for folder " + FolderToSendTo, vbOKOnly, "INFO"
                FindFolder objMailbox, objItem, FolderToSendTo
            End If
        End If
    Next

    Set objItem = Nothing
    Set objInbox = Nothing
    Set objMailbox = Nothing
    Set objNamespace = Nothing
End Sub

' Recursively walk the folder tree under oFolder looking for a folder whose
' path ends with theFolderToFind, then mark the message read and move it there.
Sub FindFolder(oFolder As Outlook.MAPIFolder, theMessage As Outlook.MailItem, theFolderToFind As String)
    Dim theFolders As Outlook.Folders
    Dim iFolder As Outlook.MAPIFolder

    Set theFolders = oFolder.Folders
    ' Check if there are any folders below oFolder
    If theFolders.Count Then
        For Each iFolder In theFolders
            If theFolderToFind <> "" Then
                ' Debug.Print iFolder.FolderPath + " ^ " + theFolderToFind
                If InStr(LCase(iFolder.FolderPath), "\" + LCase(theFolderToFind)) Then
                    ' Found a match - move it to the final location!
                    theMessage.UnRead = False
                    theMessage.Move iFolder
                    theFolderToFind = "" ' Clearing this (passed ByRef) stops the rest of the walk
                Else
                    FindFolder iFolder, theMessage, theFolderToFind
                End If
            End If
        Next
    End If
End Sub
I have not done any programming for years (after doing it for the first 10 years of my career), so this was a fun exercise. There are a few tweaks I am thinking of making, but this should give people a good starting place. If I make any big changes I will update the code here.
You could go to town on the algorithm for deciding which folder to put messages in; that can be left as an exercise for the reader. If you come up with anything interesting or find this useful, post in the comments.
More documentation is coming out for UCS on the Cisco sites. As mentioned by Steve the other day, a lot of the documentation is now becoming available.
Further to that, I notice that the "Release Notes for Cisco UCS Release 1.0(1e)" are out. You will need a CCO login to read these, unlike the public documentation. It looks like some documents go into both locations and others do not; I can't see any rhyme or reason as to why yet. Of course then there are the documents in the Partner Resource Center. Reminds me of my post "Does VMware have too many locations for technical materials?"
The release notes for UCS have all sorts of juicy details about little things that don't work at the moment. The workarounds to some of these are funny, as is often the case. Some examples.
Symptom : When several KVM Consoles are launched, the SUN JRE sometimes reports an error and the KVM Console fails to launch.
Workaround : Launch the KVM Console again.
Symptom : The vNIC templates are not exported when you backup all system and logical configuration settings (the All Configuration option).
Workaround : Create the vNIC templates after you import the configuration.
Symptom : On a system with five or more chassis, the following sequence of events causes the system to not be HA ready for up to five minutes:
1. Discover all chassis
2. Wait for HA READY
3. Decommission all chassis
4. Recommission all chassis
Workaround : Wait for HA READY.
I will keep my UCS Resources page updated with the pertinent links.
Could the Google OS become the platform for delivering Desktop As A Service (DaaS) or VDI from the Cloud?
Google have announced Google Chrome OS, which is to be the OS users have been waiting for.
Google Chrome OS is being created for people who spend most of their time on the web, and is being designed to power computers ranging from small netbooks to full-size desktop systems.
Providing desktops from the cloud is currently fraught with Microsoft licensing restrictions. It's worth looking into how the Google OS could move beyond these problems as well as provide an OS that may scale and perform better.
Thin clients may not have the grunt over their required 5-year lifetime to run the Google OS directly, but executed out of the cloud via emerging protocols such as PCoIP, you could really see this fly.
Even if all it achieved was Microsoft relaxing their licensing, it would be a good thing for the cloud industry.
Certainly something to keep an eye on over the next 12 months.
Cisco are running their annual conference, Cisco Live, in San Francisco, and it's their 20th birthday. It's huge, with around 10,000 people I think, and it's up there with VMworld as one of the interesting events of the year.
Some great bloggers are doing some daily video summaries which you can see at the Cisco DCN blog. They have Omar, Urquhart, the Hoff, Chad et al giving their views.
If you do the free registration for Cisco Live Virtual you can watch many of the sessions on demand, including the keynotes. I just finished watching John Chambers do his usual preacher-style session, which everyone seems to love.
Tomorrow is Padmasree Warrior, Cisco's chief technology officer. Like the contrast between Paul Maritz and Stephen Herrod at VMworld, the tag team between the business and the technical, we should see the difference between John and Padmasree.
However it looks like Padmasree has already done the analyst briefings on her session, as there have been many reports out today on where Cisco are heading with the cloud.
Just some of them are
- Cisco cuddles all clouds but one
- Cisco Won't Take on Amazon in Cloud
- Cisco’s Cloud: Services, Not Infrastructure
- Cisco Launches Services, Shows Off Its Hit List
For the rest of the cloud they are going to continue as they have always done: being an arms dealer to the rest of the industry. Cisco see the Unified Computing System (UCS) as one of the key infrastructures for carriers and hosting companies to build out their clouds. Just as Cisco cleaned up providing kit during the Internet boom of the late 90s and the Dot-com bubble of the early part of this century, they are going to do it again in the cloud space. It is much like visiting the historical mining towns, where you hear about the poor miners who did it hard whilst the merchants selling them the shovels and pans cleaned up.
It will be interesting to watch the full presentation from Padmasree when it's available tomorrow. It's sure to be cloudy and worth watching.
P.S. One session that was on today which would have been great to review is James Urquhart on the Intercloud, but it's not available for free attendees. Oh well.
ITMATO-2601 : Achieving the Intercloud: Trust and Interoperability in Federated Cloud Computing Markets
The role of the Internet in the future of Cloud Computing is one of today's unanswered questions. While much of the conversation has been about "speeds and feeds", the need for network and computing services to enable evolution from siloed clouds to an open and federated "Intercloud" is just beginning to be understood. This talk will explore the concept of the "Intercloud", and the importance of trust and open interfaces to those who leverage it. Other topics include a roadmap from static data center architectures to truly dynamic "intercloud" architectures, as well as what enterprises can do today with virtualization and cloud technologies to prepare for that transition.
James Urquhart, Market Manager, Cloud Computing and Virtualization - Cisco
- With over 20 years working in the IT industry I have had varied sub careers. My first decade was as a programmer, developing applications whilst working and living in Asia. There was the obligatory dotcom involvement in a fun start up. Working in the SI space I loved being able to work at integrating many various technologies and solving a wide variety of IT problems. Falling in love with server virtualization caused me to become involved in Cloud Computing which became a great passion due to how much it could help IT do greater things. Today I spend my time assisting a large team of Solutions Architects across A/NZ at Amazon Web Services. Just like everyone at Amazon I enjoy working hard, try to have some fun and hope to be a small part of making history.