J Wolfgang Goerlich's thoughts on Information Security
Cloud adoption and use

By wolfgang. 30 March 2012 18:14

I am tremendously in favor of virtualization, a staunch proponent of cloud computing, and I’d automate my own life if I could. After all, we dedicated most of last year to investigating and piloting various cloud backup solutions. But take a peek at my infrastructure and you might be surprised. Why is my team still running physical servers? Why are we using so few public resources? And tape, really?

I am not the only one who is a bit behind on rolling out the new technology. Check out this study that came out on Forbes this week. "The slower adoption of cloud ... reflects a greater hesitancy ... remain conservative about putting mission-critical and customer data on the cloud. Regulations ... may explain much of this reluctance. The prevalence of long-established corporate data centers with legacy systems throughout the US and Europe ... may be another factor. Accordingly, the study confirms that overcoming the fear of security risks remains the key to adopting and benefiting from cloud applications."

I have a sense that cloud computing, in the IaaS sense, is roughly where virtualization was circa 2004. It is good for point solutions. Some firms are looking at it for development regions. Now folks are beginning to investigate cloud for disaster recovery. (See, for example, Mark Stanislav's Cloud Disaster Recovery presentation.) These low risk areas enable IT management to build competencies in the team. A next step would be moving out tier 3 apps. A few years after that, the mission-critical tier 1 apps will start to move. This will happen over the next five to eight years.

This logical progression gives the impression that I see everything moving to the cloud. As Ray DePena said this week, "Resist the cloud if you must, but know that it is inevitable." I can see that. However inevitable cloud computing is, like virtualization, it does not fit all use cases.

Why are some servers still physical? In large part, it is due to legacy support. Some things cannot be virtualized or unplugged without incurring significant costs. In some cases, this choice is driven by the software vendor: some support contracts still cover only physical servers. Legacy and vendors aside, some servers went physical because the performance gains outweigh the drawbacks. Decisions, decisions.

The majority of my environment is virtualized and is managed as a private cloud. Even there, however, there are gaps. Some areas are not automated and fully managed due to project constraints. We simply have not gotten there yet. Other areas probably will never be automated. Given how infrequently those events occur, and how little manual work is needed when they do, it does not make sense at my scale to invest the time. This is a conscious decision about where it is appropriate to apply automation.

Why are we not using more public resources? Oh, I want to. Believe me. Granted, I am not keen on spending several weeks educating auditors, which will be necessary until cloud reaches critical mass and the audit bodies catch up. But the real killer is cost. For stable systems, the economics do not make sense. The Forbes article points out that the drivers of public cloud are "speed and agility -- not cost-cutting." My team spent ten months in 2011 trying to make the economics work for cloud backup. Fast forward half a year, and we are still on tape. It is an informed decision based on the current pricing models.

Is cloud inevitable? The progression of the technology most surely is, as is the adoption of the technology in areas where it makes sense. The adoption curve of virtualization gives us some insight into the future. Today, there are successful firms that still run solely on physical servers with direct attached storage. Come 2020, as inevitable as cloud computing is, it is equally inevitable that there will be successful firms still running on in-house IT.

Many firms, such as mine, will continue to use a variety of approaches to meet a variety of needs. Cloud computing is simply the latest tactic. The strategy is striking the right balance between usability, flexibility, security, and economics.

Wolfgang

Side note: If you do not already follow Ray DePena, you should. He is @RayDePena on Twitter and cloudbender.com on the Web.

Tags:

Architecture | Virtualization

Peer Incites next week

By wolfgang. 27 February 2012 07:02

I will be on Peer Incites next Tuesday, March 6th, for a lunch time chat on team management. The talk is scheduled for 12-1pm ET / 9-10am PT.

DevOps -- the integration of software development and IT operations -- is a hot topic these days. In my current role, I took on IT operations in 2008 and software development in 2010. I have been driving the combined team using a value proposition lens: the nexus of passion, skillsets, and business value. Add to this my favorite topic, training and skill hops, and we get a winning mix for leading a productive DevOps team.

I will dig into the nuts-and-bolts next Tuesday. Details are below. Hope you can join us.

Wolfgang

Mar 6 Peer Incite: Achieving Hyper Productivity Through DevOps - A new Methodology for Business Technology Management

By combining IT operations management and application development disciplines with highly-motivating human capital techniques, IT organizations can achieve amazing breakthroughs in productivity, IT quality, and time to deployment. DevOps, the intersection of application development and IT operations, is delivering incredible value through collaborative techniques and new IT management principles.

More details at:
http://wikibon.org/wiki/v/Mar_6_Peer_Incite:_Achieving_Hyper_Productivity_Through_DevOps_-_A_new_Methodology_for_Business_Technology_Management

 

Tags:

Architecture | Out and About | Project Management

Comments on Cloud computing disappoints early adopters

By wolfgang. 4 October 2011 12:01

Symantec surveyed 5,300 organizations to find out how they felt about cloud computing. The standard concerns about security were expressed. There are still no concrete statistics on the difference between the threat exposure of in-house IT and the threat exposure of public cloud IT. The concern about expertise surprises me, however, as managing a cloud environment is only slightly different from managing an enterprise data center. I have a hunch that it may be IT managers protecting their turf by claiming their guys don't have the expertise, but I may be off. So what's going cloud? Backups, security, and other non-business apps. No surprise there. Give it a few more years yet.

"While three out of four organizations have adopted or are currently adopting cloud services such as backup, storage and security, when it comes to the wholesale outsourcing of applications there is more talk than action, Symantec found. Concerns about security and a lack of expertise among IT staff are the main factors holding companies back, according to the survey of 5,300 organizations ..."

Cloud computing disappoints early adopters
http://www.reuters.com/article/2011/10/04/us-computing-cloud-survey-idUSTRE7932G720111004

 

Tags:

Architecture

Private clouds, public clouds, and car repair

By wolfgang. 30 September 2011 15:01

I am getting some work done on one of my cars. I never have any time. I rarely have any patience. And occasionally, I have car troubles. So into the dealership I go.

Every time, I hear from my car savvy friends and coworkers. The dealership takes too long. The dealership costs too much. If there is anything custom or unique about your vehicle, it throws the dealership for a loop.

Sure. Doing it yourself can be faster and cheaper. But only if you have the time, tools, and training. Short of any of these three, the dealership wins hands down. If you are like me, then you have no time and no tools more complex than pliers and a four-bit screwdriver set.

What does this have to do with cloud computing? Well, it provides a good metaphor for businesses and their IT.

Some businesses have built excellent IT teams. Their teams have the time to bring services online, and to enable new business functionality. These are the businesses that equip their IT teams with the tools and provide the training. Hands down, no questions asked, these teams will deliver solutions with higher quality. These IT teams can do it in less time and for less cost.

Other businesses have neglected IT. These are the teams that are told to keep the lights on and maintain dial-tone. Their IT systems are outdated. Possibly, their personnel have outdated skillsets. It makes as much sense for these internal IT teams to take on infrastructure projects as it does for me to change out my transmission. The costs, efforts, and frustration will be higher. The quality? Lower.

These are two ends of the spectrum, of course. Most IT teams are a mix. They are strong in some areas, and weak in others.

I suggest we play to our strengths. Businesses look to enable new functionality. As with car repairs, we can step back and consider. Does our team have the time, tools, and training in this area? What will bring higher quality and lower costs? That’s the way to decide the build versus buy and the private cloud versus public cloud questions.

Tags:

Architecture

Cost justifying 10 GbE networking for Hyper-V

By wolfgang. 21 September 2011 05:13

SearchSMBStorage.com has an article on 10 GbE. My team gets a mention. The link is below and on my Press mentions page.

For J. Wolfgang Goerlich, an IT professional at a 200-employee financial services company, making the switch to 10 Gigabit Ethernet (10 GbE) was a straightforward process. “Like many firms, we have a three-year technology refresh cycle. And last year, with a big push for private cloud, we looked at many things and decided 10 GbE would be an important enabler for those increased bandwidth needs."

10 Gigabit Ethernet technology: A viable option for SMBs?
http://searchsmbstorage.techtarget.com/news/2240079428/10-Gigabit-Ethernet-technology-A-viable-option-for-SMBs


My team built a Hyper-V grid in 2007-2008 that worked rather nicely at 1 Gbps speeds. We assumed 80% capacity on a network link, a density of 4:1, and an average of 20% (~200 Mbps) per vm. In operation, the spec was close. We had a "server as a Frisbee" model that meant non-redundant networking. This wasn’t a concern because if a Hyper-V host failed (3% per year), it only impacted up to four virtual machines (2% of the environment) for about a minute.

When designing the new Hyper-V grid in 2010, we realized this bandwidth was no longer going to cut it. Our working density is 12:1 with our usable density of 40:1. That meant 2.4 Gbps to 8 Gbps per node. Our 2010 model is "fewer pieces, higher reliability" and that translates into redundant network links. This was more important when a good portion of our servers (10-15%) would be impacted by a link failure.
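
To make the sizing concrete, here is a minimal back-of-the-napkin sketch in Python using the numbers above (roughly 200 Mbps per vm and 80% usable capacity per link); the function name is just illustrative.

import math

def links_needed(vm_density, mbps_per_vm=200, link_mbps=1000, usable_fraction=0.8):
    """How many 1 Gbps links a node needs for a given vm density."""
    node_mbps = vm_density * mbps_per_vm        # aggregate bandwidth per node
    usable_mbps = link_mbps * usable_fraction   # assume 80% usable capacity per link
    return math.ceil(node_mbps / usable_mbps)

print(links_needed(4))    # 2007-2008 grid at 4:1 density  -> 1 link
print(links_needed(12))   # 2010 working density of 12:1   -> 3 links
print(links_needed(40))   # 2010 usable density of 40:1    -> 10 links, hence ten primary links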

Let’s do a quick back-of-the-napkin sketch. Traditional 1 Gbps Ethernet would require 10 primary and 10 secondary Ethernet connections. That’s ten dual 1 Gbps adapters: 10 x $250 = $2,500. That’s twenty 1 Gbps ports: 20 x $105 = $2,100. Then there’s the time and materials cost for cabling all that up. Let’s call that $500. By contrast, one dual-port 10 GbE adapter is $700. We need two 10 GbE ports: 2 x $930 = $1,860. We need two cables ($120 each) plus installation. Let’s call that $400.

The total cost per Hyper-V host for 10 GbE is $2,960. Compared to the cost of 1 Gbps ($5,100), we are looking at a savings of $2,140. For higher density Hyper-V grids, 10 GbE is easily cost justified.
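
The same napkin math as a short Python sketch, using the unit prices quoted above; it simply reproduces the $5,100 versus $2,960 totals.

def one_gbps_cost(links=10, dual_nic=250, switch_port=105, cabling=500):
    # ten primary plus ten secondary links: ten dual-port NICs, twenty switch ports
    return links * dual_nic + (2 * links) * switch_port + cabling

def ten_gbe_cost(dual_nic=700, switch_port=930, ports=2, cabling=400):
    # one dual-port 10 GbE NIC, two switch ports, two cables installed
    return dual_nic + ports * switch_port + cabling

print(one_gbps_cost())                   # 5100
print(ten_gbe_cost())                    # 2960
print(one_gbps_cost() - ten_gbe_cost())  # 2140 saved per Hyper-V host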

It took some engineering and re-organizing. We have been able to squeeze quite a bit of functionality and performance from the new technology. Cost savings plus enhancements? Win.

Tags:

Architecture | Systems Engineering | Virtualization

Cisco's new business tablet

By wolfgang. 1 July 2010 03:53

Perhaps another step toward disposable end-point tablet computing. (Wow, that was a mouthful.) I would be interested in piloting the Cisco Cius coupled with VDI.

"Cisco announces that it will be launching an Android-based tablet next year named the Cius, aimed squarely at the business market."

http://feeds.wired.com/~r/wired/index/~3/nvydbm03px8/

Tags:

Architecture

A prediction on cloud computing adoption

By wolfgang. 21 December 2009 09:21

I am making a prediction on how the small-medium business market will adopt cloud computing. That’s a risky business, predicting. But we are rolling into a new year, so the time feels right.

My premise is this: adoption of cloud computing will mirror the adoption of virtualization.

The first wave will be infrastructure: development, test environments, backup, disaster recovery. These are not your line-of-business apps. These are not tier 1 apps. These are the solutions that allow an IT team to cut their teeth and learn with minimal risk to the organization’s mission.

The second wave will be point solutions. These are IT solutions to business problems. Need a workflow app? Need a point-and-click reporting solution? Turn to the cloud. These can be considered tier 3, maybe even tier 2. Still, these are not line-of-business apps. These solutions allow an IT team to add value in the business with their cloud savvy knowledge.

This will inevitably lead to a wide range of technologies and vendors. Someone will call this cloud sprawl, and that will set off the third wave: consolidation of existing solutions under one cohesive framework. At that point, the bumps will be smoothed over. The technology will be proven. IT teams and businesses will then seek to move tier 1 and line-of-business software to the cloud.

The time frame for this shift will be 3-5 years. My thought is this will play out like other game changing IT solutions. The pace will be set by the organization weighing cost savings against risk. The vanguard will be IT teams that progress thru the first and second waves, before the third wave comes and swamps the ship.

For IT teams, the trick is to build in-house expertise while staying cost competitive with public cloud solutions. For businesses, the trick is to ensure that IT solutions are pursued based on cost savings and value propositions, rather than on hype. For everyone involved, this will be an interesting 3-5 years.

Tags:

Architecture

Disposable end-point model

By wolfgang. 26 May 2009 03:52

One project in my portfolio at the moment is building what I call a disposable end-point model. It is a low priority project, but an ongoing one. The goal is to deliver the best user experience at the lowest price-point.

Portability is a must. Think about the concerns over swine flu and the like. What is your pandemic plan? My pandemic plan, at least from a technology standpoint, is straightforward. People work from home over the VPN and run apps from Citrix. So the end-point devices must be portable and dual-use.

Yet traditional notebooks are expensive. My firm, like most, has an inventory of aging notebooks. These older computers are costly to maintain (studies show ~$1K per device every two years) and costly when lost or stolen (studies show ~$50K per incident).

The sweet spot is computers that are cheaper than supporting aging devices and disposable if lost or stolen. No local data means no security incident, which erases the risk exposure of stolen devices. These inexpensive computers should be lightweight and easily carried from office to home. So I am looking at netbooks, which run around $500.
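
To illustrate the trade-off, here is a rough annualized sketch in Python. The maintenance, incident, and netbook prices are the figures above; the 5% annual loss-or-theft rate and the two-year netbook lifespan are hypothetical assumptions I am plugging in purely for illustration.

def aging_notebook_cost_per_year(annual_loss_rate=0.05):
    maintenance = 1000 / 2                          # ~$1K per device every two years
    expected_incident = annual_loss_rate * 50000    # ~$50K per lost or stolen device incident
    return maintenance + expected_incident

def netbook_cost_per_year(annual_loss_rate=0.05, lifespan_years=2):
    hardware = 500 / lifespan_years       # ~$500 netbook, amortized over a hypothetical lifespan
    replacement = annual_loss_rate * 500  # no local data, so a loss only costs the hardware
    return hardware + replacement

print(aging_notebook_cost_per_year())  # 3000.0
print(netbook_cost_per_year())         # 275.0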

I spoke with Jeff Vance, Datamation, about these ideas. He recently wrote an excellent article that summarizes the netbook market and how data center managers are looking to use the devices: Will Desktop Virtualization and the Rise of Netbooks Kill the PC?

Tags:

Architecture | Security | Virtualization

Open Up and Lock Down

By wolfgang. 13 March 2009 04:31

Today's networks balance opening up with locking down. The classic perimeter model, with a single access gateway protected by a firewall, is quickly disappearing. All end-points should now run their own firewalls. All hosts (particularly high-value servers) should now be bastion hosts. Access across the network should be locked down by default, and then opened up only for particular services.
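
As a conceptual sketch of that default-deny posture, consider a tiny Python model where every flow is blocked unless it matches an explicit allow rule; the networks and ports below are made-up examples, not anyone's real rule set.

from ipaddress import ip_address, ip_network

# Explicit allow list: (source network, destination network, destination port)
ALLOWED_FLOWS = [
    (ip_network("10.1.0.0/24"), ip_network("10.2.0.10/32"), 443),   # web tier to app server, HTTPS
    (ip_network("10.2.0.0/24"), ip_network("10.3.0.20/32"), 1433),  # app tier to database, SQL
]

def is_allowed(src, dst, port):
    """Return True only for flows that were explicitly opened up; deny everything else."""
    for src_net, dst_net, allowed_port in ALLOWED_FLOWS:
        if ip_address(src) in src_net and ip_address(dst) in dst_net and port == allowed_port:
            return True
    return False  # locked down by default

print(is_allowed("10.1.0.5", "10.2.0.10", 443))   # True: explicitly opened up
print(is_allowed("10.1.0.5", "10.3.0.20", 1433))  # False: locked down by default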

I think we see this change reflected in several trends. The ongoing shift toward detection controls over defensive controls comes from modern networks having a significantly broader attack surface. Last year's focus on end-point security was about making computers bastion hosts. Risk management and governance are hot topics now, and they seek to understand and protect business networks in their entirety, end-to-end.

I can only use my own firm as an example. We have some 17 dedicated connections coming in from partners and exchanges. We have five inter-office connections. We have six perimeter firewalls, or seven if you include the Microsoft ISA server. All servers are running a host firewall and are locked down. All this so we can gain access to the resources of partners and vendors, and to provide resources to partners and clients. And this is in a relatively small company with fewer than 200 employees. Imagine the complexity of mid-sized and enterprise networks.

Open Up. Collaborate and succeed. Lock Down. Secure and protect.

J Wolfgang Goerlich


The eroding enterprise boundary: Lock Down and Open Up
http://www.theregister.co.uk/2009/03/12/eroding_enterprise_boundary/

IBM Security Technology Outlook: An outlook on emerging security technology trends.
ftp://ftp.software.ibm.com/software/tivoli/whitepapers/outlook_emerging_security_technology_trends.pdf

Tags:

Architecture | Security

Security is Design

By wolfgang. 2 January 2009 14:05

Welcome to 2009, and welcome back to my blog. This year’s focus is on using network architecture to create information security.

I come to this after reading some reports from Gartner Group: Three Lenses Into Information Security; Classifying and Prioritizing Software Vulnerabilities; and Aligning Security Architecture and Enterprise Architecture: Best Practices.

The first report posits that designing or architecting security is one of three lenses thru which to view InfoSec (the other two being process-focused and control-focused). Why this emphasis on architecture? The primary reason is that most vulnerabilities are not within the software itself, but within your implementation of it.

“Gartner estimates that, today, 75% of successful attacks exploit configuration mistakes.” Furthermore, few of us have the skills, time, and license to modify the software to address the remaining 25% of the vulnerabilities. Thus the largest positive impact an InfoSec professional can have on security is thru planning and architecting the system design.

The secondary reason is that retrofitting system architectures with security after the fact is time intensive and service invasive. It often requires stopping work during the change implementation. It may require altering the work after implementation. This has a tangible cost. Gartner puts it thusly: “The careful application of security architecture principles will ensure the optimum level of protection at the minimum cost.”

The bottom line is that emphasizing security architecture in the original design minimizes costs and vulnerabilities.

Tags:

Architecture | Security
