The backbone of the corporate IT system is the network, which has become more important than ever in an increasingly connected world.
Research findings indicate that, perhaps surprisingly, corporate networks will both expand and consolidate in the year ahead. Far from being at odds with one another, these two concepts will have to work in unison.
Networks will expand in scope due to their increasing sophistication and the need to support an array of next-generation applications - such as data and video applications that reach across the divides between television, personal computers and mobile devices.
And networks will consolidate due to the increasing need to enhance efficiencies and drive down costs.
Virtual trickery
Among the technologies supporting these trends is 'virtualisation' which has been applied to many different aspects of computing - from individual components to entire networks.
Virtualisation has been described as a technique for hiding the physical characteristics of computing resources from the way in which other systems, applications or end-users interact with those resources.
From a technical perspective, virtualisation creates an external interface that hides an underlying implementation. It achieves this by multiplexing access, by combining resources at different physical locations, or by simplifying a control system.
For example, virtualisation is able to make a single physical resource (such as a server, an operating system, an application or storage device) appear to function as multiple logical resources.
Conversely, it can also make multiple physical resources (such as storage devices or servers) appear as a single logical resource.
And virtualisation can also make a single physical resource appear as a single logical resource, but with somewhat different characteristics.
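To make these mappings concrete, the toy sketch below (in Python, with purely hypothetical names - no particular virtualisation product is implied) shows several physical disks being aggregated into one logical pool, which is then carved into multiple logical volumes that hide the physical layout from their consumers.

# Illustrative only: a toy model of the many-to-one and one-to-many
# mappings described above. All names are hypothetical.

class PhysicalDisk:
    """A single physical resource with a fixed capacity (in GB)."""
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb

class VirtualPool:
    """Aggregates many physical disks into one logical pool (many-to-one),
    then carves that pool into logical volumes (one-to-many)."""
    def __init__(self, disks):
        self.capacity_gb = sum(d.capacity_gb for d in disks)
        self.volumes = {}          # logical volumes presented to consumers
        self.allocated_gb = 0

    def create_volume(self, name, size_gb):
        # Consumers see a named logical volume; the physical layout is hidden.
        if self.allocated_gb + size_gb > self.capacity_gb:
            raise ValueError("pool exhausted")
        self.volumes[name] = size_gb
        self.allocated_gb += size_gb

# Two physical disks appear as one 2TB pool, then as three logical volumes.
pool = VirtualPool([PhysicalDisk("disk0", 1024), PhysicalDisk("disk1", 1024)])
for vol in ("web", "db", "logs"):
    pool.create_volume(vol, 512)
print(pool.volumes)   # {'web': 512, 'db': 512, 'logs': 512}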
Sharing savings
Against this background, virtualisation is driving the consolidation of hardware devices in a logical environment known as the 'real-time infrastructure'. This is a significant step towards reducing the amount of hardware on the network.
The benefit is a reduction in the management overhead and cost associated with supporting the many devices previously required.
We can expect the network of the future to coalesce around shared services, the re-use of services and moves towards improved, streamlined security, underpinned by segmentation of the processes and data used by those shared services.
The network of the future will also feature significantly more powerful monitoring, management and security services.
Security worries
Virtualisation will have a massive impact on network security.
Virtualisation's detractors argue that reducing the number of devices on the network could increase the security risk, because potential hackers would need to 'crack' fewer devices.
In fact, the opposite is true: protecting one device properly is far easier than protecting many devices.
But security concerns run much deeper than this.
In order to grasp the problems, it's important to understand that security works differently in an on-demand, virtualised world - one that static, inflexible physical security architectures cannot cope with.
Are organisations going to have to leap security hurdles on their way to realising their virtualisation goals, or are they going to find new, more dynamic paths to total security?
In a virtualised world, even the definition of 'the network' is different, and requires new and innovative defences.
For example, most network defences are built on the basis that they will be able to 'see' traffic which can be scanned and filtered for malicious content from a variety of perspectives - such as packet comparison or packet behaviour.
What if the traffic can't be seen? How then can remedial action be taken?
One suggestion is that if the traffic cannot be seen, a network-style monitoring approach must be implemented within the virtualised server itself. In other words, inter-process communications must be monitored from within the virtual machines, or between virtual machines in an infrastructure that spans multiple physical machines on the network.
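By way of illustration only - the names below are hypothetical and no specific hypervisor or virtual-switch product is implied - the following Python sketch shows why: VM-to-VM frames on the same host only ever pass through the virtual switch, so that is where scanning and filtering has to happen.

# Illustrative only: a toy 'virtual switch' showing why inspection must sit
# inside the virtualised host - VM-to-VM frames on the same server never
# reach the physical network. All names are hypothetical.

SIGNATURES = [b"malicious-payload"]   # stand-in for real content signatures

class VirtualSwitch:
    def __init__(self):
        self.vms = {}                 # vm_name -> delivery callback

    def attach(self, vm_name, deliver):
        self.vms[vm_name] = deliver

    def send(self, src, dst, frame: bytes):
        # This is the only point where intra-host traffic is visible,
        # so scanning and filtering has to happen here.
        if any(sig in frame for sig in SIGNATURES):
            print(f"dropped {src} -> {dst}: matched signature")
            return
        self.vms[dst](frame)

vswitch = VirtualSwitch()
vswitch.attach("vm-db", lambda f: print("vm-db received", f))
vswitch.send("vm-web", "vm-db", b"SELECT 1")            # delivered
vswitch.send("vm-web", "vm-db", b"malicious-payload!")  # dropped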
Adapt and evolve
In practical terms, some accepted security concepts will have to be abandoned. For instance, an IP address will no longer identify a server, because servers can be redeployed on the fly to a different sub-network.
In this case, the challenge is to provide a form of dynamic virtual security to dynamically allocated virtual servers.
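One way to picture such dynamic virtual security is sketched below in Python; the names are hypothetical, but the idea is that rules are keyed to a virtual machine's identity rather than its IP address, so they follow the machine when it is redeployed.

# Illustrative only: security policy keyed to a virtual machine's identity
# rather than its IP address, so rules follow the VM when it is redeployed
# to a different sub-network. All names are hypothetical.

class DynamicPolicy:
    def __init__(self):
        self.rules = {}        # vm_id -> set of allowed ports
        self.addresses = {}    # vm_id -> current IP address

    def allow(self, vm_id, port):
        self.rules.setdefault(vm_id, set()).add(port)

    def vm_moved(self, vm_id, new_ip):
        # Called when the VM is redeployed; its rules stay with it.
        self.addresses[vm_id] = new_ip

    def is_allowed(self, vm_id, port):
        return port in self.rules.get(vm_id, set())

policy = DynamicPolicy()
policy.allow("crm-server", 443)
policy.vm_moved("crm-server", "10.1.0.12")   # initial placement
policy.vm_moved("crm-server", "10.2.0.87")   # redeployed to a new subnet
print(policy.is_allowed("crm-server", 443))  # True - the rule followed the VM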
Key to security virtualisation is making the security infrastructure (hardware and software) adaptable and sophisticated enough to operate successfully within a virtualised environment.
More importantly, just as virtualisation will facilitate changes to the network to mirror business requirements at any given time, so resource allocation must be flexible enough to allow security systems to keep pace with these changes and evolve to meet specific threats as they are identified.
* Andy Robb is CTO of Duxbury Networking.