What the h#@k is Software Defined anyway?

Published on: 28 July 2015
  • By Loïc Calvez

Software Defined reminds me of Cloud in its early days: everybody talks about it, but no one is saying the same thing! So I decided to try to shed some light on the topic today. When I ask that question, I often get the generic definition: “the separation of the data plane from the control plane” or, even simpler: “the virtualization of everything else”. In some ways that’s true, but what does it mean?

First, let’s go back to a place we understand better: compute virtualization. What it did for us: it allowed us to dissociate the VMs/workloads from the underlying hardware, which brought a ton of benefits: simpler hardware upgrades, better hardware utilization, reductions in operating costs, flexibility of operations… It also brought some constraints: software vendor lock-in (early on, less now) and increased complexity. More fundamentally, it allowed us to manage our environments differently: we could focus more on what we were running and less on where we were running it.

Now the same thing is happening with storage and networks (and, according to some vendors, with everything else (cooling, power…), but I won’t go there today). More and more “decisions” are moving from the hardware to the software. For example, on the network side, a decision that needs to be made about a data packet (for example: is it allowed to flow there?) is now made by a central control instead of being decided locally by the hardware device. The device still executes the “order” (it may even cache it for a while and not ask the control body again), but the “intelligence” is no longer in the hardware device. This is the separation of the control plane [decision] from the data plane [execution]. The virtualization angle comes from the fact that the impact of the underlying hardware itself is greatly diminished, since many Software Defined “control” solutions are fairly hardware agnostic (they only need it to meet a certain set of standards).
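
To make the split concrete, here is a minimal Python sketch (purely illustrative, not any vendor’s product): the controller owns the policy and makes the decision, while the switch only executes the order and caches it for a while, just like the packet example above.

```python
# Minimal sketch of control/data plane separation (illustrative only).
# The controller makes the decision; the switch executes and caches it.
import time

class Controller:
    """Central control plane: owns the policy, makes the decisions."""
    def __init__(self, allowed_flows):
        self.allowed_flows = allowed_flows  # e.g. {("10.0.0.5", "10.0.1.9")}

    def decide(self, src, dst):
        return (src, dst) in self.allowed_flows

class Switch:
    """Data plane: executes orders and caches them, but holds no 'intelligence'."""
    def __init__(self, controller, cache_ttl=30):
        self.controller = controller
        self.cache_ttl = cache_ttl
        self.cache = {}  # (src, dst) -> (verdict, expiry)

    def handle_packet(self, src, dst):
        entry = self.cache.get((src, dst))
        if entry and entry[1] > time.time():
            return entry[0]                          # cached order, no round-trip
        verdict = self.controller.decide(src, dst)   # ask the control plane
        self.cache[(src, dst)] = (verdict, time.time() + self.cache_ttl)
        return verdict

controller = Controller(allowed_flows={("10.0.0.5", "10.0.1.9")})
switch = Switch(controller)
print(switch.handle_packet("10.0.0.5", "10.0.1.9"))  # True  (allowed)
print(switch.handle_packet("10.0.0.5", "10.0.2.7"))  # False (denied)
```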

Quick recap of the “what it is”: the decisions on what happens are now made outside of the hardware (and to be really clear, you still need good hardware; it can just be less intelligent).

Why

This is the big question: why would someone want to embrace Software Defined? There are a lot of great reasons to look into it! I will explain some of the more common ones:

First, we are now able to make decisions based on a larger set of data. Instead of deciding within a single device whether an action can be taken, we can now make the decision based on the entire environment (for example, how to route a packet from A to Z, or whether user X should have access to this data).
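
As a toy illustration of deciding with the whole environment in view (the topology and node names below are made up), a central controller can compute an end-to-end path from A to Z because it sees the entire graph, something no single device can do on its own:

```python
# Illustrative sketch: with a global view, the controller can pick an
# end-to-end path from A to Z, instead of each device guessing locally.
from collections import deque

topology = {            # example topology, invented for illustration
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["Z"],
    "E": ["Z"],
    "Z": [],
}

def shortest_path(graph, src, dst):
    """Breadth-first search over the whole topology (the 'larger set of data')."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_path(topology, "A", "Z"))  # ['A', 'B', 'D', 'Z']
```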

Second, automation gets a lot easier in a Software Defined world. There are two main reasons for this:

  • Software Defined platforms were designed from their inception to allow integrations, so [standard-ish] APIs exist to control their behaviours and decisions (most legacy hardware devices did not allow this; at best they were CLI driven, not API driven, which made them much harder to automate)
  • With a central control, you have fewer devices to interact with, so programming is easier. If I want to allow a new IP in a new VLAN to communicate with another one, I can create that VLAN/IP/rule once instead of having to publish it on dozens of devices from multiple vendors (see the sketch after this list).
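
Here is a hedged sketch of what that single interaction could look like; the endpoint, payload shape and token are hypothetical placeholders, not a specific vendor’s API:

```python
# Hedged sketch: push one intent to a central controller instead of
# configuring dozens of devices. URL, payload and token are hypothetical.
import requests

CONTROLLER = "https://sdn-controller.example.local/api/v1"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>"}               # placeholder credential

intent = {
    "vlan": 120,
    "subnet": "10.1.20.0/24",
    "rule": {
        "source": "10.1.20.0/24",
        "destination": "10.1.30.0/24",
        "action": "allow",
    },
}

# One API call; the controller pushes the resulting config to every device it manages.
resp = requests.post(f"{CONTROLLER}/intents", json=intent, headers=HEADERS, timeout=10)
resp.raise_for_status()
print("Intent accepted:", resp.json())
```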

Third: because of all this, we can do things that were previously impractical. For example, network micro-segmentation of all workloads. Managing this in a changing Hardware Defined environment quickly becomes a nightmare, but in a Software Defined world it becomes feasible, since new rules can be created centrally (automatically) and enforced dynamically.
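
To illustrate why central control makes this feasible (the workload names, tiers and addresses are invented), the segmentation rules can be generated from an inventory and a small intent instead of being hand-maintained on every device:

```python
# Illustrative sketch: micro-segmentation rules generated centrally from an
# inventory, then pushed by the controller. Default deny, explicit allows.

workloads = [
    {"name": "web-01", "ip": "10.1.10.11", "tier": "web"},
    {"name": "app-01", "ip": "10.1.20.11", "tier": "app"},
    {"name": "db-01",  "ip": "10.1.30.11", "tier": "db"},
]

# Policy intent: web may talk to app, app may talk to db, everything else is denied.
allowed_tiers = {("web", "app"), ("app", "db")}

def build_rules(workloads, allowed_tiers):
    """Generate one explicit allow rule per permitted workload pair, then default deny."""
    rules = []
    for src in workloads:
        for dst in workloads:
            if (src["tier"], dst["tier"]) in allowed_tiers:
                rules.append({"src": src["ip"], "dst": dst["ip"], "action": "allow"})
    rules.append({"src": "any", "dst": "any", "action": "deny"})  # default deny
    return rules

for rule in build_rules(workloads, allowed_tiers):
    print(rule)
```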

Fourth: new features are often much easier to roll in. In the hardware world, when a new feature came out, you needed to refresh to the next generation, which, depending on your cycles, could take 3-5 years. In a software world, new features can be rolled in all the time; no need to wait for the next refresh cycle.

Finally: it can completely change the way we manage environments (in most cases, this is a good thing). We can now move to a world of policies. Instead of creating rules in the underlying environment that get enforced blindly on the operating environment above it, you can create policies that are application/workload specific (and in a not-so-distant future, those policies will also be able to follow their workloads across multiple environments/clouds/locations).
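
A tiny sketch of that idea (labels and locations are invented): the policy matches on a workload label rather than on an IP or a location, so it still applies after the workload moves.

```python
# Hedged sketch: a policy attached to the workload, not to the environment.
# Because it matches on labels, it follows the workload when it migrates.

policy = {"applies_to": {"app": "payroll"}, "allow_inbound_from": {"app": "hr-portal"}}

workload = {"name": "payroll-vm-01", "labels": {"app": "payroll"}, "location": "on-prem"}

def policy_applies(policy, workload):
    """A policy follows the workload: it matches on labels, not on location or IP."""
    return all(workload["labels"].get(k) == v for k, v in policy["applies_to"].items())

print(policy_applies(policy, workload))        # True on-prem
workload["location"] = "cloud-region-east"     # workload migrates...
print(policy_applies(policy, workload))        # ...and the policy still applies: True
```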

But, what about software vendor lock-in?

Right at this moment, it will kind of happen. The good news is that software changes faster than hardware, so new features you are missing or bugs you are facing can be solved within what you already own. This is where finding the right vendor is important: you want someone whose vision aligns with yours. The better news is that the industry and its standards are evolving faster than ever; for the same reason that early hypervisors were a source of lock-in initially and no longer are, the same thing will happen with Software Defined. Initially, your software solution will limit which hardware you can use, but gradually most hardware will work. Then your central control will be able to communicate with other control solutions to allow a more seamless experience (allowing rules/policies to carry across environments/clouds/locations).

Conclusion

Software Defined is great and will eventually become the norm. If you have a strong need for automation or a particularly complex set of policies, you may want it right now. If not, it is definitely something you want to keep on your radar as you refresh your environment and select vendors/technologies.

As always, all feedback is welcome! Follow me on LinkedIn, Loic Calvez. What else would you like to hear about?
