Blog

June 16th, 2017

Business owners stand to gain a lot by taking the time to understand emerging IT trends. In the case of containers, it’s an opportunity to reduce costs, increase hardware efficiency, and improve data security. One of the best ways to learn about containers is to address the misconceptions about them.

Containers bundle the bare minimum software and system resources a specific program needs to run. For example, if you want to give employees access to a single Linux-based server application, but everything else you run is in Windows, it would be a waste to build a new machine for just that program. Containers allow you to partition just the right amount of hardware power and software overhead to run that application alongside everything else on your Windows server.
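
To make that idea concrete, here is a minimal sketch using the Docker SDK for Python (docker-py). It assumes Docker is already installed and running on the server; the image name and resource caps are illustrative placeholders, not recommendations.

```python
# Minimal sketch: run one application in its own container with a hard cap
# on CPU and memory. Requires Docker and docker-py (pip install docker).
import docker

client = docker.from_env()  # connect to the local Docker daemon

output = client.containers.run(
    image="python:3.11-slim",            # the application's software dependencies
    command=["python", "-c", "print('hello from a container')"],
    cpu_period=100000, cpu_quota=50000,  # roughly half of one CPU core
    mem_limit="256m",                    # hard memory ceiling
    remove=True,                         # delete the container once it exits
)
print(output.decode())
```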

Misconception #1: There is only one container vendor

Traditional virtualization technology -- which creates entire virtual computers rather than single-application containers -- has had two decades for vendors to enter the market and improve their offerings. Containers, however, didn’t break into the mainstream until a few years ago.

Fortunately, there are still more than enough container vendors. Docker dominates the industry and headlines, but there are at least a dozen other programs to choose from.

Misconception #2: Containers require virtualization

In the early days, containers could only be created and managed in the Linux operating system. This meant complicated and sometimes unreliable improvisation was required to benefit from container technology on Windows and Mac servers.

First, you would need to virtualize a full-fledged Linux install on your Windows or Mac server, and then install container management inside of Linux. Nowadays, container management software can run on Windows and MacOS without the confusing multi-layer systems.

Misconception #3: You can’t create and manage containers in bulk

Separate programs, known as orchestrators, allow you to scale up your use of containers. If you need to partition more hardware power so that more users can access a container, or if you need to create several identical containers, orchestrators make that possible.
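
As a hedged illustration, the snippet below uses the official Kubernetes Python client to ask an orchestrator for ten identical copies of a containerized workload. It assumes you already have a cluster, a kubeconfig file, and a Deployment named “web-app”, all of which are hypothetical here.

```python
# Sketch only: scaling a containerized workload through an orchestrator
# (Kubernetes) rather than starting containers by hand.
# Requires the kubernetes package (pip install kubernetes) and cluster access.
from kubernetes import client, config

config.load_kube_config()   # read credentials from the local kubeconfig file
apps = client.AppsV1Api()

# Declare the desired state -- ten identical containers -- and let the
# orchestrator create, schedule, and replace them as needed.
apps.patch_namespaced_deployment_scale(
    name="web-app",                      # hypothetical deployment name
    namespace="default",
    body={"spec": {"replicas": 10}},
)
```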

Misconception #4: Containers are faster than virtual machines

Obviously, virtualizing an entire operating system and the hardware necessary to run it involves more management overhead and processing power. A lot of people take this to mean containers are faster than virtual machines. In reality, containers are just more efficient.

Accessing a container is as simple as opening it and using the application. A virtualized machine, however, needs to be booted up, a user needs to log in to the operating system, and then you can rummage through folders to open an application. Most of the time containers are faster, but there are instances when that's not true.

Virtualization and containers are complicated technologies. For now, just remember that 1) Virtualization and containers are separate technologies, each with pros and cons; and 2) you have plenty of software options to manage containers (sometimes in bulk). For anything more specific than that, give us a call!

Published with permission from TechAdvisory.org. Source.

May 31st, 2017

Mobile device management is a full-fledged subset of IT security. Employees store and view sensitive data on their smartphones, which exposes your organization to a significant amount of risk. Although there are plenty of great solutions for managing this, the National Security Agency (NSA) believes mobile virtualization is the next big thing.

US government approved

The NSA maintains a program named Commercial Solutions for Classified (CSfC) that tests and approves hardware to assist government entities that are optimizing security. For example, if a public sector network administrator is deciding which mobile devices to purchase for office staff, CSfC lists which devices are approved for various government roles.

Offices in the intelligence community usually require virtualization hardware and software as a minimum for laptops and tablets, but until now no smartphone with the technology had passed the tests. That changed when a recently released model of the HTC A9 -- which includes mobile virtualization functionality -- got the green light.

What is mobile virtualization?

Virtualization is an immensely complicated field of technology, but when it comes to mobile devices the process is a little simpler. Like any mobile device management plan, the goal of mobile virtualization is to separate personal data from business data entirely. Current solutions are forced to organize and secure data that is stored on a single drive.

Essentially, current phones have one operating system, which contains a number of folders that can be locked down for business and personal access. But the underlying software running the whole phone still connects everything. So if an employee downloaded malware hidden in a mobile game, it could spread through the entire system, regardless of how secure individual folders are.

With mobile virtualization however, administrators can separate the operating system from the hardware. This would allow you to partition a phone’s storage into two drives for two operating system installations. Within the business partition, you could forbid users from downloading any apps other than those approved by your business. If employees install something malicious on their personal partition, it has no way of affecting your business data because the two virtualized operating systems have no way of interacting with each other.

Although it’s still in its infancy, the prospect of technology that can essentially combine the software from two devices onto a single smartphone’s hardware is very exciting for the security community. To start preparing your organization for the switch to mobile virtualization, call us today.

Published with permission from TechAdvisory.org. Source.

May 17th, 2017

Virtualization is becoming so popular that businesses are finally seeing the technology packaged with the solutions they already know and love. The Windows Server operating system is a great example of a solution that grants you access to features like containers and software-defined storage.

A brief history of Windows Server

The Windows Server operating system has been around for decades. As an advanced option for onsite servers, this operating system grants access to high-level access management settings, DNS customizations, and network configuration management. In fact, it’s such a complicated solution that Microsoft offers certification courses for each version of the operating system.

The most recent iteration of this operating system is Windows Server 2016 (WS16). Released on October 12th, 2016, Microsoft’s latest server software included countless improvements to its networking and user management features. Where it really shines, however, is in how it handles virtualized computing.

Virtualization in Windows Server 2016

As with just about anything in the virtualization world, containers dominate the WS16 conversation. Containers package the bare minimum an application needs to run -- libraries, settings, and a slice of the server’s hardware resources -- and deliver that package across a network to computers that lack those components. For example, if you want to run an application that requires a huge amount of processing power on a bare-bones workstation, you can create a container with the necessary components on your server and let the workstation access it remotely.

WS16 users have access to two types of container deployments: Hyper-V containers and Windows Server containers. To the average business owner, the difference between these two options is minute, but what is important is Microsoft’s commitment to compatibility. If virtualization is important to you, choosing WS16 is a great way to ensure that you’ll be ready for whatever develops among the disparate providers.
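
For the technically curious, here is a rough sketch of how that choice looks in practice, using the Docker SDK for Python on a WS16 host with Docker installed. The image name is an assumption for illustration, and the snippet only runs on Windows.

```python
# Rough sketch: launching the same Windows container image under both
# isolation modes WS16 supports. Assumes a Windows Server 2016 host with
# Docker and docker-py installed; the image name is illustrative.
import docker

client = docker.from_env()

for isolation in ("process", "hyperv"):   # Windows Server vs. Hyper-V containers
    logs = client.containers.run(
        image="microsoft/nanoserver",     # assumed WS16-era base image
        command=["cmd", "/c", "echo hello from a container"],
        isolation=isolation,
        remove=True,
    )
    print(isolation, logs.decode().strip())
```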

Another great virtualization feature in WS16 is software-defined storage (SDS). It’s a complicated solution, but it essentially allows you to create hard drive partitions outside of the confines of hardware limitations. You can create a single drive by pooling storage space from three different servers, or you can create several separate drives for virtualized workstations to access.

Obviously, managing a server is no easy task -- regardless of whether or not you implement a virtualized infrastructure. That complexity comes with some compatibility issues; if your business relies on old software, it may not have been updated to run with WS16. For everything from creating a transition plan to managing your virtualized framework, give us a call today.

Published with permission from TechAdvisory.org. Source.

April 28th, 2017

Malware is becoming more sophisticated every day, and we recommend several solutions for dealing with it. One of the most interesting of these is achievable via cutting-edge virtualization technology. Often referred to as sandboxing, this solution is a great way to quarantine and test suspicious applications before exposing them to your entire network.

What is sandboxing?

Sandboxing is one of the rare concepts in virtualization that the average person can usually grasp in just a couple short sentences. Essentially, sandboxing is the practice of tricking an application or program into thinking it is running on a regular computer, and observing how it performs. This is especially useful for testing whether unknown applications are hiding malware.

Obviously, it gets far more complicated once you delve into the details of how you implement a sandboxing technique, but the short answer is that it almost always involves virtualized computers. The program you want to test thinks it’s been opened on a full-fledged workstation or server and can act normally, but it’s actually inside a tightly controlled virtual space that forbids it from copying itself or deleting files outside of what is included in the sandbox.
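
Here is a bare-bones sketch of that idea using the Docker SDK for Python. The “suspicious program” is just a stand-in print statement, and a real malware sandbox layers far more instrumentation on top of this, but it shows how tightly a virtual space can be locked down.

```python
# Toy sandbox: run an untrusted command in a throwaway container with no
# network, a read-only filesystem, and tight resource limits.
# Requires Docker and docker-py (pip install docker).
import docker

client = docker.from_env()

logs = client.containers.run(
    image="python:3.11-slim",
    command=["python", "-c", "print('stand-in for a suspicious program')"],
    network_disabled=True,   # no way to phone home
    read_only=True,          # cannot modify its own filesystem
    mem_limit="256m",        # tight memory cap
    pids_limit=64,           # cannot fork-bomb the host
    remove=True,             # destroy the sandbox as soon as the test ends
)
print(logs.decode())
```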

An effective way to quarantine

Virtualization is no simple task, but the benefits of sandboxing definitely make the effort worth it. For example, virtualized workstations can essentially be created and destroyed with the flip of a switch. That means:
  1. You aren’t required to manage permanent resources to utilize a sandbox. Turn it on when you need it, and when you’re done the resources necessary to run it are reset and returned to your server’s available capacity.
  2. When malware is exposed inside a sandbox, removing it is as simple as destroying the virtual machine. Compare that to running a physical workstation dedicated solely to sandboxing. Formatting and reinstalling the machine would take several hours.
  3. Variables such as which operating system the sandbox runs, which permissions quarantined applications are granted, and how long tests should last can be set up and altered in very little time.
This strategy has been around for nearly two decades, and some cybersecurity experts have spent their entire careers working toward the perfect virtual sandbox.

Containers: the next step in this evolution

Recently, the virtualization industry has been almost totally consumed by the topic of “containers.” Instead of creating an entire virtual workstation to run a suspicious application in, you can use a container: a virtual space with exactly enough hardware and software resources to run whatever it was designed to run.

Think of the metaphor literally: Older sandboxes came in a uniform size, which was almost always significantly larger than whatever you were placing into them. Containers let you design the size and shape of the sandbox based on your exact specifications.

Quarantined virtual spaces fit nicely into the sandbox metaphor, but actually implementing them is impossible without trained help. Whether you’re looking for enhanced security protocols or increased efficiency with your hardware resources, our virtualization services can help. Call us today.

Published with permission from TechAdvisory.org. Source.

April 13th, 2017

Whether or not you understand virtualization, there’s a good chance you’ve never had a hands-on experience with a virtualized desktop. As one of the most basic applications of virtualization technology, network-based desktops are the perfect example of how businesses can benefit from any form of virtualization. Read on to test out an example desktop!

What is virtualization?

The simplest definition is this: It’s the act of creating a virtual (rather than physical) version of something, including hardware platforms, storage devices, and computer network resources. But that doesn’t do much for those outside of the IT industry.

We could paint a colorful analogy to try to better explain it, or we could let you paint with your very own virtualized demo. Follow these steps so you can see how virtualization works:

  1. Visit this website.
  2. Wait while your virtualized 1991 Macintosh boots up.
  3. Double-click the ‘Kid Pix’ desktop icon.
  4. Write “This is virtualization” on the blank canvas.
  5. Click (and hold) File, and select Save As.
  6. Click the Save button in the new window.
  7. Quit ‘Kid Pix’.
Voilà! Your picture was saved to that old-school Mac's virtual hard drive. That’s because everything -- from the operating system to the processor -- is running on a server located somewhere else on the internet. And it’s not just some remote desktop viewing trick: this ’90s-era Mac and its hardware have been created by software installed on a server that is concurrently processing a million other tasks.

It’s a fun demonstration, but modern-day virtualization can accomplish much more.

Divide up hardware resources

The dated nature of that machine actually helps us better illustrate the biggest benefit of virtualization. The software that lets us create virtual machines also allows us to define exactly how much hardware each workstation gets.

For example, this Mac has only 3.8 MB of hard drive space, but if your virtualization server has 10,000 GB of space, you can create 100 virtual desktops with 100 GB of storage space. It’s a bit of an oversimplification, but that’s essentially how it works with storage hardware, CPUs, RAM, and other hardware.
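
The same arithmetic applies to every resource, not just storage; a server can only host as many desktops as its scarcest resource allows. A tiny sketch with invented numbers:

```python
# Back-of-the-envelope math for carving one physical server into virtual
# desktops. Every figure below is invented for illustration.
server = {"storage_gb": 10_000, "ram_gb": 512, "cpu_cores": 64}
per_desktop = {"storage_gb": 100, "ram_gb": 8, "cpu_cores": 1}

# The binding constraint is whichever resource runs out first.
fits = min(server[k] // per_desktop[k] for k in per_desktop)
print(f"This server can host about {fits} virtual desktops")  # 64, limited by RAM and CPU
```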

Reduce on-site costs

The bulk of your workstation and server hardware is usually hosted off-site, which means lower utility bills, computer equipment requirements, and maintenance overhead. Instead of patching and upgrading each workstation’s software and hardware individually, virtualization allows you to apply changes to all your machines at once.

Disaster recovery

If your virtualization server is hosted off-site, what happens when natural disasters, power outages, theft, or vandalism strikes your office? Or, as a simpler example, where did you store your Kid Pix masterpiece? Certainly not on the machine you’re reading this blog from.

Virtualization allows you to keep mission-critical data stored safely away from the office so your team can get back to work as soon as your IT provider gets them access to the server again. And with a single point of management (i.e., your off-site server), that can take place in virtually no time at all.

Ending your dependence on individual machines and their hardware is just one of the many ways to utilize the power of virtualization. You can define network hardware and configurations with software, run applications on any operating system, and so much more. To find out which solution is best for your business, call us today!

Published with permission from TechAdvisory.org. Source.

March 28th, 2017

For companies trying to create employee desktops stored on a server and delivered over the internet, security and optimization often entail a mess of vendors, settings, and updates. Thankfully, two of the biggest names in the industry just announced a partnership that aims to simplify the whole process.

For those who don't know, Azure is Microsoft’s build-it-yourself cloud platform. With more than 600 services, Azure is all about giving network administrators access to Microsoft data centers to pick and choose how your cloud is structured.

Citrix is one of the largest virtualization software providers on the market. And its most famous product, XenDesktop, was one of the very first software solutions to allow multiple users to access Windows from a networked desktop with a different operating system already installed.

Now compatible with Windows 10

With the recent release of XenDesktop Essentials for Microsoft Azure, these two solutions are becoming one. Administrators can now build fully-stocked Windows 10 desktops stored in Azure, and employees can access them from any machine with Citrix’s lightweight client installed.

The whole setup costs only $12 per user, per month, and comes with a host of administration settings for managing and monitoring your virtualized desktops and how users access them.

A better way to work

It’s like Azure is a moving truck, XenDesktop is the box holding all your stuff in the back of the truck, and your company applications and settings are what’s inside the box. With the right configuration, the whole box can be delivered to employee desktops anywhere in the world.

As long as employees are accessing virtual desktops from verified devices running MacOS, iOS, Android, or even an older version of Windows, they can work as if they are sitting right in front of the Windows 10 install located within your company’s cloud.

Virtualization is a wonderful solution for cutting costs and increasing efficiencies. Unfortunately, even with two of the most user-friendly vendors in their respective industries, virtualizing Windows 10 desktops is still a monumental task. For 24/7 access to support and expert advice, call us today.

Published with permission from TechAdvisory.org. Source.

March 10th, 2017

We’ve talked about network virtualization on this blog before, but there was some recent confusion over whether or not that service is different from software-defined networking. It’s not a big difference, but no, they’re not the same. Let’s take a look at what sets them apart.

Software-defined networking (SDN)

Managing storage, infrastructures, and networks with high-level software is something IT technicians have been doing for a long time. It’s a subset of virtualization and it is one of the oldest strategies for optimizing and securing your IT hardware.

Despite its popularity, SDN does have one major drawback -- it needs hardware to do its job. SDN allows you to control network switches, routers, and other peripherals from a centralized software platform, but you can’t create virtual segments of your network without the hardware that would normally be required outside of an SDN environment.

Network virtualization

Evolving beyond SDN was inevitable. Whenever a technology can’t do it all, you can bet someone is working hard to fix that. Network virtualization uses advanced software solutions to allow administrators to manage physical hardware and to create virtual replicas of hardware that servers and workstations cannot tell apart from the real thing.

Network virtualization simplifies the field of network design. You can reduce spending on expensive hardware, reconfigure network segments on the fly, and connect physically separate networks as if they were in the same room.

A virtualized network may sound like an exciting technology that doesn’t have much use at small- or medium-sized businesses, but that’s exactly the beauty of hiring a managed services provider! We provide enterprise technology and advice as part of your monthly service fee. Call today to find out more.

Published with permission from TechAdvisory.org. Source.

February 21st, 2017

Implementing a virtualized data storage solution at your business is no small feat. It’s a complicated process that requires immense amounts of technical expertise. Unfortunately, getting it up and running is only half the battle. For the most efficient solution possible, watch out for the three most common management issues outlined in this post.

Poorly structured storage from the get-go

Within a virtualized data storage framework, information is grouped into tiers based on how quickly that information needs to be accessible when requested. The fastest drives on the market are still very expensive, and most networks will have to organize data into three different tiers to avoid breaking the bank.

For example, archived or redundant data probably doesn’t need to be on the fastest drive you have, but images on your eCommerce website should get the highest priority if you want customers to have a good experience.

Without a virtualization expert on hand, organizing this data could quickly go off the rails. Ask your IT service provider to see a diagram of where your various data types are stored and how those connect to the software-defined drive at the hub of your solution. If there are too many relays for your server to pass through, it’ll be a slower solution than the non-virtualized alternatives.
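
As a purely illustrative sketch (not any vendor’s real API), a tiering policy boils down to matching each type of data with the cheapest tier that is still fast enough for it:

```python
# Toy tiering policy. Tier names, speeds, and requirements are all invented.
TIERS = [                     # ordered cheapest/slowest first
    ("tier3_archive", 1_000), # typical response time in milliseconds
    ("tier2_sas", 20),
    ("tier1_ssd", 2),
]

def assign_tier(slowest_acceptable_ms: int) -> str:
    """Pick the cheapest tier that still meets the data's speed requirement."""
    for name, delivers_ms in TIERS:
        if delivers_ms <= slowest_acceptable_ms:
            return name
    return TIERS[-1][0]       # nothing cheaper is fast enough; use the top tier

print(assign_tier(2_000))     # archived records  -> tier3_archive
print(assign_tier(10))        # e-commerce images -> tier1_ssd
```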

Inadequately maintained virtualized storage

How long will your intended design last? Companies evolve and expand in short periods of time, and your infrastructure may look completely different months later. Virtualized data storage requires frequent revisions and updates to perform optimally.

Whoever is in charge of your virtualization solution needs to have intimate knowledge of how data is being accessed. If you’re using virtual machines to access your database and move things around, they need to be precisely arranged to make sure you don’t have 10 workstations trying to access information from the same gateway while five other lanes sit unoccupied.

Incorrect application placement

In addition to watching how your data is accessed as the system shifts and grows, administrators also need to keep a close eye on the non-human components with access to the system. Virtualized applications that access your database may suffer from connectivity problems, but how would you know?

The application won’t alert you, and employees can’t be expected to report every time the network seems slow. Your virtualization expert needs to understand what those applications need to function and how to monitor them closely as time goes on.

Deploying any type of virtualized IT within your business network is a commendable feat. However, the work doesn’t stop there. Without the fine-tuning of an experienced professional, you risk paying for little more than a fancy name. For the best virtualization advice in town, contact us today.

Published with permission from TechAdvisory.org. Source.

February 3rd, 2017

Virtualization is great, but it is awfully confusing. And although the whole point of hiring a managed services provider is to hand off these issues to the experts, we believe a cursory understanding is helpful for everything, including bragging to competitors and making executive decisions. The latest trend in virtualization is hyperconvergence, and it’s all about consolidating your computing, storage, and networking into a single, software-managed package to make your infrastructure simpler and more efficient than ever. Let’s take a look at some of its benefits.

Using a hyperconvergence model to structure your network is very representative of the current trends in small- and medium-sized business technology. It’s about making enterprise-level solutions more accessible to those looking for a smaller scale. So although a lot of these benefits sound like the same points we argue for other technologies, let’s take a look at how they are unique to hyperconvergence.

Software-centric computing

It may not sound huge at first, but by packing everything you need into a single box, and wrapping that box in flexible, adaptable management software, you empower your hardware infrastructure to receive more regular patches and updates. This makes it much easier to add more hardware later, or restructure what you’re currently using.

Unified administration

Hyperconvergence consolidates a number of separate functions and services into one piece of technology. Whoever is managing your virtualization services can tweak storage, cloud, backup, and database settings and workloads from one place.

Streamlined upgrading

Different hyperconvergence “boxes” come in different sizes and capabilities. So all it takes to scale up is buying another unit based on your forecasted needs. If you’re in a place where all you need is a little extra, purchase a smaller upgrade. But when you’re expecting rapid growth, a bigger box will ensure your IT can expand with your business.

Stronger data protections

Complexity is the Achilles’ heel of most networked IT. When a small group of people is trying to stay on top of a mounting pile of account management settings, malware definitions, and data storage settings, it’s hard to keep constantly probing cyber-attackers from finding a security hole. But with a hyperconverged infrastructure, your virtual machines aren’t built by bridging a series of third-party services together -- it’s all one service.

Keep in mind that while hyperconvergence is simpler than most virtualization solutions, it’s not so simple that it can be managed by the in-house IT departments at most small- and medium-sized businesses. The benefit of a more unified virtualization solution when you already have a managed services provider is the speed at which your growth and evolution can be managed.

The better your technology, the faster we can make changes. And the faster we can accommodate your needs, the less downtime you experience. Call us today to find out more about a hyperconverged system.

Published with permission from TechAdvisory.org. Source.

January 20th, 2017

Before some of our clients have even had a chance to wrap their heads around what virtualization is and how it works, hackers have already started attacking the new and exciting technology. By updating a virus from several years ago for virtualized environments, hackers hope to totally wipe the data off your hard drives. If you utilize any sort of virtualization services, it’s imperative that you know how to steer clear of this threat.

What is it?

Back in 2012, a brand new virus called “Shamoon” was unleashed onto computers attached to the networks of oil and gas companies. Like something out of a Hollywood film, Shamoon locked down computers and displayed a burning American flag on the screen while totally erasing anything stored on the local hard disk. The cybersecurity industry quickly got the virus under control, but not before it destroyed data on nearly 30,000 machines.

For years, Shamoon remained completely inactive -- until a few months ago. As virtualization grew in popularity, virtual desktop infrastructure became a useful safeguard against disk-wiping attacks like Shamoon, because wiped machines could be restored from snapshots. But a recent announcement from Palo Alto Networks revealed that someone refurbished Shamoon to include login credentials that let it reach and destroy that infrastructure as well. With those safeguards overcome, the virus is free to cause the same damage it was designed to do four years ago.

Who is at risk?

As of the Palo Alto Networks announcement, only networks using Huawei’s virtual desktop infrastructure management software are exposed. If your business uses one of those services, get in touch with your IT provider as soon as possible to address how you will protect yourself from Shamoon.

On a broader scale, this attack shows how virtualization's popularity makes it a target. Cyber attackers rarely write malware that goes after unpopular or underutilized technology; the effort just isn’t worth the payoff.

Headlines decrying the danger of Shamoon will be a siren call to hackers all over the globe to get in on the ground floor of this profitable trend. It happened for ransomware last year, and virtual machine viruses could very well turn out to be the top security threat of 2017.

How can I protect my data?

There are several things you need to do to ensure the safety of your virtual desktops. Firstly, update your passwords frequently and make sure they’re sufficiently complex. Shamoon’s most recent attempt to infect workstations was made possible by default login credentials that had not been updated.

Secondly, install monitoring software to scan and analyze network activity for unusual behavior. Even if legitimate credentials are used across the board, accessing uncommon parts of the network at odd hours will sound an alarm and give administrators precious time to take a closer look at exactly what is happening.
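
A toy version of that “odd hours” check might look like the snippet below. The log format and account names are invented; real monitoring tools parse their own logs and do far more than this.

```python
# Flag access events that happen outside business hours.
# The event data here is hypothetical and hard-coded for illustration.
from datetime import datetime

BUSINESS_HOURS = range(7, 19)   # 07:00-18:59 local time

events = [                      # (timestamp, user, resource) from an auth log
    ("2017-01-18 14:05", "m.garcia", "finance-share"),
    ("2017-01-19 03:27", "m.garcia", "vm-snapshot-store"),
]

for stamp, user, resource in events:
    hour = datetime.strptime(stamp, "%Y-%m-%d %H:%M").hour
    if hour not in BUSINESS_HOURS:
        print(f"ALERT: {user} accessed {resource} at {stamp}, outside business hours")
```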

Ultimately, businesses need virtualization experts on hand to protect and preserve desktop infrastructures. Thankfully, you have already found all the help you need. With our vast experience in all forms of virtualized computing, a quick phone call is the only thing between you and getting started. Call today!

Published with permission from TechAdvisory.org. Source.