VDI Challenges and How to Solve Them
In the first blog in this series, we talked about the main advantages of VDI and Desktop-as-a-Service (DaaS), including centralized management, improved security, and the ability to support bring-your-own-device (BYOD). This time, I want to focus on common problems that arise with VDI. We’ll look at three big VDI challenges—poor user experience, solution complexity, and high costs.
Challenge 1: Poor User Experience
There’s nothing that torpedoes a VDI project faster than failure to deliver a great user experience. There are a couple of facets to this: addressing user requirements and delivering optimal performance.
Your VDI project won’t succeed if you don’t understand your end-users’ requirements. Get your user communities involved up front to make sure you address their needs. Good communication and careful expectation management are critical. Delivering blazing performance won’t make much difference if your VDI solution fails to deliver important functionality, and yet roughly a third of VDI projects fall short in exactly this way.
Another important reality is that almost half of VDI projects have performance issues. While performance needs may seem clear cut, once again there’s no replacement for understanding the user perspective. Responsive desktops and applications are important, but may not be the only thing that matters. For example, in healthcare settings where clinicians move from location to location, login time is a performance metric with high visibility and impact. And don’t overlook the value of predictability. If your VDI service delivers great performance most of the time, but with occasional unexpected slowdowns, user satisfaction will suffer.
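The point about predictability is easy to quantify. The sketch below uses synthetic login times (the numbers are invented for illustration) to show why an average hides exactly the kind of occasional slowdown that erodes user satisfaction, and why a percentile metric such as p95 is a better yardstick:

```python
import statistics

# Hypothetical login times in seconds for two VDI pools with the same average.
steady = [8, 9, 8, 10, 9, 8, 9, 10, 9, 8]
spiky = [5, 5, 6, 5, 5, 6, 5, 5, 5, 41]  # one painful outlier login

def p95(samples):
    """95th-percentile latency: the login time users actually remember."""
    ordered = sorted(samples)
    index = round(0.95 * (len(ordered) - 1))
    return ordered[index]

for name, samples in [("steady", steady), ("spiky", spiky)]:
    print(name, "mean:", statistics.mean(samples), "p95:", p95(samples))
# Both pools average 8.8 s, but the spiky pool's p95 is 41 s.
```

Tracking a high percentile per user community, rather than a fleet-wide average, surfaces the unpredictable slowdowns before the trouble tickets do.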
The administrative experience also plays a role in your company’s overall satisfaction with a VDI solution. If the administration is complex, your team is likely to make mistakes that affect users and may delay or avoid performing necessary tasks like patches and upgrades. Ideally, you don’t want VDI admins to be completely dependent on server, storage, and networking teams for infrastructure-related tasks.
Challenge 2: Solution Complexity
There are a number of factors that contribute to the complexity of a VDI project. Satisfying stringent feature and performance requirements plays a role, and so does addressing availability and security needs. Ensuring that a large VDI environment can fail over quickly can consume a lot of time and effort for planning, testing, and monitoring.
VDI projects often take months or even years to go from proof of concept to full production. Architecting a solution that will scale as you add seats adds complexity, and there are significant risks associated with getting the design wrong. A VDI environment that’s under-spec’d will create big headaches later; a design that’s too large increases upfront costs.
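To make the sizing risk concrete, here is a back-of-the-envelope capacity sketch. The per-seat figures are illustrative assumptions, not vendor guidance; the point is that the same design must hold up at both your pilot seat count and your target seat count:

```python
# Illustrative per-seat assumptions; replace with numbers from your own
# desktop assessment before using anything like this for real planning.
VCPU_PER_SEAT = 2
RAM_GB_PER_SEAT = 4
IOPS_PER_SEAT = 20   # steady state; boot and login storms run far higher
HEADROOM = 1.25      # 25% buffer for failover and growth

def size_cluster(seats: int) -> dict:
    """Estimate aggregate resources needed for a given number of seats."""
    return {
        "vcpus": int(seats * VCPU_PER_SEAT * HEADROOM),
        "ram_gb": int(seats * RAM_GB_PER_SEAT * HEADROOM),
        "iops": int(seats * IOPS_PER_SEAT * HEADROOM),
    }

print(size_cluster(500))   # proof-of-concept scale
print(size_cluster(2000))  # full-production scale the design must reach
```

Even this toy model shows how quickly requirements multiply: a 4x jump in seats is a 4x jump in every resource, which is why under-spec’d designs hurt later and oversized ones hurt up front.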
Building your VDI environment from scratch using traditional three-tier infrastructure with separate servers, networks, and storage also contributes to complexity. It can be challenging to balance bandwidth and capacity requirements across servers, storage systems, and networks to avoid unexpected bottlenecks, and you’ll need multiple tools to manage and monitor the various resource silos.
Challenge 3: High Costs
I’ve already touched on many of the elements that add to the cost of a VDI project. Stringent feature and performance requirements from end-users, high availability needs, and extended deployment, scaling, and upgrade cycles all play a role.
If you build your VDI environment using traditional infrastructure, over-provisioning the environment is almost a necessity. Storage systems must be over-provisioned initially to ensure that you’ll be able to support more seats later, adding significantly to your upfront costs. Even with over-provisioning, it can be difficult to predict when storage will become a bottleneck, leading to a flood of trouble tickets and unanticipated costs for scaling and re-architecting.
A traditional environment also adds management complexity that increases OpEx and slows down upgrades, troubleshooting, and other important tasks as requests pass from team to team.
Finally, if you’re going to run a hypervisor like VMware ESXi, the licensing costs for a large VDI installation add up quickly, becoming a significant percentage of your overall costs.
Addressing VDI Challenges with HCI
Since this is a Nutanix blog, it won’t surprise you that we believe that hyperconverged infrastructure (HCI) is a key part of the solution to these VDI challenges—but there are good reasons for that. VDI was the original “killer app” for HCI. If you ask a VDI expert, or approach any infrastructure vendor (including one that sells both HCI and traditional solutions), they are likely to steer you toward HCI for your VDI project. In the next blog in this series, we will explain some of the features of Nutanix HCI that make it particularly well suited for VDI.
It’s also worth asking the question whether Desktop-as-a-Service can solve these VDI challenges. The short answer to that question is “yes,” assuming you’ve done your homework to understand user requirements. We will dig into DaaS in more depth later in this series.
Why Should You Care About VDI and Desktop-as-a-Service?
It’s been a long-running joke in the IT industry that every year is going to be “the year of VDI,” the year when IT organizations finally recognize the full value of virtual desktops and applications and adoption accelerates.
But while industry watchers have been laughing, use of VDI—and its cousin Desktop-as-a-Service (DaaS)—in the real world has been rising steadily. According to Gartner, the global market for infrastructure to run VDI software in 2019 is $7.58B, and the VDI/DaaS software market is $5B. And spending is growing more than 11% annually—much faster than IT spending as a whole.
VDI and DaaS are two of the primary technologies in the end-user computing (EUC) stack. If your organization has been contemplating either adopting EUC technologies or expanding an existing deployment, you’re probably wrestling with some tough decisions.
This blog—the first in a series dedicated to VDI and DaaS—addresses why companies should consider adopting these technologies in the first place. Later blogs in the series will dig into the specifics of VDI and DaaS—including pros, cons, and important deployment considerations for each—to help you make more informed decisions.
So there’s no confusion, in this series when we talk about VDI and DaaS, we are referring to the full spectrum of technologies for delivering virtual desktops and/or virtual applications. This encompasses well-known solutions such as Citrix Virtual Apps and Desktops (formerly XenApp and XenDesktop), VMware Horizon, and Microsoft Remote Desktop Services.
Why Do Companies Adopt VDI and DaaS?
Successful EUC projects typically deliver benefits for end-users and IT teams—and for the company as a whole—such as:
- Simplifying application and desktop management
- Improving data security and availability
- Enabling BYOD or CYOD
- Simplifying and accelerating onboarding/offboarding
- Enhancing the end-user experience and increasing productivity
Companies implementing VDI or DaaS are often hoping to drive down CapEx and OpEx. However, if that’s your only goal, you may be disappointed. Here are three of the main reasons why organizations should consider VDI or DaaS.
Reason 1: Centralize Desktop and Application Management
Every organization has end-users who need access to apps and data to get work done and be productive. Supporting these end-users imposes real costs, complexity, and time demands on IT. Both VDI and DaaS can centralize management and reduce these impacts, making it easier to manage a large number of desktops and end-user applications and simplifying daily operations.
With physical desktops and laptops widely distributed across the company, tasks like installing new applications and managing, patching, and updating operating system software are extremely difficult. It’s almost impossible to keep everything current. With VDI and DaaS, these tasks are managed and executed centrally; processes can be automated to ensure your environment stays up to date. If a problem arises, it is much easier to roll back to a known-good configuration.
With DaaS, the platform to deliver applications and desktops is part of the service so much of the work is done by the provider, further reducing the administrative burden and lowering risk. In most services, your IT team remains responsible for managing specific user applications.
When a new user comes onboard, with traditional systems it often takes a long time to procure, configure, and deliver a suitable device with the necessary applications. With VDI and DaaS, new users can be onboarded much faster and new applications can be provisioned quickly, so users are immediately productive.
When a user changes roles or leaves the company, their access can be de-provisioned just as quickly. This is ideal for modern businesses, especially companies that have large seasonal swings in their workforces. It is also extremely valuable during mergers and acquisitions.
Reason 2: Enhance Security and Protect Intellectual Property
For many IT teams, device management is a far smaller concern than the security risks created by hard-to-control devices with stored company data. According to a recent Forbes article, “Nearly 41% of all data breach events from 2005 through 2015 were caused by lost devices.” VDI and DaaS enable end-users to access company data without the need to store anything locally on the device, eliminating the risks associated with a lost or stolen laptop.
- User applications no longer need to be installed or run locally on each device
- Company data remains in your data center or the cloud where it is much more secure
- If a device fails, the user can simply switch to a different one and pick up where they left off
Often, a VDI or DaaS solution is configured to be stateless; the operating system and user applications are always pristine on every login, while user data and user settings are stored and secured centrally.
Another advantage of both VDI and DaaS is that you can control access using modern authentication and authorization technology—including multi-factor authentication and context awareness. Enforcing stricter rules makes it more difficult for outsiders to gain unauthorized access.
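The shape of a context-aware rule can be sketched in a few lines. This is a minimal illustration assuming a hypothetical broker that supplies device and network context at login; real VDI and DaaS products express rules like this in their own policy engines, not in application code:

```python
from dataclasses import dataclass

@dataclass
class LoginContext:
    managed_device: bool   # corporate-enrolled endpoint?
    on_corp_network: bool  # connecting from inside the corporate network?
    mfa_passed: bool       # second factor already verified?

def access_decision(ctx: LoginContext) -> str:
    """Toy context-aware policy: step up to MFA outside trusted contexts."""
    if ctx.managed_device and ctx.on_corp_network:
        return "allow"            # trusted device on a trusted network
    if ctx.mfa_passed:
        return "allow"            # untrusted context, but MFA succeeded
    return "require_mfa"          # step up before granting the desktop

print(access_decision(LoginContext(True, True, False)))    # allow
print(access_decision(LoginContext(False, False, False)))  # require_mfa
```

The value is that the rule lives centrally with the broker, so tightening it (for example, always requiring MFA off-network) takes effect for every desktop at once.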
Reason 3: Enable BYOD or CYOD
Gone are the days when organizations could hand out a standard black laptop or beige desktop and expect everyone to be happy. End users are increasingly mobile and want greater choice in the devices they use, making choose-your-own-device (CYOD) and bring-your-own-device (BYOD) initiatives increasingly common. However, giving users direct access from uncontrolled personal devices is a recipe for disaster. By pairing BYOD or CYOD with a VDI or DaaS solution, you can give your end-users the freedom to work from any location with almost any device—without creating unacceptable security risks.
Deliver a Better Experience for Everyone
VDI and DaaS solutions can make end-users more mobile and more productive, reduce the risk of data and intellectual property loss, and make IT operations more efficient. If you’ve been putting off a planned VDI or DaaS project, now’s the time to get started.
However, you’ve probably already heard more than one horror story about a VDI project going off the rails. Next time, we’ll look at the challenges that stand in the way of VDI success and discuss ways to address those challenges.
Sources: Gartner Infrastructure Software Forecast, 2019 Q1; Forecast Analysis: IT Spending, Report 356328; IT Key Metrics Data 2019, Report 375647.
Is DNA the future of data storage?
Humanity’s digital storage needs are constantly increasing. While the world produced only 2 zettabytes of data in 2010 (1 zettabyte = one trillion gigabytes), that figure grew more than 32-fold in 10 years, reaching 64.2 zettabytes in 2020. And it’s far from over: the total could reach 180 zettabytes in 2025, nearly a threefold increase in 5 years.
Storage media have naturally evolved to keep pace, from the magnetic tape of the 1930s to the SSD, becoming ever smaller and more efficient. Which brings us to their latest incarnation: DNA, a molecule billions of years old.
According to numerous studies, deoxyribonucleic acid, the molecule that contains our genetic information and that of all living organisms on Earth, could be an ideal medium for storing cold data: data that is rarely accessed but considered highly valuable, such as archives.
The principle behind DNA storage is simple: binary digital data (0s and 1s) are converted into sequences of nucleotides (the four DNA bases: A, C, G, and T). The DNA is then synthesized by dedicated machines and stored in an aqueous solution.
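The conversion step can be illustrated with a toy codec that maps each pair of bits to one base. Real DNA codecs add error correction and avoid problematic sequences such as long runs of the same base; this sketch shows only the basic 2-bits-per-nucleotide mapping described above:

```python
# Toy 2-bits-per-nucleotide codec; the bit-to-base assignment is arbitrary.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a DNA strand, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand produced by encode()."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)                   # CAGACGGC: 4 bases per byte
assert decode(strand) == b"Hi"  # the round trip is lossless
```

At two bits per base, every byte costs four nucleotides, which is where DNA’s extraordinary density figures come from.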
The benefits of DNA over current storage methods are quite compelling:
- Solidity and durability: DNA can withstand extreme environmental conditions, whereas current physical media are far more fragile. Stored in the proper environment, DNA can still be read back even after millions of years.
- Energy efficiency: Data centers today consume 2% of global electricity, while DNA, once synthesized, requires no power to preserve its contents.
- Compactness: This is one of the most fascinating properties of DNA. Data centers occupy an ever-growing footprint of 167 km² worldwide. By contrast, DNA densifies information exceptionally well: the nucleus of a cell in our body measures less than 10 micrometers, yet the DNA it contains would stretch to almost 2 meters if unwound. Storage has already come a long way (a single SD card can hold more than 100 DVDs’ worth of data, and a USB stick two years of music), but with DNA you could fit all the world’s data into a space the size of a shoebox.
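The shoebox claim can be sanity-checked with rough arithmetic. The assumptions here are 2 bits per nucleotide and an average nucleotide mass of about 330 g/mol, with no allowance for error-correction redundancy or physical packaging (real systems need plenty of both), so treat the result as an order-of-magnitude estimate:

```python
# Order-of-magnitude estimate of DNA's raw storage density.
AVOGADRO = 6.022e23           # molecules per mole
NT_MOLAR_MASS_G = 330.0       # assumed grams per mole of nucleotides

nucleotides_per_gram = AVOGADRO / NT_MOLAR_MASS_G   # ~1.8e21
bits_per_gram = 2 * nucleotides_per_gram            # 2 bits per base
bytes_per_gram = bits_per_gram / 8                  # roughly 4.6e20 bytes

world_data_bytes = 180e21     # the 180-zettabyte forecast for 2025
grams_needed = world_data_bytes / bytes_per_gram
print(f"{bytes_per_gram:.2e} bytes per gram; ~{grams_needed:.0f} g for 180 ZB")
```

A few hundred grams of DNA for the entire 2025 data forecast is comfortably shoebox-sized, even after leaving generous margin for redundancy.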
The idea of using DNA as a storage medium is not entirely new. Richard Feynman (Nobel Prize winner in Physics) had already formulated it in 1959, but it was not until 2012 that the first technical tests were carried out by teams at Harvard. Today, initiatives to make this technology feasible are flourishing: many organizations, from start-ups to large companies to university research groups, are working on it, and major technological leaps have widened the field of possibilities.
But many obstacles remain before this storage method can be industrialized. Production costs and processing times are still far too high for industrial use. We will likely have to wait until around 2030 to see this promising technology’s impact on our lives…