Applications live ‘on’ the cloud, on a server or on a smaller computer; it’s a fact of life. What that means in real terms is that software applications are developed from base code-level architectural engineering foundations, then tested, then taken through various other integration and provisioning steps… and then finally deployed to a state of live production, where they run on one platform and infrastructure foundation or another.
That platform and infrastructure foundation can of course be a single machine. Or (more usually now) it can be an on-premises deployment inside an organization’s own datacenter (a server room), a public cloud deployment hosted in a provider’s datacenter, or some hybrid combination of the two.
Despite the rise of low-code/no-code software application development and management shortcuts and accelerators, not many parts of this whole process are point-and-click simple. Working to provide more of what we can call the ‘application packaging’ end of the equation to get apps running effectively in modern computing environments is Liquidware. The company specializes in the monitoring and management of software for physical, virtual and cloud desktop deployments – it refers to itself as a digital workspace management specialist for Windows desktops.
Liquidware’s technology supports many desktop platforms, including Citrix Virtual Apps and Desktops, VMware Horizon and of course Microsoft Windows PCs. Liquidware CTO Jason Mattox explains the science behind application packaging by saying that enterprise organizations have many applications to support a variety of use cases for users.
While there are many operating systems, most businesses have standardized on delivering applications for users through Microsoft Windows, as it continues to be the dominant operating system for PCs and laptops.
Administrators challenged with delivering hundreds of Windows applications to their users have long sought a way to simplify application delivery, i.e. the act of actually getting software applications onto the machines they are supposed to reside upon and ensuring they work.
Some of the modern application delivery techniques separate application management from the base operating system and core apps ‘image’ (i.e. the solid chunk of software that makes up the base operating system and its select core applications).
“This [above separation] approach greatly speeds application deployment while minimizing updates of a base image, which usually contains the operating system itself and select core applications such as Microsoft Office. Other applications are usually assigned on a departmental or individual user need because they are specific for each user’s job,” said Mattox.
This is where technologies such as application virtualization or ‘layering’ shine through because they answer the need for assigning applications per department or user, while keeping base images to a minimum.
Application virtualization and application layering are two approaches to modern application delivery. Application virtualization often isolates applications to remediate aging apps while application layering offers a high degree of compatibility for apps since apps interact and feel native to the user, the operating system and other apps. Regardless of the technology, what these solutions have in common is the need to ‘package’ applications.
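As a way to picture application layering, here is a minimal, purely illustrative sketch (none of these class names come from Liquidware or any vendor’s actual API). It models a desktop as a stack of layers, with the base image at the bottom and per-user app layers attached on top; a file lookup falls through the stack from the topmost layer down, which is why layered apps can feel native to the operating system without the base image ever being modified.

```python
# Hypothetical model of application layering, for illustration only.
# A base image provides OS files; each app layer contributes its own
# files and registry keys without touching the base.

class Layer:
    def __init__(self, name, files, registry):
        self.name = name
        self.files = files          # path -> file contents
        self.registry = registry    # registry key -> value

class LayeredDesktop:
    def __init__(self, base):
        self.layers = [base]        # bottom-most layer is the base image

    def attach(self, layer):
        # Per-department or per-user app assignment: just add a layer.
        self.layers.append(layer)

    def read_file(self, path):
        # The topmost layer providing the path "wins", like a union mount,
        # so the app appears natively installed to the user and the OS.
        for layer in reversed(self.layers):
            if path in layer.files:
                return layer.files[path]
        raise FileNotFoundError(path)

base = Layer("windows-base",
             {r"C:\Windows\notepad.exe": "<os binary>"}, {})
office = Layer("office-layer",
               {r"C:\Program Files\Office\word.exe": "<app binary>"},
               {r"HKLM\Software\Office": "16.0"})

desktop = LayeredDesktop(base)
desktop.attach(office)
print(desktop.read_file(r"C:\Program Files\Office\word.exe"))
```

The design point the sketch tries to capture is that attaching or detaching a layer changes what the user sees without requiring any update to the base image itself.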
“When an application is packaged into a container for virtualization or layering, it is ready for testing and use as a production application by users. Since an application has to first undergo this process before the benefits of deployment and use can be realized, it’s in the best interest of application virtualization and layering vendors to make this process as straightforward as possible – and we have an option now to even automate the entire process,” explained Liquidware’s Mattox.
When an application is packaged, it is essentially containerized. While some solutions are more streamlined than others, the application is essentially ‘captured’ to a storage location. This may be a virtual hard disk locally or on a network or cloud location. Application packaging commonly uses ‘capture PCs’ or servers that are set aside for the specific purpose of application packaging. These are often virtual PCs or servers so that they can be reset through a backup or snapshot image with the goal of keeping the machine clean from multiple install/uninstall processes.
This is important because software installation often leaves files behind which may affect future application installations, even from unrelated vendors.
“An application to capture application installations is usually required on the capture PCs or servers. This software program enables the user to package an application by capturing its specific file installation locations and registry modifications. It could be described as ‘recording’ the actions of the installer so that it will know the locations of the files and registry keys captured during a normal installation process. The files and registry keys are saved to the container which may be a virtual hard disk,” said Mattox.
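The ‘recording’ step Mattox describes can be sketched, under loose assumptions, as a before/after diff: snapshot the capture machine’s files and registry before the installer runs, snapshot again afterward, and package only what changed. Real capture tools watch the live system; here the machine state is simply modeled as dictionaries, and the resulting package dict stands in for the virtual-hard-disk container.

```python
# Hypothetical sketch of installer "recording" on a capture PC.
# State is modeled as plain dicts: path -> contents, key -> value.

def snapshot(files, registry):
    """Freeze the machine's file and registry state at a point in time."""
    return {"files": dict(files), "registry": dict(registry)}

def capture_diff(before, after):
    """Keep only the entries the installer added or modified."""
    package = {"files": {}, "registry": {}}
    for section in ("files", "registry"):
        for key, value in after[section].items():
            if before[section].get(key) != value:
                package[section][key] = value
    return package

# Clean capture machine before the install runs...
before = snapshot({r"C:\Windows\kernel32.dll": "v1"}, {})

# ...and the same machine after the installer has finished.
after = snapshot(
    {r"C:\Windows\kernel32.dll": "v1",
     r"C:\Program Files\App\app.exe": "binary"},
    {r"HKLM\Software\App\InstallDir": r"C:\Program Files\App"},
)

# Only the installer's changes land in the package, ready to be
# saved to a container such as a virtual hard disk.
package = capture_diff(before, after)
print(sorted(package["files"]))
print(sorted(package["registry"]))
```

This also illustrates why the capture machine is reset from a snapshot between runs: any file left behind by a previous install would pollute the `before` state and corrupt the next diff.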
He further notes that these files and registry locations are later used to make the application look and react as if it were installed on the PC or server where it is eventually deployed as a virtual or layered application.
“In the ever more dynamic world of cloud, containers and compute compartmentalization that typifies the construct of the modern IT landscape, application packaging forms an essential part of the wider discipline of application orchestration. As we now bring an increasing degree of automation intelligence to calm this process, we can streamline the way we deliver applications to support changing business requirements,” said Rob Tribe, EMEA VP for systems engineering at Nutanix.
Nutanix’s Tribe concurs with many of the sentiments here expressed by Liquidware’s Mattox; this subject also encompasses areas such as centralized role-based IT governance to ensure that the right people get the right apps with the right data at the right time in the right use cases.
Enterprise applications don’t just ‘arrive’ on users’ devices; there is a whole raft of scaling and management that goes into the process as well. It’s all part of the packaging and provisioning provenance that a good IT system needs to have its roots in.
Application packaging can be tedious, but some solutions have refined it to the point where the process is about as straightforward as installing an app a single time.
Looking at what happens next in this space: if an organization has hundreds or thousands of applications, some firms will now be looking to technologies that can automate the packaging of applications, so that they can be packaged unattended with the help of Artificial Intelligence (AI).
This approach saves time and money for organizations that have adopted application virtualization or layering technologies to streamline the delivery of their business applications.
What this entire process perhaps gives us is some idea of the work that goes on behind the scenes (generally beforehand, but afterward too, for ongoing application maintenance, patches, updates and upgrades) so that we can get the apps we want in our pockets and on our desktops every day.

I am a technology journalist with over two decades of press experience. Primarily I work as a news analysis writer dedicated to a software application development ‘beat’; but, in a fluid media world, I am also an analyst, technology evangelist and content consultant. As the previously narrow discipline of programming now extends across a wider transept of the enterprise IT landscape, my own editorial purview has also broadened. I have spent much of the last ten years also focusing on open source, data analytics and intelligence, cloud computing, mobile devices and data management. I have an extensive background in communications starting in print media, newspapers and also television. If anything, this gives me enough man-hours of cynical world-weary experience to separate the spin from the substance, even when the products are shiny and new.

