I have already covered the importance of insights for EUC environments in some of my blog posts. The TL;DR of those: without some kind of insight, you're screwed. As I find this a very important part of EUC and EUC projects, and see that insights are often lacking when I enter the ring, I would like to repeat myself: try to focus on the insights a bit more, pretty please.
Main message: to successfully move to and design an EUC solution, an assessment is key, as designing, building and running isn't possible without visibility.
The assessment phase consists of gathering information from the business: objectives, strategy, functional and non-functional business requirements, security requirements, issues and so on. And mostly getting questions from the business as well. This gathering is done through workshops with all kinds of business and user roles, through questionnaires, and by getting your hands on documentation about the strategy and objectives, the current state architecture and the operational procedures. In short: getting and creating a documentation kit.
The other fun part is getting insights from the current infrastructure: data about the devices, images, application usage, logon details, profiles, faults and so on.
Important when gathering this data is correlating user actions with these subjects. It is good to know that, when the strategy is to move to cloud-only workspaces, there will be several thousand steps between how a user currently uses his tool set to support the business process and that business objective. An intermediate step of introducing an any-device desktop and hosted application solution is likely to have a higher success rate. Or the business wants user environment management, while the current roaming profiles are bloated to 300GB. But I will try not to get ahead of myself with theories; first get some assessment data.
Mining data takes time
Okay, out with this one. Mining, or gathering, data takes time and therefore a chunk of project time and budget. Unfortunately, with a lot of organisations it is unclear what this assessment will bring, there are costs involved, and a permanent solution is not in place. Yes, sometimes point-in-time software and application reports can be generated from a centralized provisioning solution, but these often miss the correlation between that data, the systems and what the user is actually doing. And then there is the fact that Shadow IT is around.
Secondly, you need to know what to look for in the mined data to answer the business questions. But we can help with that, and the time and cost concern is mostly a planning issue, for example not being clear on the efforts up front.
The process: day 1 is installation. After a week a health check is done to see if data is flowing into the system. At day 14 initial reports and modeling can be started. At day 30+ a business cycle has been mined. This means enough data points have been captured, desktops that don't connect often have had their connection (and thus their agents reporting), and there is enough variation for good analytics. Month-start and month-closing procedures have been captured. What about half-year procedures? No, they're not in a 30-day assessment when that period doesn't include the specific procedure. Check with the business whether those are critical.
Assess the assessment
What will your information need be, and are there any specific objectives the business would like to see addressed? If you don't know what you are looking for, the amount of data will be overwhelming and it will be hard to get reports out. Secondly, focus on what an assessment tool can do for you and how. Grouping objects in reports in ways that don't exist in the current infrastructure or the organization structure will need some additional technical skills, or needs to be placed in the "can't solve the organization with one tool" category.
Secondly, check the architecture of the chosen tool and how it fits in the current infrastructure. You probably need to deploy a server, have a place where its data is stored, and identify and deploy some client components. Check whether the users are informed; if not, do it. Are there desktops that don't always connect to the network, and how are these captured? Agents connect to the Master once per day to have their data mined.
Thirdly, check whether the data needs to remain within the organization's boundaries, or whether it can be saved or exported to a secure container outside the organization. For analyzing and reporting it will benefit the timelines if you can work with the data offsite; it saves a lot of traveling time throughout the project.
Fourthly, what kind of assessment is needed? Do we need a desktop assessment, a server assessment, a physical-to-virtual assessment or something else? What kind of options do we have for gathering data: do we need agents, something in the network flow, and so on? This kind of defines the toolbox to use. Check whether vendors and/or the community are involved in the product; this can prove very valuable for getting the right data and interpreting the data in reports. Fortunately for me, the tool for this blog post, SysTrack, can be used for all kinds of assessments. But for this EUC toolbox I will focus on the desktop assessment part.
SysTrack via Cloud
VMware teamed up with Lakeside Software to provide a desktop assessment tool, free for 90 days, called the SysTrack Desktop Assessment. It will collect data for 60 days and keep that data in the cloud for an additional 30 days. After 90 days access to the data is gone. The free part is that you pay with your data. VMware does the vCloud Air hosting and adds the reports, Lakeside adds the software to the mix, and voilà, magic happens. The assessment can be found at: https://assessment.vmware.com/. Sign up with an account and you're good to go. If you work together with a partner, be sure to link your registration with that partner so they have access to your information.

When registration is finished your bits will be prepared, and the agent software will be linked to your assessment. Use your deployment method of choice to deploy agents to the client devices, physical or virtual, as long as it's a Windows OS. Agents need to connect to the public cloud service to upload their data to the SysTrack system. Don't like all your agents connecting to the cloud? You can use a proxy: your clients connect to the proxy and the proxy connects to the cloud service. Check the collection state right after deploying and again a week later. After that, data will show up in the different visualizers and overviews.
If you have greyed-out options, be patient; there is nothing wrong (well, not yet). These won't become active until a few days of data have been collected, to make sure representative information is in there before most of the Analyze, Investigate and Report options are shown.
Once a business cycle is in, you can use the reports for your design phase. The Horizon sizing tool is an XML export that you can use in the Digital Workspace Designer (formerly known as the Horizon Sizing Estimator); find it at https://code.vmware.com/group/dp/dwdesigner. Use the XML as a custom workload.
SysTrack On site
Okay, now for the on-site part. You've got a customer that doesn't like its data on somebody else's computer, needs more time, needs customizations to reports, dashboards or further drill-down options -> tick on-site deployment. It needs more preparation and planning between you and the customer. If cloud data isn't a problem, let your customer start the SDA to have some information before the on-site deployment is running; it usually takes calls and operational procedures before a system is ready to install.
Okay, so what do we need? First get a license from Lakeside or your partner for the number of desktops you want to manage. You will get the install bits, or the consultant doing the install will bring them.
Next, the SysTrack Master server. Virtual or physical: 2 vCPU and 8GB RAM (12GB when using SQL Express) to start with; this grows when you have more endpoints. Use the calculator (Requirements Generator) available on the Lakeside portal. Windows Server 2008 R2 SP1 at minimum, with the IIS web roles, .NET Framework (all versions), AppFabric and Silverlight (brr). If you did not set up the prerequisites, the installer will install them (but it will take time). That is, except .NET Framework 3.5, as this is a server feature for which you need an additional source files location. Add this feature to the system prior to installation. And while you are at it, install the rest.
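The prerequisite installation can be sketched in PowerShell. This is a hedged sketch, not the official procedure: it assumes Windows Server 2012 R2 or later (on 2008 R2 the cmdlet is `Add-WindowsFeature`) and that install media with the .NET 3.5 payload is mounted on D:.

```shell
# Install IIS with management tools and the .NET Framework features the
# SysTrack Master needs. .NET 3.5 (NET-Framework-Core) is not staged on
# disk by default, hence the explicit -Source pointing at install media.
Install-WindowsFeature Web-Server -IncludeManagementTools
Install-WindowsFeature NET-Framework-45-Features
Install-WindowsFeature NET-Framework-Core -Source D:\sources\sxs

# AppFabric and Silverlight are separate downloads, not Windows features,
# so those still have to be installed from their own packages.
```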
For a small environment, or one without non-persistent desktops, SQL Express (2014) can be included in the deployment. Otherwise, use an external database server with SQL Server Reporting Services (SSRS) set up. With Express, SSRS is set up for you, by the way.
You need a SQL user (or the local system) with DBO rights to the newly created SysTrack database, and a domain user with admin rights to Reporting Services and local admin rights on the Windows server. If you are not using an application provisioning mechanism or a desktop pool template, you can push or pull the agents from the SysTrack Master. For this you need an AD user with local admin rights on the desktops (to install the packages) plus File and Print Services, Remote Management and Remote Registry. If SCCM or an MSI installation in the template is used, you won't require local admin rights, Remote Registry and such.
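For the external database scenario, preparing the SQL user could look like the sketch below, run with sqlcmd after the SysTrack database exists. SQLSERVER01, CORP\svc_systrack and the database name SysTrack are placeholders for your own environment.

```shell
# Hedged sketch: create a Windows login and grant it db_owner on the
# SysTrack database. All names here are placeholders.
sqlcmd -S SQLSERVER01 -Q "CREATE LOGIN [CORP\svc_systrack] FROM WINDOWS;"
sqlcmd -S SQLSERVER01 -d SysTrack -Q "CREATE USER [CORP\svc_systrack] FOR LOGIN [CORP\svc_systrack]; ALTER ROLE db_owner ADD MEMBER [CORP\svc_systrack];"
```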
If there is a firewall between the clients (or agents, or children) and the Master server, be sure to open the port you used in the installation; by default that's 57632 TCP/UDP. And if there is something between the Master server and the Internet during registration, you will need to activate by phone. The Internet is only used for license activation, though.
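On a Windows firewall in between, opening that default port could look like this; a sketch assuming the port was kept at 57632 during installation and the NetSecurity PowerShell module is available.

```shell
# Allow inbound SysTrack agent traffic on the default port, both TCP and
# UDP. Adjust -LocalPort if you picked a different port during install.
New-NetFirewallRule -DisplayName "SysTrack Master (TCP)" -Direction Inbound -Protocol TCP -LocalPort 57632 -Action Allow
New-NetFirewallRule -DisplayName "SysTrack Master (UDP)" -Direction Inbound -Protocol UDP -LocalPort 57632 -Action Allow
```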
And get a thermos of coffee, it can take some time.
To visualise the SysTrack architecture we can use the diagram from the documentation (without the coffee that is).
Installation is done in four parts, first the SysTrack Master Server (with or without SQL), secondly the SysTrack Web Services, thirdly the SysTrack Administrative tools and when 1-3 are installed and SysTrack is configured, you can deploy the agents.
- SysTrack Master Server: the Master holds the application intelligence, storing the data from the child agents (or connecting to a data repository), plus configuration, roles and so on.
- SysTrack Web Services: the front-end visualizers and reporting (SSRS on the SQL server).
- SysTrack Administrative Tools: for example, the Deployment Tool used for configuration.
You gotta catch them all.
And click on Start install.
The installers are straightforward. Typical choices are the deployment type (full or passive), adding the Reporting Services user that was prepared (you can also do this later), and the database type: pre-existing (a new window will open for connection details) for an external database, or the Express version. Every component needs its restart. After restarting, the Master setup continues with the Web Services installer. After that restart, the Administrative Tools don't start automatically; just open the setup again, tick the third option and start the install.
Open the Deployment Tool. Connect to the Master server. Add your license details if this is a new installation. Create a new configuration (Configuration - Alarming and Configuration). Selecting Base Roles\Windows Desktop and VMP works as a good start for desktop assessments. Set your newly created configuration as the default, or change it manually in the tree when clients have been added. And push the play button when you are ready to start or receive clients; otherwise nothing will come in.
Now deploy the agent via MSI. The installation files are on the Master server in the installation location: SysTrack\InstallationPackages. You have the SysTrack agent (System Management Agent, 32-bit) and the prerequisite Visual C++ 2010 redistributables.
With MSI deployments you add the Master server and port to the installer options. If the Master allows clients to add themselves to the tree automatically, which is the default in version 8.2, they will show up.
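A silent rollout could then look something like the line below. Note this is a sketch with made-up names: the MSI property names MASTERNAME and MASTERPORT are placeholders (inspect your agent MSI, for example with Orca, for the real property names), and the master hostname is fictional.

```shell
# Hypothetical silent agent install. MASTERNAME/MASTERPORT are
# placeholder property names; systrack-master.corp.local is made up.
msiexec /i "SysTrack System Management Agent.msi" /qn MASTERNAME=systrack-master.corp.local MASTERPORT=57632
```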
"Normally" the clients won't notice the SysTrack agent being deployed; no restart is required for the agent installation.
In strict environments you can get a pop-up in Internet Explorer about the LSI Hook browser snap-in. You can suppress this by adding the CLSID of LSI Hook to the add-on list with a value of 1. Or you can edit your configuration and set Web browser plugins to false; this in turn means that no web data is collected from any browser by SysTrack.
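The add-on list tweak maps to Internet Explorer's "Add-on List" policy key in the registry. A sketch, where {CLSID-of-LSI-Hook} is a placeholder you must replace with the actual LSI Hook CLSID from your environment:

```shell
# Pre-approve the LSI Hook browser snap-in so IE doesn't prompt users.
# A value of 1 means "allowed". {CLSID-of-LSI-Hook} is a placeholder.
reg add "HKLM\Software\Microsoft\Windows\CurrentVersion\Policies\Ext\CLSID" /v "{CLSID-of-LSI-Hook}" /t REG_SZ /d "1" /f
```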
In any case, be sure to test the behaviour in your environment before rolling out to a large group of clients.
While the cloud version is deployed in a snap, the data is easily accessed, and the provided tools and reports fit the why of the assessment, there is a big but: a big chunk of organisations don't like this kind of data going into the cloud, even when the user names are anonymized. A pro of the on-site version is that it gives you more customization and reporting possibilities. The downside is that SysTrack on site is Windows-based, and the architecture will require Windows licenses next to the Lakeside license. All the visualizers and tools can be clicked and drilled down from the interface, but it feels a little like several tools have been duct-taped together. You can customize whatever you want: dashboards, reports and grouping. But you will need a pretty broad skill set, including building SQL queries, building SSRS reports and knowing the SysTrack products themselves. And what about the requirement for Microsoft Silverlight, a deprecated framework? Tsk tsk. Come on, this is 2017 calling…
But in the end it does not matter whether SysTrack from Lakeside Software or, for example, Stratusphere FIT from Liquidware Labs is used; that is your tool set. The most important part is to know what information is needed from what places, and to know thy ways to present it. Assess the assessment, plan some time and get mining for diamonds in your environment.
– Happy Mining!
Sources: vmware.com, lakesidesoftware.com