This article was written by Rich Gibbons of ITAM Review and Peter Rowe of Crayon Australia.

Data is the new oil

An oft-quoted phrase, and one that has caused considerable disagreement as to whether it’s even correct. Nonetheless, it has a particular resonance for the ITAM world – particularly if we listen to Clive Humby OBE.

Humby is a UK mathematician who, along with his wife Edwina Dunn, created the marketing company “dunnhumby” – perhaps most notable for creating the Tesco Clubcard, one of the first supermarket loyalty cards (launched in 1995). He is widely credited as the originator of the phrase “data is the new oil”, and he put some interesting context around it:

“Data is the new oil. It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc to create a valuable entity that drives profitable activity; so must data be broken down, analyzed for it to have value”

Oil must be refined. In its raw form it isn’t particularly useful to many people but once it is refined and changed – once something is made from the oil – it becomes useful to countless people in several different ways.

Data is the same – massive lists of users/licenses/hardware etc. aren’t really going to help anyone…but transforming that data to show where your users are by location, or the monetary value of licenses by vendor, or hardware that’s out of support in the next 6 months (for example) – that’s useful. That can help a business make decisions, reduce costs, and decide on strategy.
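To make that concrete, here’s a minimal sketch in Python (pandas) of the kind of transformation described above. The file names and columns (vendor, unit_cost, quantity, support_end_date) are hypothetical stand-ins for whatever your own exports contain:

```python
import pandas as pd

# Hypothetical exports from an ITAM repository - adjust names to your own data
licenses = pd.read_csv("licenses.csv")   # columns: vendor, product, unit_cost, quantity
hardware = pd.read_csv("hardware.csv")   # columns: asset_tag, model, support_end_date

# Monetary value of licenses by vendor
licenses["total_cost"] = licenses["unit_cost"] * licenses["quantity"]
spend_by_vendor = licenses.groupby("vendor")["total_cost"].sum().sort_values(ascending=False)

# Hardware falling out of support within the next 6 months
hardware["support_end_date"] = pd.to_datetime(hardware["support_end_date"])
cutoff = pd.Timestamp.today() + pd.DateOffset(months=6)
expiring = hardware[hardware["support_end_date"] <= cutoff]

print(spend_by_vendor)
print(expiring[["asset_tag", "model", "support_end_date"]])
```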

However, it’s almost certainly easier said than done.

Analytics

The Wikipedia entry on analytics describes it as “the discovery, interpretation, and communication of meaningful patterns in data. It also entails applying data patterns towards effective decision making. In other words, analytics can be understood as the connective tissue between data and effective decision making within an organisation”.

ITAM Analytics

Finding the data

Never has this been truer than in ITAM, where there is often a considerable disconnect between the raw data, the meaningful analysis of that data, and the subsequent decision making – decision making that is often carried out regardless of the data and analysis potentially available, because it is viewed as too complex to seek out and review.

As ITAM professionals, we all know that the data exists: it can be found in the various inventory tools that manage our desktops, datacentres and cloud environments; in the contracts and end user licence agreements issued by vendors; in procurement and purchasing systems; and in any number of spreadsheets, databases, bespoke solutions and repositories.

If analytics is “the connective tissue between data and effective decision making within an organisation” then a good step towards Analytics is the implementation of a centralised ITAM solution, tool or repository that brings all that data together. That is clearly the basis of any number of offerings from the ITAM / SAM solution vendors we are all familiar with.
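As an illustration of that consolidation step, the sketch below (pandas again, with hypothetical file and column names) merges three separate inventory exports into one de-duplicated view – the raw material a centralised repository provides:

```python
import pandas as pd

# Hypothetical exports from three discovery sources, already normalised
# to shared columns: hostname, product, version, last_seen
sources = ["desktop_inventory.csv", "datacentre_inventory.csv", "cloud_inventory.csv"]
frames = [pd.read_csv(f) for f in sources]

combined = pd.concat(frames, ignore_index=True)
combined["last_seen"] = pd.to_datetime(combined["last_seen"])

# Where the same install appears in more than one source, keep the
# most recently scanned record
combined = (combined.sort_values("last_seen")
                    .drop_duplicates(subset=["hostname", "product"], keep="last"))
```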

Accessing the data

However, we must be aware that these solutions can create as many problems for analytics as they potentially solve. Whilst gathering the data, ITAM professionals must continually ask themselves several questions: How accurate is this information? Are the required “checks and balances” in place to maintain its accuracy? How much work do I, or somebody else, have to do manually to make it available for analysis? How current is it? Do we have all the data and information that we need?
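Some of those questions can be asked of the data programmatically. A minimal sketch, assuming a consolidated export with hypothetical owner, location, last_seen and asset_tag columns:

```python
import pandas as pd

assets = pd.read_csv("central_repository.csv")
assets["last_seen"] = pd.to_datetime(assets["last_seen"])

checks = {
    # How current is it? Devices not seen by discovery for 30+ days
    "stale": assets["last_seen"] < pd.Timestamp.today() - pd.Timedelta(days=30),
    # Do we have all the data we need? Records missing an owner or location
    "incomplete": assets[["owner", "location"]].isna().any(axis=1),
    # How accurate is it? Duplicate asset tags point to an upstream problem
    "duplicate_tag": assets["asset_tag"].duplicated(keep=False),
}
for name, mask in checks.items():
    print(f"{name}: {mask.sum()} of {len(assets)} records")
```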

Gathering that data can also be hampered by security and data privacy requirements, by policies put in place to guarantee system availability to users, and by disaster recovery or similar processes.

Additionally, even if we assume that all ITAM and SAM solutions are equal in terms of the data they collect and present, that data is often buried deep within these tools due to the structure of their repositories and databases, and is sometimes accessible only via the limited use cases envisaged by their designers.

As a result, the data may only be visible against individual records, at a certain level, or branch of the view and this makes it difficult, or even impossible, to identify exceptions and see the broad landscapes of asset usage and compliance that are often required. This limits the view available to ITAM professionals of the “meaningful patterns” required for true Analytics.

In many cases, even though the data has been collected, the tool or solution is simply unable to calculate the metrics required for analysis, or to display the correct information in a meaningful way. This is often a direct result of the data structures imposed on the tool by its designers and programmers who, when it was originally created, had no idea of the changes that virtualisation or cloud would impose on licensing in the ITAM world.

Using the data

The methodology behind ITAM Analytics is to gather and organise the available data in a structured way and view it through a single pane of glass, rather than through multiple windows.

Whilst there are some benefits to very specific views related to individual publishers such as Oracle and IBM, overall the aim of any Analytics solution for ITAM should be a clear compliance position for each license. This includes a view of licenses and ‘options’ together; instances/installations connected to servers, virtual machines and hosts, and linked to licenses; a view of data currency that is as complete and accurate as possible; and financial costing based on organisational policies.
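A compliance position is, at its simplest, entitlements minus consumption per product. The sketch below deliberately counts one installation as one license – real metrics (per-core, per-user, per-host) are far more involved – and uses hypothetical input files:

```python
import pandas as pd

installs = pd.read_csv("installations.csv")       # columns: hostname, product
entitlements = pd.read_csv("entitlements.csv")    # columns: product, quantity

consumed = installs.groupby("product").size().rename("consumed")
entitled = entitlements.groupby("product")["quantity"].sum().rename("entitled")

position = pd.concat([entitled, consumed], axis=1).fillna(0)
position["net"] = position["entitled"] - position["consumed"]

# Negative net = compliance gap; large positive net = potential shelfware
print(position.sort_values("net"))
```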

It’s not easy

There is often no simple solution to the problem of developing and maintaining that “connective tissue between data and effective decision making within an organisation”. Many ITAM and SAM tools offer in-built analytics that are chronically under-utilised due to a lack of skills or the data issues highlighted above. Other tools may offer easy integration with third-party Analytics solutions via APIs and other forms of connectivity.
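Where a tool does expose an API, the pattern is usually the same: pull the records, flatten them, and hand them to your BI layer. The endpoint, authentication and payload shape below are purely illustrative – check your own vendor’s API documentation:

```python
import requests
import pandas as pd

# Purely illustrative endpoint and token - not a real vendor API
BASE_URL = "https://itam.example.com/api/v1"
resp = requests.get(
    f"{BASE_URL}/assets",
    headers={"Authorization": "Bearer YOUR_TOKEN"},
    timeout=30,
)
resp.raise_for_status()

# Flatten the JSON payload into a table a BI tool can consume directly
assets = pd.json_normalize(resp.json()["assets"])
assets.to_csv("assets_for_bi.csv", index=False)
```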

With Analytics, as with so much else in the world of ITAM, there is no out-of-the-box solution; it’s good to learn from the experiences of others and choose what works for your organisation. Don’t feel that you’re alone – most organisations will have some level of BI happening and, if you’re lucky, there’ll be a tool and a team of people to help you.

3rd party tools

There is a veritable smorgasbord of 3rd party Business Intelligence (BI) and Dashboarding tools out there and you’ll probably find at least a couple of them already in use within your organisation (which is a good opportunity for some vendor rationalisation but that’s a conversation for a different day!). I’ve briefly called out a few here to give you a place to start.

Microsoft Excel

Excel is probably the world’s most popular BI tool, and there’s a very high chance you’re already licensed for it. The version you’re running can make a difference to the available features, but there’s a lot of functionality in there – if you know how to use it!

Microsoft Power BI

Microsoft again, this time with Power BI, which is gaining a lot of traction in many organisations and now sits top right in Gartner’s 2019 BI Magic Quadrant. If you’ve got Office/Microsoft 365 E5, you’re licensed for it – otherwise it’s available standalone. You’ll almost certainly have a plethora of free Power BI installations in your estate, but beware – there are licensing implications there!

Tableau

Recently purchased by Salesforce for $15.7 billion, Tableau has been around since 2003 and is considered by many to be the #1 BI tool of choice; it occupies the #2 spot in the Gartner MQ. If your organisation has been working with BI for some time and/or you’re a large Salesforce house, you may well find it within your business already.

IBM Cognos

This is of particular interest for any Flexera customers, as FlexNet ships with IBM Cognos licenses – for use with FlexNet only. Founded in 1969 (that’s right!) and bought by IBM in 2008 for $4.9 billion, Cognos has continued to be developed and, although it currently sits in the bottom left of the Gartner Magic Quadrant, with the right know-how and skills it can still be a valuable tool for ITAM.

ITAM, Analytics, AI and ML

Finally, as an adjunct to ITAM and Analytics, it’s interesting to consider that in organisations that do take the step to treat their data as a strategic asset, new technologies like artificial intelligence (AI), and machine learning (ML) can assist in resolving many of the issues already highlighted in this article – removing silos, unlocking insights and automating informed decisions. Based on data, ML algorithms can be trained to recognise complex patterns and make intelligent decisions. The opportunity for ITAM is potentially huge as applied machine learning can contribute to both automation and streamlining of manual processes – of which there tends to be one or two (!) within most ITAM practices.
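As one hedged illustration of what “recognising complex patterns” might look like in practice, an unsupervised anomaly detector (here scikit-learn’s IsolationForest, run over hypothetical usage metrics) can surface licenses that are installed but rarely used:

```python
import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical per-product usage metrics
usage = pd.read_csv("usage_metrics.csv")  # columns: installs, launches_90d, annual_cost

features = usage[["installs", "launches_90d", "annual_cost"]]
model = IsolationForest(contamination=0.05, random_state=0)

# -1 marks outliers, e.g. expensive products with many installs but few launches
usage["flag"] = model.fit_predict(features)
print(usage[usage["flag"] == -1])
```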
