MEGABYTE Act Recommendations for CIOs (Part 1 of 2)

In light of the recent news that the MEGABYTE Act of 2016 was signed into law, we wanted to outline the law’s new requirements for agency CIOs, as well as provide our own recommendations for CIOs to achieve compliance by 2017.

In this post, we’ll take a look at the first three of six requirements.

1. Establish a comprehensive inventory

The inventory must cover at least 80 percent of software license spending and enterprise licenses in the agency. It is built by identifying and collecting information about software license agreements using automated discovery and inventory tools.

There is no reason why a CIO couldn’t achieve a discovery rate of 97-100 percent of all enterprise and infrastructure software installed on all endpoints and servers, including custom-developed software that is attached to the network.

Outperforming the 80 percent requirement and discovering all software is entirely achievable.

Understanding what is running in your environment is the first step not only to managing applications but also to ensuring that they meet security guidelines. Using either an agent installed on the endpoint or an agentless discovery tool that scans by IP range, it is possible to discover and build a definitive knowledge base of all known, approved applications. The data from these tools can then be used to maintain a software asset catalog or a whitelist of approved applications.
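To make the idea concrete, here is a minimal sketch of consolidating discovery-tool output into an asset catalog and checking it against a whitelist. The record format, hostnames, and application names are illustrative assumptions, not output from any particular discovery product.

```python
# Sketch: consolidate discovery records into a software asset catalog,
# then flag installs that are not on the approved whitelist.
from collections import defaultdict

def build_catalog(discovery_records):
    """Map each (app, version) pair to the set of endpoints it was found on."""
    catalog = defaultdict(set)
    for host, app, version in discovery_records:
        catalog[(app, version)].add(host)
    return catalog

def unapproved_installs(catalog, whitelist):
    """Return installs whose application name is not on the approved list."""
    return {key: hosts for key, hosts in catalog.items()
            if key[0] not in whitelist}

# Hypothetical records as (hostname, application, version):
records = [
    ("ws-001", "AcroRead", "11.0"),
    ("ws-002", "AcroRead", "11.0"),
    ("srv-01", "LegacyDB", "4.2"),
]
catalog = build_catalog(records)
print(unapproved_installs(catalog, whitelist={"AcroRead"}))
# {('LegacyDB', '4.2'): {'srv-01'}}
```

In practice the records would stream in from agent check-ins or IP-range scans, but the consolidation step looks much the same.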

Ideally, this discovery tool should use an agent so that it can monitor software usage. Without an agent, usage data is only a point-in-time snapshot of what is running on the endpoint when the scan executes, not daily usage monitoring. Usage monitoring on server software is not recommended because it may overload network performance.

Recommendation: CIOs should ensure that they have an agent-based discovery tool that can discover all device types (mobile, workstation, and server) and can also monitor software usage.

2. Regularly track and maintain software licenses

This will assist the agency with implementing decisions throughout the software license management lifecycle.

With constant monitoring of the software license lifecycle, decisions about when to adopt new versions or upgrade operating systems are much easier to make.

Knowing the TCO and the costs associated with lifecycle decisions will provide the visibility needed to assess and model a decision on factors that go beyond purchase price.

Tracking and maintaining the software lifecycle can also directly solve a problem that has plagued government agencies: legacy applications that cost a lot to maintain once vendor support is no longer available.

We’ve seen legacy operating systems that reached end-of-life a decade ago still in use even though there is no valid business need for them. IT transformation efforts are often bogged down because the costs to update outdated applications are prohibitive, even for the government.

Recommendation: CIOs should begin by getting a baseline report of all installed software and its associated lifecycle. If old versions of applications are discovered and newer releases are available, determine whether an upgrade should occur and whether it is covered under maintenance.
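A baseline lifecycle report can be as simple as joining the installed-software inventory against vendor end-of-life dates. The sketch below assumes hypothetical product names and dates; in a real deployment the lifecycle data would come from vendor or third-party catalog feeds.

```python
# Sketch: flag installed software that is past its vendor end-of-life date.
from datetime import date

# Hypothetical lifecycle data; normally sourced from a vendor catalog feed.
EOL_DATES = {
    ("WinServer", "2003"): date(2015, 7, 14),
    ("WinServer", "2019"): date(2029, 1, 9),
}

def past_eol(installed, today):
    """installed: list of (host, product, version). Returns unsupported installs."""
    flagged = []
    for host, product, version in installed:
        eol = EOL_DATES.get((product, version))
        if eol is not None and eol < today:
            flagged.append((host, product, version, eol.isoformat()))
    return flagged

installed = [
    ("srv-01", "WinServer", "2003"),
    ("srv-02", "WinServer", "2019"),
]
print(past_eol(installed, today=date(2016, 7, 29)))
# [('srv-01', 'WinServer', '2003', '2015-07-14')]
```

The flagged list is exactly the set of upgrade decisions the recommendation calls for: each entry is either upgraded, retired, or documented as an accepted risk.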

3. Analyze software usage and other data to make cost-effective decisions

Most organizations monitor software application usage on a quarterly basis to detect which applications a user has opened and closed.

If a user hasn’t launched an application within the past 90 days, there is a good chance that they don’t need the application, unless it is an application that is only utilized during specific projects or year-end timeframes.

If software is not being fully utilized, it can be reclaimed from the endpoint and redeployed to fulfill another user’s request for that same application.

If there isn’t demand for that application and there are a large number of unused licenses, the agency should consider renegotiating that contract and discontinue maintenance on those applications.
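The 90-day reclamation rule above can be sketched in a few lines. Field names, dates, and the seasonal-application exemption list are assumptions for illustration; real usage data would come from the agent-based monitoring discussed earlier.

```python
# Sketch: the 90-day reclamation rule, with an exemption for seasonal apps.
from datetime import date, timedelta

RECLAIM_AFTER = timedelta(days=90)
SEASONAL_APPS = {"TaxPrep"}  # hypothetical year-end-only app, exempt from the rule

def reclaim_candidates(usage, today):
    """usage: list of (user, app, last_launched date). Returns licenses to reclaim."""
    candidates = []
    for user, app, last_launched in usage:
        if app in SEASONAL_APPS:
            continue  # project/year-end apps are excluded, per the caveat above
        if today - last_launched > RECLAIM_AFTER:
            candidates.append((user, app))
    return candidates

usage = [
    ("alice", "VisioLike", date(2016, 1, 5)),   # idle ~7 months
    ("bob",   "VisioLike", date(2016, 7, 1)),   # active this month
    ("carol", "TaxPrep",   date(2016, 1, 2)),   # seasonal, exempt
]
print(reclaim_candidates(usage, today=date(2016, 7, 29)))
# [('alice', 'VisioLike')]
```

Reclaimed licenses go back into a pool to satisfy new requests; if the pool stays large, that surplus is the signal to renegotiate the contract.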

In addition, effective software usage monitoring could potentially uncover that an enterprise agreement is not a cost-effective licensing alternative because employees are not utilizing what is installed on their endpoints.
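Usage data turns that enterprise-agreement question into simple arithmetic: compare the agreement's cost against per-seat licensing for the users who actually launch the application. The prices and counts below are illustrative assumptions.

```python
# Sketch: is an enterprise agreement still cost-effective given actual usage?
def cheaper_model(active_users, ea_annual_cost, per_seat_cost):
    """Return which licensing model is cheaper for the measured active users."""
    per_seat_total = active_users * per_seat_cost
    return "per-seat" if per_seat_total < ea_annual_cost else "enterprise"

# Hypothetical: 300 of 1,000 entitled users launched the app this quarter.
print(cheaper_model(active_users=300, ea_annual_cost=120_000, per_seat_cost=250))
# per-seat  (300 * 250 = 75,000 vs. 120,000)
```

A real model would also weigh maintenance terms, true-up clauses, and growth, but even this crude comparison surfaces agreements worth renegotiating.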

On the server side, it might indicate that there are expensive applications with overlapping functionality that are only being partially used and the least used ones could be discontinued. In the “other data” category, a history of furloughs or staff layoffs and retirement could be used when forecasting software demand and server capacity requirements.

Recommendation: CIOs should evaluate their existing discovery tools and, if they have software usage capabilities, ensure those capabilities are implemented.

In many cases, inventory planners may not be aware of the functionality or may not have it fully deployed. Knowing the level of detail (e.g., app open/close or keystroke activity) that is needed for software usage monitoring is imperative.


Patricia Adams | IT Asset Management Evangelist

Patricia Adams is an industry expert who has more than 21 years of experience at Gartner. As a research director for Gartner, she was the lead analyst for IT hardware and software asset management tools and best practices. Her research coverage area included midrange tape storage, CMDB, change management, dependency mapping tools, discovery tools, and ITIL. Throughout it all, her primary focus was advising companies on how to establish governance programs unique to their organizations, build the business case for ITAM, measure the success of the program, design policies, define roles and responsibilities, implement tools, and build a strategic road map for integrating ITAM into other domains.

Adams spoke annually with federal, state, and local governments, as well as hundreds of companies across industries (e.g., higher education, healthcare, manufacturing, financial services), ranging in size from 1,000 employees to global organizations that treated ITAM as a shared service and needed advice on centralizing the program. Her advice was especially relevant to organizations that needed an effective life-cycle management strategy that met organizational and regulatory requirements.

While at Gartner, Adams developed the first Gartner TCO model for storage and the groundbreaking IT Asset Management Maturity model in 2001, which has been leveraged by many customers and vendors.