Hitachi Data Systems’ parent company, Hitachi, Ltd., was founded in 1910 with the vision to innovate with information. Hitachi has more than 326,000 employees, thousands of whom hold PhDs, and nearly 1,000 subsidiaries.
After years of growing its federal business, HDS decided it was time to expand and provide additional services to its current and potential customers by establishing a new subsidiary, HDS Federal Corporation.
WashingtonExec sat down with Mike Tanner, president and CEO of HDS Federal Corporation, to discuss the current market climate, the “Big Data” buzz in the federal contracting community, and the growth of virtualized environments.
Read our interview below.
WashingtonExec: In April, Hitachi Data Systems Federal Corporation (HDS Federal) was established as a new wholly owned subsidiary of Hitachi Data Systems Corporation (HDS). What prompted this business decision?
Mike Tanner: Over the past six years, we’ve been very successful in growing the federal business as a region within HDS. As we continued to look for additional growth, we identified new opportunities within the DoD and intelligence communities. In order to provide additional services to our current and potential customers, we needed to establish a facility security clearance (FCL), which required us to separate our federal business into a new subsidiary. This subsidiary allows us to work more closely with our federal customers to design and implement solutions that help them deliver on their missions.
“Our solutions offer unique capabilities that allow customers to gain greater utilization out of their existing assets and thus reduce expenditures.”
WashingtonExec: Given the current budget climate, what are some challenges or opportunities for HDS Federal? What’s your market outlook for 2013?
Mike Tanner: Every federal contractor will be challenged to grow its business during a period when budgets are flat or declining. At HDS Federal, the budget climate actually presents us with some great opportunities. In this environment, agencies are looking to save money and reduce expenditures. Our solutions offer unique capabilities that allow customers to gain greater utilization out of their existing assets and thus reduce expenditures. Not only can we save our customers money, but we can also enhance their capabilities at the same time. This new budgetary reality also forces agencies to look at alternate solutions rather than simply continuing their typical purchasing patterns. They’re looking for cost-effective and efficient solutions, and HDS Federal delivers that.
Though current market spending is flat or declining, we still expect to achieve year-over-year growth through aggressive outreach and by demonstrating that our customers can save money by using HDS Federal’s solutions rather than continuing to buy from our competitors.
“Big data has emerged as one of the single greatest challenges facing government agencies as they grapple with not only managing their existing datasets, but also the massive amounts of new data from sources both inside and outside of government agencies.”
WashingtonExec: As an end-to-end storage solutions provider, heavily entrenched in the “Big Data” discussion, where do you see this trend in 5 years?
Mike Tanner: Big data has emerged as one of the single greatest challenges facing government agencies as they grapple with not only managing their existing datasets, but also the massive amounts of new data from sources both inside and outside of government. To mitigate these challenges, federal IT executives are tasked with developing strategies to identify, classify, share and store all of these structured and unstructured datasets. In addition, agencies are looking to securely migrate many of their operations and storage capacity to the cloud. We’ve been working hand-in-hand with our customers to help them adopt storage virtualization to manage this transition to the cloud, and we expect that business to keep growing because of the significant cost savings. However, the focus should be on more than just storage; we need to manage data as information and find ways to turn it into knowledge for users and their missions. That requires an integration of information sources, or “data fusion.”
HDS Federal has been working under the premise that as many of these data strategies mature, archiving will move to the forefront of the Big Data discussion as the newest method of information sharing across federal agencies. It will be crucial for agencies to deploy intelligent archiving solutions that can index and search across tiered storage and multiple types of data. As virtual storage accumulates over the next five years, so will the demand for the value-added content, computational and analytics capabilities that archiving provides.
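As a minimal sketch of that idea, and assuming nothing about any specific HDS product, the Python below models a single archive catalog that stays searchable no matter which tier an object has migrated to; every name in it (ArchiveEntry, ArchiveIndex, the tier labels) is a hypothetical illustration.

```python
# Hypothetical sketch of an intelligent archive index: one catalog that can be
# searched across storage tiers and data types. Not an HDS product API.
from dataclasses import dataclass, field

@dataclass
class ArchiveEntry:
    object_id: str
    tier: str                 # e.g. "flash", "disk", "tape"
    data_type: str            # e.g. "email", "image", "document"
    keywords: set = field(default_factory=set)

class ArchiveIndex:
    def __init__(self):
        self._entries = []

    def ingest(self, entry: ArchiveEntry):
        """Record an archived object so it remains findable after tiering."""
        self._entries.append(entry)

    def search(self, keyword: str):
        """One query spans every tier and every data type."""
        return [e for e in self._entries if keyword in e.keywords]

index = ArchiveIndex()
index.ingest(ArchiveEntry("obj-001", "tape", "document", {"budget", "2013"}))
index.ingest(ArchiveEntry("obj-002", "disk", "email", {"budget", "procurement"}))
print([e.object_id for e in index.search("budget")])  # hits span both tiers
```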
WashingtonExec: You’ve used the phrase “data outlives the applications or devices that created the data” in the past. Can you elaborate on the importance of this statement?
Mike Tanner: In today’s interconnected world, there are enough proof points to show that data has more longevity than the process or device with which it was created. As technology evolves, data migrates from its original device to an updated version, or is stored as a backup elsewhere to be retrieved later. If data weren’t capable of persistence, it would get lost in the computer, smartphone or device that created it, which would be hugely detrimental to government and industry. As workforce mobility continues to boom, the ability to extract data from its physical constraints and transfer it through faster networks has provided agencies with better access to critical information and enhanced sharing capabilities. While this longevity of data provides greater access and enhanced optimization for government agencies, the sheer volume of it presents significant backup costs and storage challenges.
As an end-to-end storage solutions provider, we’re fortunate to have access to tools that create, capture, store and analyze data. Working within each stage has given us a deep understanding of the information lifecycle: how to properly prioritize data, how to capture it and derive intelligence from it, and how to manage different data types through the lifecycle. Agencies that are unable to manage their data efficiently will suffer lost productivity and inefficient use of their resources. Based on our experience, we empower agencies to apply their growing stores of data to better decision-making and to identify insight-driven opportunities.
“Any computer, smartphone, mobile device or IT technology will be outlived by its data. The sheer magnitude of data that is produced by the digital world can be overwhelming to think about – and rightly so.”
WashingtonExec: Can you give a current example or projection of devices or applications that will be outlived by the data they have created?
Mike Tanner: You can find several historic and evolutionary examples of data persistence. Documents now maintained in modern office applications (MS Office, Google, etc.) were often originally created in older applications such as early versions of Lotus Notes; tape storage has migrated from DLT to AIT to LTO drives; social media platforms (Twitter and Facebook) have generated enormous volumes of data in just the last three to four years; and a significant amount of data is captured on cell phones today.
Any computer, smartphone, mobile device or IT technology will be outlived by its data. The sheer magnitude of data that is produced by the digital world can be overwhelming to think about – and rightly so. Keeping this in mind helps to provide context and perspective on the importance of staying involved in the entire data lifecycle from creation to capture, storage and analysis.
WashingtonExec: The “lowest price, technically acceptable” (LPTA) approach to IT procurement has generated a lot of buzz. Where does HDS Federal fall in the discussion?
Mike Tanner: HDS Federal can certainly compete in an LPTA environment. At the end of the day, most buying decisions are driven by price, and most agencies have to make sure they’re receiving proper value from each purchase. In evaluating LPTA, agencies should look beyond the initial capital costs, as the greatest costs come from asset management in the data center. An agency that does not consider those costs is not necessarily reducing the true cost of the solution. At HDS Federal we have a Chief Economist who is responsible for analyzing not only the cost of the capital equipment, but also the ongoing operational and management costs; that total cost of ownership is what should be evaluated in an LPTA model.
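To make that point concrete, here is a minimal sketch with purely hypothetical dollar figures, showing how the offering with the lowest capital price can still carry the highest lifetime cost once ongoing operational and management expenses are counted.

```python
# Hypothetical LPTA comparison; all dollar figures are illustrative only.
def total_cost_of_ownership(capital, annual_opex, years):
    """Lifetime cost = up-front capital plus operating costs over the asset's life."""
    return capital + annual_opex * years

# Solution A: lowest capital price, but expensive to run.
a = total_cost_of_ownership(capital=1_000_000, annual_opex=400_000, years=5)
# Solution B: pricier up front, cheaper to operate and manage.
b = total_cost_of_ownership(capital=1_300_000, annual_opex=250_000, years=5)

print(f"A: ${a:,}  B: ${b:,}")  # A: $3,000,000  B: $2,550,000
```

On these assumed numbers, the option that wins on sticker price loses by $450,000 over five years.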
“We have an eight-year head start with regard to virtualization and have spent significant time concentrating on optimizing our elasticity and scalability to meet the unique needs of the federal government.”
WashingtonExec: What would you say is something that HDS Federal prides itself on?
Mike Tanner: HDS Federal takes pride in being recognized as one of the most ethically sound companies in the world, as well as a subsidiary of one of Fortune Magazine’s 100 Best Companies to Work For in the U.S.
Having established ourselves as storage virtualization experts, we are proud of the quality and reliability of the solutions we deliver to our customers. Our enterprise solutions provide a foundation of excellence, and we take great pride in enabling government agencies to complete their missions by leveraging the benefits of our storage portfolio. As virtualization continues to build momentum within the federal government, our competitors are working hard to address the trend and keep up with the demands of the market. We have an eight-year head start and have spent significant time concentrating on optimizing our elasticity and scalability to meet the unique needs of the federal government.
WashingtonExec: What differentiates HDS Federal from other “big data” organizations? What is the company’s competitive advantage?
Mike Tanner: As opposed to those who are focused on implementing siloed, proprietary systems at the customer’s expense, our solutions are agnostic and built on a single unified platform that makes them interoperable and compatible with existing third-party infrastructures. This allows agencies to manage all of their storage resources centrally. Our strategy gives agencies access to all of their data all of the time, regardless of which platform it resides on. Users can therefore eliminate silos, optimize information accessibility, lower costs and reduce complexity.
“The complexity of Big Data problems is increasing in importance, so we must leverage expertise across the industry by partnering with others. Nobody has all the answers, but collaborating with other thought leaders makes us capable of creating highly effective solutions that meet or exceed our customers’ needs.”
WashingtonExec: What is the best advice you’ve ever received?
Mike Tanner: That’s a really hard question because I’ve received a lot of good advice over the years. I have to go back to my childhood, when my father said, “Can’t never did nothing.” In other words, regardless of your success or failure, if you are not willing to try, then you have no chance of success. During the course of my professional career, a former manager once told me, “Don’t be afraid to step out of your comfort zone and take on additional tasks to help your team, organization, partners and customers.” I took that to mean we all have to be comfortable taking on new challenges. You manage the risks, but you strive to do more in delivering value and results to customers.
WashingtonExec: What do you foresee to be the most successful trends in “Big Data?” What are some speed bumps in “Big Data” and how is HDS Federal reacting to them?
Mike Tanner: It’s not possible to identify a single, overarching trend, but here are several significant ones that HDS Federal has worked to address in recent years.
Traditional structured data approaches are costly. Historically, federal IT relied on structured data approaches because they were well understood. However, agencies must find ways to deliver the same capabilities using unstructured data, because doing so will reduce operational costs. HDS Federal uses the high-performance object storage capabilities of the Hitachi Content Platform, Hitachi Data Ingestor and Hitachi Data Discovery Suite (search engine) products to build solutions that provide those capabilities at a much lower operational cost.
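As a minimal sketch of that pattern, and assuming nothing about the actual Hitachi Content Platform API, the example below stores unstructured content as objects carrying custom metadata, then finds them by metadata match rather than through a structured schema.

```python
# Hypothetical object-storage pattern: unstructured content is stored as
# objects with searchable metadata instead of rows in a structured database.
object_store = {}  # stands in for a real object store

def archive_object(key: str, payload: bytes, metadata: dict):
    """Store one object; its metadata is what a search engine would index."""
    object_store[key] = {"payload": payload, "metadata": metadata}

def find_objects(**criteria):
    """Return keys of objects whose metadata matches every given field."""
    return [k for k, obj in object_store.items()
            if all(obj["metadata"].get(f) == v for f, v in criteria.items())]

archive_object(
    "reports/fy2013-budget.pdf",
    payload=b"%PDF-...",
    metadata={"agency": "example-agency", "classification": "unclassified"},
)
print(find_objects(classification="unclassified"))  # ['reports/fy2013-budget.pdf']
```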
Data analytics is a major success factor in improved performance and service. HDS Federal is enhancing its core technology to better suit our customers’ needs and engaging with partners who have demonstrated thought leadership in their various domains.
Data center management has exposed true operational costs. Running a data center can no longer be delegated to someone else. The architects of future data centers need to find ways to reduce physical plant expenses such as power and cooling, as well as ways to manage systems with fewer resources. HDS Federal is responding by providing infrastructure components that require less power and cooling while increasing system capability. We also manage our components through a single tool, a “single pane of glass” that simplifies day-to-day operation while still allowing deep control when required.
Virtualized environments continue to grow. Effective virtualization implementations are increasing across this sector. HDS Federal brings technology from the research labs to the market, which allows us to virtualize the entire infrastructure, including storage, servers and networks. This positions us to provide the agile capabilities demanded by users but not yet supplied by the industry.
The complexity of Big Data problems is increasing in importance, so we must leverage expertise across the industry by partnering with others. Nobody has all the answers, but collaborating with other thought leaders makes us capable of creating highly effective solutions that meet or exceed our customers’ needs.