Q & A With Bill Perlowitz: Prepping For The Big Data Overload

Bill Perlowitz, Wyle

Meet Bill Perlowitz, Vice President and CTO of Wyle’s Science, Technology and Engineering Group, a leading provider of specialized engineering, scientific, and technical services to the Department of Defense, NASA, and a variety of private sector clients. A 30-year veteran of the federal contracting marketplace, he brings expertise in transforming and integrating enterprises using methods including cloud computing, virtualization, and Web services.

Perlowitz shared with WashingtonExec his thoughts on the challenges of introducing and establishing social media and new technology at federal agencies, the risks of securing “Big Data” and solutions to the crisis of digital information overload.

WashingtonExec: What has changed over the 30 years you have been a Federal contractor?

Bill Perlowitz: We all reap the benefits of Moore’s Law, and the cost of computing to the consumer has fallen dramatically since the 1980s. By 2020, we can expect that video games will be running with the compute power, memory, and storage that are available today only in two or three of the world’s most powerful computers. By 2040, a $1,000 machine will have the equivalent capacity of a human brain, and by 2050, it will exceed that capacity. Almost all of us have integrated the benefits of these technologies into our businesses and daily lives, and the government is no exception.

When the Obama administration began, the Technology, Innovation and Government Reform transition team immediately highlighted the disparity between public and private sector that had been growing for decades.  In response, the Chief Information Officer of the United States created a vision and implemented a set of policies that attempted to close this gap.

What has delayed implementation of this vision has been Moore’s Second Law, which tells us that the cost to develop each generation of these technologies rises exponentially. The government is not immune to these costs, and in an era of shrinking budgets that perpetually depend on technology to achieve cost reductions, there is more pressure than ever to directly accept commercial technologies such as cloud computing and mobility with as few government-specific changes as possible and still meet legal, policy, and security requirements.

WashingtonExec: So is the government’s use of technology catching up to the private sector?

Bill Perlowitz: The Intelligence Community has done a remarkable job of rapidly adapting technologies to meet their needs, but it is difficult to replicate their methods at the Department of Defense and Federal Civilian agencies where data is widely accessible. The General Services Administration’s Office of Citizen Services and Innovative Technologies has been the tip of the spear for bringing commercial technologies to government, and their Federal Risk and Authorization Management Program is the poster child for creating government-wide “do once, use many times” frameworks that save money, time, and staff.

One of the difficulties of working within the Federal requirements framework is that as soon as it has adapted to a new set of technologies, such as social media, the private sector has evolved an entirely new, and sometimes disruptive, set of technologies. There are more than 60 billion intelligent devices in the world today, and this number will grow to 200 billion by 2015. If you consider a single modern smart phone as an example, it has a slew of sensors on it that are generating data 24×7:  GPS, accelerometer, light meter, camera, video, microphone, gyroscope, and wireless signals. When you add high speed 4G LTE technologies to these, you get the next generation of technical challenges which the market has dubbed “Big Data”.

WashingtonExec: What is your definition of Big Data?

Bill Perlowitz: Big Data is about extreme information processing and management, with the understanding that “extreme” means growth exceeding Moore’s Law; half of the data we have stored today was created in the last 2 years. By August, one-seventh of the world’s population will be generating data on Facebook. Large companies already typically store and process 10,000 to 10 million business events per second, and the Cisco Visual Networking Index predicts that the volume of data transmitted across the Internet will rise from the 275 exabytes per year we saw in 2010 to 275 exabytes per day in 2020. In 2010 the world created over 13 exabytes (13,000,000,000,000,000,000 bytes) of stored data, which is about 60 times the Library of Congress, and we are adding over 2.5 quintillion (2,500,000,000,000,000,000) bytes of data every day.
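A quick back-of-the-envelope check, using only the Cisco figures quoted above, shows why this growth “exceeds Moore’s Law” (the numbers and comparison baseline here are illustrative, not part of the interview):

```python
# Cisco VNI figures quoted above: Internet traffic of 275 EB/year in 2010,
# projected to reach 275 EB/day by 2020 -- a 365x increase over 10 years.
growth_factor = 365
years = 10

# Compound annual growth rate implied by a 365x increase in a decade.
traffic_cagr = growth_factor ** (1 / years) - 1   # roughly 0.80, i.e. ~80%/year

# Moore's Law baseline: capacity doubling roughly every two years.
moore_cagr = 2 ** (1 / 2) - 1                     # roughly 0.41, i.e. ~41%/year

print(f"traffic growth: {traffic_cagr:.0%}/yr, Moore's Law: {moore_cagr:.0%}/yr")
assert traffic_cagr > moore_cagr  # data growth outpaces compute growth
```

In other words, even if compute capacity keeps doubling every two years, data transmitted is growing roughly twice as fast, which is the gap that new storage and processing architectures have to close.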

When we begin to look at things at this scale, we overwhelm existing data management practices and technologies. The term “Big Data” refers to these high volumes of data, and that is the immediate problem, but right behind it are many difficult issues related to building efficient networks, storage, and processing to turn data into information, and the variety, velocity, complexity, and quality of the data that impinge upon us make things much more complicated.

WashingtonExec: How do you think we will solve the Big Data problem?

Bill Perlowitz: There is a rapidly evolving field known as “Data Science” that combines elements of computer science, mathematics, statistics, and art necessary to store, find, make useful, and display large amounts of data in ways that humans can understand. This is very different from the roles that technologists have traditionally played because the questions we can now ask go far beyond those of traditional Business Intelligence, Enterprise Resource Planning, and Customer Relationship Management.

Universities are also beginning to understand and respond to the need for this amalgam of skills, and there are a handful of visionary agencies and companies that are cross-training staff and deploying tools and systems that allow us to extract information from extreme quantities of data. I am very fortunate to work at Wyle, whose 60-year history and evolution from a laboratory gives me access to 4,800 seasoned doctors, scientists, pilots, astronauts, and engineers, who provide a broad perspective on the collection, analysis, and interpretation of decades of data so that we can contribute to the evolution of the science.

WashingtonExec: How will solving Big Data challenges change things?

Bill Perlowitz: The deep technology base of Big Data enables almost limitless possibilities to develop competitive advantages and new revenue. In the private sector, analyst consensus is that organizations that successfully exploit Big Data by 2015 will financially outperform their competitors by 20%. Because our overall understanding of Data Science is nowhere near the level it needs to be to benefit from new technical capabilities and analytics, we need to rapidly develop substantially different processes, resources, technologies, vendors, and skills, or we will be overwhelmed by data we do not understand.

We can either address these issues now, or make massive reinvestments in 2 or 3 years to store, access, analyze, and visualize the inevitable deluge of data.
