Technological advancements and the exponential growth of information are reshaping business operations across numerous sectors, including government. The generation of government data and the pace of digital archiving are accelerating, driven by the surge in mobile devices and applications, smart sensors, cloud computing solutions, and citizen-facing portals. As digital information grows more expansive and complex, managing, processing, storing, securing, and disposing of it becomes increasingly difficult. At the same time, new tools for capturing, searching, discovering, and analyzing data are enabling organizations to extract valuable insights from their unstructured data. The government sector is reaching a critical juncture, recognizing information as a strategic asset. Government bodies must protect, leverage, and analyze both structured and unstructured information to better serve the public and fulfill mission requirements. As government leaders work to transform their agencies into data-driven organizations, they are building the frameworks needed to correlate events, people, processes, and information.
High-value government solutions will emerge from a combination of the most disruptive technologies:
- Mobile devices and applications
- Cloud services
- Social business technologies and networking
- Big Data and analytics
Big Data enables government entities to make better decisions by acting on patterns revealed through the analysis of large volumes of data, whether related or unrelated, structured or unstructured.
However, achieving this requires more than simply accumulating massive quantities of data. "Making sense of these volumes of Big Data requires cutting-edge tools and technologies that can analyze and extract useful knowledge from vast and diverse streams of information," wrote Tom Kalil and Fen Zhao of the White House Office of Science and Technology Policy in a post on the OSTP Blog.
The White House moved to assist agencies in finding these technologies by establishing the National Big Data Research and Development Initiative in 2012. This initiative allocated over $200 million to maximize the potential of the Big Data explosion and the tools required to analyze it.
The challenges posed by Big Data are nearly as formidable as its promise is encouraging. Efficiently storing data is one such challenge. Budgets are always tight, so agencies must minimize the cost per megabyte of storage while keeping data readily accessible so users can retrieve it when and how they need it. Backing up massive amounts of data further intensifies this challenge.
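The cost-versus-accessibility trade-off described above can be made concrete with a small sketch. The tier names, prices, and latencies below are invented for illustration only, not real vendor rates: cheaper "cold" tiers drive down the cost per unit stored, but retrieval becomes slower.

```python
# Hypothetical tiered-storage cost comparison. All tier names, prices, and
# latencies are assumptions for the sketch, not actual agency or vendor figures.

TIERS = {
    # tier: (cost per GiB per month in dollars, typical retrieval latency)
    "hot": (0.023, "milliseconds"),
    "cool": (0.010, "seconds"),
    "archive": (0.002, "hours"),
}

def monthly_cost(gib_stored: float, tier: str) -> float:
    """Return the monthly storage cost in dollars for the given tier."""
    cost_per_gib, _latency = TIERS[tier]
    return gib_stored * cost_per_gib

# A 100 TiB data set costs far less in the archive tier,
# at the price of much slower access when users need the data back.
for tier, (_, latency) in TIERS.items():
    print(f"{tier}: ${monthly_cost(100 * 1024, tier):,.2f}/month, retrieval in {latency}")
```

Backup compounds the same arithmetic: every extra replica multiplies the monthly cost, which is why agencies often keep working copies hot and push backups to the coldest tier that still meets their recovery-time needs.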
Effectively analyzing the data is another major hurdle. Many agencies utilize commercial tools that allow them to sift through vast amounts of data, identifying trends that improve operational efficiency. (A recent MeriTalk study found that federal IT executives believe Big Data could help agencies save more than $500 billion while also meeting mission objectives.)
Custom-developed Big Data tools are also enabling agencies to address their data analysis needs. For instance, the Oak Ridge National Laboratory’s Computational Data Analytics Group has made its Piranha data analytics system available to other agencies. This system has helped medical researchers identify links that can alert doctors to aortic aneurysms before they occur. It is also used for more routine tasks, such as sifting through resumes to match job candidates with hiring managers.
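The source does not describe how Piranha actually matches resumes to hiring managers, but the general technique behind such matching can be sketched with a minimal example: represent each document as a word-count vector and rank candidates by cosine similarity to the job description. Everything below (the scoring method, the sample texts) is an illustrative assumption, not the laboratory's actual algorithm.

```python
# Minimal sketch of keyword-based document matching, as one might use to rank
# resumes against a job description. This is an illustrative stand-in for the
# kind of analysis mentioned above, not the Piranha system's real method.
import math
import re
from collections import Counter

def vectorize(text: str) -> Counter:
    """Lowercased word-count vector for a document."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors (0.0 to 1.0)."""
    dot = sum(a[word] * b[word] for word in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

job = "seeking data analyst with experience in statistics and python"
resumes = {
    "alice": "data analyst skilled in python statistics and reporting",
    "bob": "truck driver with ten years of logistics experience",
}

# Rank candidates by similarity to the job description, best match first.
ranked = sorted(resumes, key=lambda name: cosine(vectorize(job), vectorize(resumes[name])), reverse=True)
print(ranked)  # alice ranks above bob
```

Production systems typically refine this with TF-IDF weighting and stemming so that common filler words do not dominate the score, but the ranking idea is the same.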