data management Archives | FedScoop
https://fedscoop.com/tag/data-management/

Reframing data management for federal agencies
https://fedscoop.com/reframing-data-management-for-federal-agencies/ (Tue, 30 Apr 2024)
Norseman Defense Technologies CTO David Hoon explains why adopting an ‘event-driven’ data processing model offers a superior way to manage the growing volume of data at the edge of agency networks.

David Hoon is the Chief Technology Officer at Norseman Defense Technologies.

In the ever-expanding realm of data management, federal agencies face a pressing need to rethink their approach to effectively processing, analyzing and leveraging vast amounts of data.

As enterprises generate growing volumes of data at the edge of their networks, they face a widening disconnect: as much as 80% of their data lives at the edge of the enterprise, while as much as 80% of their computing takes place in the cloud.

That’s why chief information and data officials within federal agencies must recognize the necessity of adopting a different data processing model. One model gaining increasing attention involves moving more of an enterprise’s computing power to the edge of their network operations — and transitioning from a “transaction-driven” data processing model to an “event-driven” model.

Embrace an ‘Uber-like’ data processing model

Traditionally, data processing has been transaction-driven, where systems respond to individual requests or transactions. However, this model is proving increasingly inadequate in today’s fast-paced, data-intensive distributed environments.

In an event-driven architecture, applications respond to events or triggers, allowing for real-time processing and decision-making.

Uber provides a constructive example: Instead of requesting a car through a central dispatch office—triggering a transaction—a rider uses their Uber app to register a ride request. That request translates into an event notification. Uber’s application watches for and identifies such events continuously and notifies a multitude of available drivers simultaneously. The model results in connecting the most appropriate resource (the nearest available driver) to fulfill the request.

Similarly, an enterprise’s “event-driven” notification approach allows it to process more data events locally more quickly and cost-effectively.
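To make the contrast concrete, here is a minimal, pure-Python sketch of the event-driven shape described above: producers publish events to a dispatcher and any number of subscribers react as events arrive, rather than each request being handled as a one-off transaction. The event name and handler are invented for illustration.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process event dispatcher (illustrative only)."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        # Register a callback to run whenever this event type is published.
        self._handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        # Notify every subscriber of the event; in a real system this would
        # be an external broker, not an in-memory loop.
        for handler in self._handlers[event_type]:
            handler(payload)

def notify_available_drivers(ride_request):
    # Hypothetical handler: in the Uber analogy, fan the request out to
    # nearby drivers instead of routing it through a central dispatcher.
    print(f"Notifying drivers near {ride_request['pickup']}")

bus = EventBus()
bus.subscribe("ride_requested", notify_available_drivers)
bus.publish("ride_requested", {"rider": "alice", "pickup": "Union Station"})
```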

Leverage cloud-native data streaming platforms

One such solution making significant headway in today’s data streaming era is Confluent’s cloud-native data streaming platform, built on Apache Kafka. It facilitates the seamless movement of data between edge devices and the cloud and enables agencies to manage data in real time, ensuring timely insights and actions based on evolving events. It also allows IT teams to capitalize on data streaming from the growing array of IoT sensors, mobile devices and endpoint devices generating enterprise data.

Kafka’s capabilities extend beyond traditional transactional systems, allowing agencies to architect applications that are inherently event-driven. By adopting Kafka, agencies can also unlock new possibilities for data processing, analytics, and decision-making at scale.
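As a rough sketch of what an event-driven Kafka flow can look like, the example below uses the open-source confluent-kafka Python client to publish and consume sensor readings. The broker address, topic name and message fields are assumptions for illustration, not details from the article.

```python
import json
from confluent_kafka import Producer, Consumer

BROKER = "localhost:9092"          # assumed broker address
TOPIC = "edge-sensor-readings"     # hypothetical topic name

# Publish an event from an edge device.
producer = Producer({"bootstrap.servers": BROKER})
reading = {"sensor_id": "pump-7", "temp_c": 81.4}
producer.produce(TOPIC, value=json.dumps(reading).encode("utf-8"))
producer.flush()

# Consume and react to events as they arrive.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "edge-analytics",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])

msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    event = json.loads(msg.value())
    if event["temp_c"] > 80:
        # React to the event locally instead of waiting on a central system.
        print(f"High temperature on {event['sensor_id']}: {event['temp_c']} C")
consumer.close()
```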

Partner for success

Adopting this more modern approach requires looking at data and analytic flows differently. So, it helps to work with experienced companies like Norseman Defense Technologies, which has played a pivotal role in helping defense and civilian agencies craft the most appropriate implementation strategies. Norseman offers expertise, tools, and platforms to support agencies in their journey toward edge computing and event-driven architecture.

Norseman’s capabilities span from building proofs of concept to deploying production-ready solutions tailored to the unique needs of federal agencies. In addition, thanks to partnerships with major providers like HP, Intel and Microsoft, Norseman is well-equipped to empower agencies with cutting-edge technologies and best practices. For instance, Norseman has two HP Z Workstations in our lab, running Microsoft Windows 11 on Intel Xeon processors. These systems are purpose-built to process large amounts of data, including AI/ML/DL workloads.

Ultimately, by deploying more computing power at the edge of their networks and adopting an event-driven analytics architecture, agencies can make better decisions faster and unlock the full potential of their data assets, driving innovation, efficiency and mission success.

And by utilizing cloud-native data streaming platforms and the know-how of experienced industry partners, agencies can better position themselves to capitalize on modern data management practices as data and analytics operate increasingly at the edge. Now is the time for federal officials to embrace the future of data processing and lead with agility and foresight in an increasingly data-driven world.

Learn more about how Norseman Defense Technologies can help your agency embrace a more modern approach to data management at the edge with HP Z Workstations using Intel processors and Microsoft Windows 11. Z Workstations are purpose-built to process large amounts of data for AI and ML workloads. Visit Norseman.com or email info@norseman.com for our REAL INTELLIGENCE about ARTIFICIAL INTELLIGENCE.

Grasping data modernization in state and local governments
https://statescoop.com/grasping-data-modernization-in-state-and-local-governments/ (Tue, 13 Feb 2024)
A new report assesses the motivations driving agencies to modernize their data management strategies as they strive toward efficiency and innovation.

Tackling broader IT modernization strategies
https://fedscoop.com/tackling-broader-it-modernization-strategies/ (Tue, 15 Aug 2023)
Federal agency leaders discuss navigating data challenges and modernization opportunities.

Government agencies are undertaking an unprecedented period of IT and application modernization to enhance operational efficiency, streamline processes and improve service delivery to citizens. Simultaneously, efforts to collect, manage and analyze vast volumes of data, combined with the need to operate securely in today’s multi-cloud operating environment, have made managing IT modernization initiatives tremendously challenging for agency officials.

In a recent video campaign, produced by FedScoop and underwritten by Broadcom, government agency leaders highlight the ways they are navigating modernization initiatives to carry out their missions.

Challenges of data management and accessibility

One of the most pressing challenges for government agencies is dealing with the exponential growth of data and the difficulties in storing, organizing and analyzing the information.

Office of Personnel Management CIO Guy Cavallo said that the timing of the cloud is perfect, as the explosion of data will be “unending.” He shared how OPM has done much work in the last year developing user requirement groups to segment who can see what data from a particular agency—helping with their long-term data management strategy.

Agencies are actively working to make data more accessible to a broader range of stakeholders, including policymakers, researchers and the general public. However, ensuring data accessibility without compromising security remains a balancing act. Lauren Knausenberger, former CIO of the Department of the Air Force, explained how the Air Force maximizes data for its missions without causing security or governance concerns.

She said, “We want to make the data accessible to as many people [as possible], but at some point, it becomes a very challenging governance concern to think through who should have access to this data.” Like Cavallo, Knausenberger explained the importance of micro-segmentation in securing data, enabling decision advantage and shortening the kill chain while moving data worldwide.

Dynamic data movement across cloud environments

Interviewees also explored how their migration to multi-cloud environments, coupled with the explosion of data, has changed their strategies for implementing solutions that facilitate the free and secure movement of their data.

Justin Fanelli, technical director for PEO Digital and Enterprise Services at the Department of the Navy, said their goal is to “move data from anywhere to anywhere—a friction-free place.” Fanelli explained how they’re prioritizing the functionality of their cloud-based activities. Operation Flank Speed is an example of successful implementation, where data can be accessed smoothly across various devices and domains, leading to improved user experiences.

State Department CIO Kelly Fletcher said they must be more “intentional about what data is stored and where data is processed.” She said that while the department operates in its own and commercial cloud environments, they also have computing capabilities at the edge. To make informed decisions and understand patterns, data must be aggregated from different sources. She also highlighted the significance of their network modernization initiative, which facilitates direct access to various data sources, moving away from centralizing all data in one location.

Opportunities for modernization efforts

Although there are many moving pieces, the rapidly evolving landscape of technology also presents opportunities for agencies as they seek to adapt to innovations. According to Broadcom CEO Hock Tan, federal agencies have two significant opportunities. First, policymakers should provide agencies with maximum flexibility in using modernization investments, as a one-size-fits-all approach may hinder optimal modernization where it’s most needed. Second, agency and commercial CIOs need to maintain their focus on innovation-enabling capabilities. Multi-cloud strategies, like the one facilitated by Broadcom’s VMware acquisition, offer the tools and flexibility necessary for a successful modernization roadmap.

He said the merger of the two companies would allow more investment in IT solutions that provide greater flexibility to modernize applications, manage software and services, and secure data across clouds. Tan said this will “give federal agencies the tools to control their destiny” in multi-cloud environments, all while “increasing choice and reducing risk.”

The state of relations between agencies and technology partners continues to evolve and grow stronger. “Our relationships with our technology partners are very important to modernizing our network,” said USPS CIO and Executive VP Pritha Mehra.

“Our partners are directly assisting us to upskill our workforce and infuse modern ways of thinking through professional services contracts. They are working with our teams to provide workshops, sandbox opportunities, training sessions and other activities to make our employees better-informed leaders, better technology strategists and highly skilled technologists,” added Mehra.


This video series was produced by Scoop News Group for FedScoop and sponsored by Broadcom.

Senators propose big boost in funding to upgrade DOD’s test ranges
https://fedscoop.com/senators-propose-big-boost-in-funding-to-upgrade-dods-test-ranges/ (Tue, 02 Aug 2022)
The chairman’s mark of the fiscal 2023 defense appropriations bill recommends spending about $1.8 billion on these efforts.

New legislation calls for spending more than $1.8 billion in the next fiscal year to improve the Pentagon’s infrastructure for developing and testing new and emerging technologies.

Congress allocated $798 million in fiscal 2022 to the Defense Department’s lab and test range modernization activities focused on space systems, the electromagnetic spectrum, hypersonics, directed energy, and targets. And for fiscal 2023, the Pentagon had requested $1.26 billion for “strategic test infrastructure improvements” as it looks to support testing of hypersonic weapons, electronic warfare, nuclear modernization, trusted AI, and multi-domain operations, according to budget documents.

But members of the Senate Appropriations Committee say that isn’t enough.

“While the Committee is encouraged to see that the fiscal year 2023 President’s budget request continues to make significant investments in test infrastructure, it notes that the requested funding does not adequately meet the needs of the test community. Therefore, the Committee recommends additional appropriations of $1,805,000,000 … for lab and test range upgrades for the following: electromagnetic spectrum, hypersonics, directed energy, space, targets, data management, artificial intelligence/autonomous systems, and the test & evaluation innovation hub,” said the report accompanying the chairman’s mark of the fiscal 2023 defense appropriations bill released last week.

The Defense Department’s major test ranges and facilities are a vast enterprise that includes 23 major sites across the continental United States, Alaska, Hawaii and the Kwajalein Atoll. They employ 30,000 people and cover about 18,000 square miles of land — more than half of the land owned by the DOD — and 180,000 square miles of airspace.

U.S. officials have been banging the drum about the need for improvements as the Pentagon pursues a wide-ranging modernization effort to acquire new capabilities to compete with advanced adversaries such as China.

“If we expect the department to attract the world’s best and brightest to produce state-of-the-art technologies, we must modernize our laboratory and test ranges,” Undersecretary of Defense for Research and Engineering Heidi Shyu told members of the House Armed Services Cyber, Innovative Technologies, and Information Subcommittee during a hearing in May.

Subcommittee Chairman Rep. Jim Langevin, D-R.I., said more funding is needed to address the infrastructure challenges that the DOD is facing.

“It is shocking that we face a massive backlog in laboratory investment, more than $5.7 billion in the latest report to Congress. These challenges affect not just the pace and breadth of innovation, but also our ability to attract and retain the top-tier talent that we depend on. I’m committed to doing everything in my power to address this issue,” he said at the hearing.

Provisions in the Senate Appropriations Committee’s version of the defense spending bill will have to be approved by the full Senate and then reconciled with the House’s version before they can be enacted. So it is unclear how much funding Congress will ultimately appropriate for DOD’s labs and test ranges in the next fiscal year.

Moving from predictive to data prescriptive analytics
https://fedscoop.com/getting-to-prescriptive-analytics-data-management-platforms/ (Wed, 12 Aug 2020)
The evolution of modern data management platforms gives agencies not only greater insights, but greater ability to take recommended actions.

Nasheb Ismaily is a seasoned big data and cloud software engineer. He currently works for Cloudera as a senior solutions engineer.

As federal agencies grapple with increasing demands to meet the government’s Federal Data Strategy and various data privacy and security regulations, the need for data transparency and traceability has taken on greater importance.


The patchwork of traditional agency systems and applications, however, is no longer up to the task of providing the transparency and traceability around data that agencies require today.

Given the detailed data-handling and compliance requirements of GDPR, GLBA, HIPAA, PIPEDA and CCPA, agencies and the organizations they work with need a comprehensive solution to help them understand and document how data is gathered, manipulated and enriched across an increasingly distributed environment.

Additionally, agencies must also manage an increasing amount of data being generated at the edge. While in the past, data from sensors streamed into the cloud to get processed, today there just isn’t enough bandwidth for this to work.

When those concerns are taken together with the intensifying demands to protect government data, federal agencies need to consider a more modern data management solution that can fit the current and future needs of their enterprise operations.

Rising to the challenge of integrating data

Today, many agencies contend that their custom-built point solutions are still suited to managing their data. The challenge is that those point solutions tend to create an inability to track the overall flow of data. That not only limits agencies’ ability to trace data provenance; in the case of a breach, it also becomes more difficult, if not impossible, to review which vulnerabilities led to the breach.

The greater the number of point solutions that information must travel through, the greater the chances that end-to-end tracing details will break down.

That’s where an integrated data management platform, which captures data from the edge and tracks its progress through the entire pipeline, can make a huge difference. It can trace how data came into the organization, who manipulated it, how it was enriched or how it was changed, no matter which cloud or server it resides on.

Modern data management platforms also have an advantage over point solutions in being able to manage data throughout its lifecycle, including the ability to:

  1. Collect data as it streams across the enterprise.
  2. Enrich data by cleaning and structuring it, making it easier to analyze.
  3. Report data and generate meaningful visualizations.
  4. Serve data using operational databases.
  5. Make predictions with the data using machine learning and AI applications.

Underlying those capabilities is the ability to standardize and centralize data policies, security, governance and metadata, which empowers agency leaders to get better insights from their data.

Prescriptive analytics to course-correct

A lot of people are familiar with the notion of predictive analytics. Aided by machine learning and artificial intelligence technology, organizations can use data insights to predict probable outcomes. However, with the scale and velocity of today’s information demands, organizations need the ability to do more than to predict outcomes; they need to invoke responses automatically.

There aren’t enough skilled workers or hours in the day to monitor activity on the growing number of edge devices — or keep up with the pace of cyberthreats. That’s why platforms like those offered by Cloudera are increasingly needed to help enterprise leaders operationalize prescriptive analytics into corrective action.

Agencies are already familiar with variations of that in the cybersecurity space: if an intrusion occurs on a certain network, port or device, an AI bot can automatically detect it and shut down traffic. That kind of response can work in a variety of other circumstances across the enterprise, to better support employees and serve the public.
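A minimal sketch of that detect-and-respond pattern, written in plain Python with invented event fields and a mock blocking action rather than any real Cloudera or firewall API, might look like this:

```python
# Hypothetical stream of network events; in practice these would arrive
# from a streaming platform rather than a Python list.
events = [
    {"device": "edge-cam-12", "port": 443, "failed_logins": 2},
    {"device": "edge-cam-12", "port": 22,  "failed_logins": 57},
]

FAILED_LOGIN_THRESHOLD = 25  # assumed policy threshold

def block_traffic(device, port):
    # Mock corrective action; a real deployment would call a firewall,
    # SOAR tool, or network controller here.
    print(f"Blocking traffic to {device} on port {port}")

for event in events:
    # Prescriptive step: don't just flag the anomaly, act on it.
    if event["failed_logins"] > FAILED_LOGIN_THRESHOLD:
        block_traffic(event["device"], event["port"])
```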

Cloudera’s DataFlow (CDF) technology enables this kind of actionable intelligence by solving several key challenges relating to the data, real-time insights, operations efficiency and security.

CDF is able to handle high volumes of data arriving in many different formats and over a diverse set of protocols by leveraging NiFi. This technology, also called “Niagara Files,” was originally developed and used at scale within the NSA. It was made available to the Apache Software Foundation through the NSA Technology Transfer Program (TTP), and we use it today for all types of data movement, enrichment and transformation.

In terms of real-time insights, the ability to analyze streaming data in real time and respond to opportunities and anomalies is extremely important, and that’s exactly where analytics come in. However, analytics cannot be limited to hindsight. Predictive and prescriptive analytics take the data that is being streamed in, predict what’s going to happen, and prescribe what kind of corrective actions need to be taken. Cloudera enables this kind of actionable intelligence through our streaming analytics technologies, which include Spark Streaming, Kafka Streams and Flink.
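As one hedged illustration, the sketch below uses Apache Spark Structured Streaming, one of the engines named above, to read a hypothetical Kafka topic and flag anomalous readings as they stream in. The topic name, schema and threshold are assumptions, and the job needs the spark-sql-kafka connector package available to run.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("streaming-anomaly-sketch").getOrCreate()

# Assumed shape of the incoming JSON events.
schema = StructType([
    StructField("sensor_id", StringType()),
    StructField("temp_c", DoubleType()),
])

# Read the hypothetical Kafka topic as an unbounded streaming DataFrame.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")
       .option("subscribe", "edge-sensor-readings")
       .load())

# Parse the JSON payload and keep only anomalous readings.
anomalies = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
             .select("e.*")
             .where(col("temp_c") > 80.0))

# Write alerts to the console; a real pipeline would prescribe an action
# (open a ticket, shut down a device, reroute traffic) instead of printing.
query = anomalies.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```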

From an operations perspective, the lack of visibility into end-to-end streaming data flows, the inability to troubleshoot bottlenecks, and the need for understanding who’s consuming what type of data and who’s being the bottleneck are all examples of usability challenges that enterprises face. Cloudera solves these challenges and enables operational efficiency by providing analytic experiences across the entire streaming data lifecycle.

Finally, the comprehensive nature of CDF, which is managed, governed and secured by Cloudera’s Shared Data Experience (SDX), allows organizations to trace their data in real time, from the edge all the way to AI.

Building a data strategy to improve decisions across lines of business
https://fedscoop.com/build-data-strategies-improve-decisions-across-lines-business/ (Mon, 03 Aug 2020)
Agencies like the U.S. Census Bureau are reducing operational costs and improving the security of their data with enterprisewide data strategies and tools, a new report says.

Agency leaders who want to gain greater insights from their data need to prioritize data integration and accessibility across their lines of business, says the CIO for a leading enterprise data services provider in a new report.

The White House Office of Management and Budget’s Federal Data Strategy 2020 Action Plan lays out processes for agencies to build capacity and utilize data more effectively. However, agencies still face the challenge of how to balance the requirements of the action plan with other IT priorities — and how best to implement data strategies in a secure and agile environment.


What’s required is for agency leaders to establish an enterprisewide data strategy and develop governance policies that take evolving tools and processes into account, while addressing multiple strategic objectives, says Henry Sowell, CIO of Cloudera Government Solutions, in the report.

“When leaders focus on building a data strategy, the policy pieces they put together will help them take full advantage of the benefits [of data management tools],” says Sowell.  He also suggests that today’s advanced open source solutions give government greater long-term flexibility, regardless of how IT environments change, as evidenced by efforts at the U.S. Census Bureau.

Data lakes in action

The U.S. Census Bureau’s efforts to modernize its backend data management systems are cited in the report as being instrumental to the agency’s move to digitize the collection of 2020 census data.

Agency leaders recognized that the vast collection of data would require a modern management solution to generate insights that inform a wide range of decisions at all levels of government.

When the bureau started its digital overhaul, it established the Census Enterprise Data Lake (EDL) initiative. The end goal, according to the report, was to “provide coordinated capabilities to process, manage and analyze the flood of new data, while also satisfying security and privacy requirements.”

The Census Bureau uses Cloudera’s HDP (Hortonworks Data Platform) in its suite of tools to help mine, process and extract data it collected.

Additionally, improvements were made to its analytics tools that helped the agency easily clean its data by cross-checking it against existing administrative records.

Overall, these efforts helped the bureau reduce redundant data collection, yielding better information to government decision makers, all while reducing total operating costs throughout the agency’s business operations.

Putting the data pieces together

While many agencies have point solutions to manage data during its lifecycle, Sowell says that modern data management platforms are built to deliver a consistent, standardized user experience across functions.

A unified cloud data platform will lower operational costs, he adds, because it encompasses the lifecycle of data management in one tool — including data capture, data movement, data engineering, data science and data storage.

It’s not just about integrating cloud-based tools but creating an environment that facilitates data mobility and seamless governance, with security policies that move with the data no matter where it resides, explains Sowell.

To build a strategy that is both enduring and flexible, Sowell says leaders should focus on six factors:

  • Establish a data strategy around mission goals.
  • Hire or retrain people to manage data and guide the data strategy.
  • Build a multi-cloud environment to ensure data mobility across the organization.
  • Integrate tools to be able to ingest, transform, query, optimize, analyze and do predictive work with data.
  • Ensure universal security and governance policies through the lifecycle of the data.
  • Use tools that are open source and future-ready to integrate artificial intelligence and machine learning capabilities down the road.

Sowell explains how Cloudera’s Shared Data Experience (SDX) platform facilitates the large-scale exchange of information, resulting in faster analysis, while also assuring consistent security and governance across any cloud or on-premise environment.

SDX allows data managers to set security and governance policies once and apply them across all data workloads. Those policies stay with the data even as it moves across all supported infrastructure, Sowell explains.

“Enterprisewide layers of data security with technologies such as Cloudera’s SDX provide tools that make an agency’s enterprise data cloud secure by design,” says Sowell.

Read more about creating an enterprisewide data strategy for better insights and security.

 This article was produced by FedScoop and sponsored by Cloudera.

Why data virtualization offers a better path to decision making
https://fedscoop.com/data-virtualization-offers-better-path-decision-making/ (Tue, 30 Jun 2020)
Data lakes have been all the rage, but for federated data, virtualization fabrics offer greater speed, agility and utility for more users.

It’s hard to ignore the clamor inside and outside of government circles for agencies to harness data more effectively.

One of the underlying lessons revealed thus far by the ongoing pandemic has been the challenge federal and state leaders have faced trying to obtain current, reliable data in roughly real time in order to make critical public policy decisions.


Within government agencies, however, there’s a deeper issue: How to manage vast repositories of data — and develop more cohesive data governance strategies — so that whether you’re an analyst, a program manager or an agency executive, you can get the information you need, when you need it and the way you need it.

That’s why data virtualization is emerging as a pivotal solution in the eyes of a growing number of IT experts.

One of the ironies of the Big Data movement over the past decade is that while it helped organizations come to grips with the explosion of data being generated every day, it also caused a bit of damage by overpromising and underdelivering. The evolution of the cloud and software like Apache Hadoop was supposed to help government agencies, for instance, pour their siloed data sets into vast data lakes, where they could merge, manipulate and capitalize on the previously unseen value lying dormant in all those databases.

A lot of CIOs said, “Great, we have this gigantic data lake, and we can dump everything into them, and that will resolve all or most of our data sharing problems.”

Yet, over time, what many agencies discovered they had created was more of a data swamp.

Part of the problem stems from the fact that organizations hadn’t always taken the necessary steps to resolve and apply adequate data governance structures to their digital assets. Another factor is the extent to which agencies still need to complete the work of inventorying and properly cataloging those assets.

But perhaps the biggest issue is the assumption that data lakes make sense in the first place.

For many organizations, data lakes make data readily discoverable and easy to analyze. But in government, where data tends to remain federated, there are inherent inefficiencies in physically pooling data. It’s not that people don’t want to share data — although some are more willing to share than others — it’s just that it’s complicated to do so.

It often takes a dozen or more applications to properly locate, identify, authenticate and catalog data before migrating it to a data lake or the cloud; and depending on the type of data, it can also require a suite of other products to harmonize it with other data and make it useful to different stakeholders.

And the question is, why are we doing that if there’s a simpler, faster and more functional way to deliver the same end results?

A growing number of organizations are in fact discovering there is a better alternative: data virtualization.

A data virtualization system essentially creates a virtual fabric, delivering an easy-to-consume view across hundreds of data sources. Instead of lifting and shifting source data to the lake or cloud, data virtualization accesses the original sources directly, providing equivalent access to all of them, but with about one-tenth the development time and about the same query speed.
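The mechanics differ by product, but the toy sketch below illustrates the federated-view idea using SQLite's ATTACH feature from the Python standard library: two separate source databases stay where they are, and a single query joins across them without copying either dataset into a central lake. The file names, tables and records are invented for the example; this is not a real data virtualization product.

```python
import os
import sqlite3
import tempfile

# Create two independent "source" databases, stand-ins for siloed systems.
tmp = tempfile.mkdtemp()
hr_path = os.path.join(tmp, "hr.db")
grants_path = os.path.join(tmp, "grants.db")

hr = sqlite3.connect(hr_path)
hr.execute("CREATE TABLE staff (id INTEGER, name TEXT, office TEXT)")
hr.execute("INSERT INTO staff VALUES (1, 'Rivera', 'Denver')")
hr.commit()
hr.close()

gr = sqlite3.connect(grants_path)
gr.execute("CREATE TABLE awards (staff_id INTEGER, amount REAL)")
gr.execute("INSERT INTO awards VALUES (1, 250000.0)")
gr.commit()
gr.close()

# The "virtual fabric": attach both sources to one session and query across
# them in place, instead of migrating either dataset into a central repository.
con = sqlite3.connect(hr_path)
con.execute("ATTACH DATABASE ? AS grants", (grants_path,))
rows = con.execute(
    "SELECT s.name, s.office, a.amount "
    "FROM staff AS s JOIN grants.awards AS a ON a.staff_id = s.id"
).fetchall()
print(rows)  # [('Rivera', 'Denver', 250000.0)]
con.close()
```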

Data virtualization offers several advantages:

First, it provides a much more agile, less resource-intensive way of getting data into the hands of decision makers. For instance, it can give agency analysts the ability to aggregate views of various data sets, port selected data more easily into visual analytics tools and ultimately perform different predictive analytics exercises, all without having to physically assemble that data in temporary repositories.

Second, it gives different users, and more employees overall, the self-service ability to find and view the data in ways that are most relevant to their needs. Consider all the people working on COVID-19 data. You’ve got data scientists, who need to analyze what’s happened and come up with predictive models. You’ve got business analysts, who need to understand the evolving impact on agency operations as employees began working remotely. And you have public policy makers, who need to decide what steps to recommend to national leaders. They are all looking at the data in completely different ways. Data virtualization allows agency employees to spin up the needed data sets in the right format, at the right time, for the right persona, so the data is useful to those who need it.

Third, data virtualization can help agencies not only pivot faster around emergency management, but also quickly address other cross-agency issues that no single user might discern, such as sudden changes in supply chains or emerging instances of fraud — all in a matter of hours or days, not months.

Fourth and finally, virtualization also can go a long way in helping agencies streamline their federal data management strategies, by creating a uniform approach to accessing and utilizing federated data across their agency and across multicloud environments, without all the heavy lifting that commonly occurs using data lakes.

If the pandemic taught us anything, it’s that when time is of the essence, thinking outside the box — or in this case, beyond the data lake — is not only possible, but for many agencies, can also be more advantageous.

Mark Palmer is General Manager of Analytics, Data Science and Data Virtualization and Cuong Nguyen is Vice President of Federal Sales at TIBCO.

Find out more on how TIBCO can help your agency master and virtualize its data.

Air Force awards $630M contract to move weather forecasting to the cloud
https://fedscoop.com/air-force-saic-weather-forecasting-contract/ (Fri, 26 Jun 2020)
The Air Force issued a major contract to modernize its weather forecasting and shift to a "software-centric, cloud-based" approach.

The Air Force‘s 557th Weather Wing selected SAIC to develop and modernize “critical” hardware and software for advanced weather forecasting and move to a more cloud-based system.

The indefinite-delivery, indefinite-quantity contract has a ceiling of $630 million to work on the Technology Application Development and Sustainment (TADS) system for the unit. The contract includes functions like application development, software integration, application infrastructure, cloud migration, hardware, security and data management, according to a news release.

The Air Force wants to modernize its weather forecasting to better predict storms that could impact operations. Much of forecasting comes down to data management, which SAIC said it will improve through a more cloud-based approach.

“TADS will move the service to a software-centric, cloud-based approach to efficiently respond to mission needs, including disaster recovery and increased collaboration,” says the release.

The new systems that SAIC will deliver to the Air Force will allow for larger-scale data analysis and for implementing machine learning algorithms that make use of that data.

“Air Force warfighters need the latest and most accurate weather data to effectively carry out their mission, and TADS will provide them with innovative new capabilities to provide critical environmental situational awareness when and where they are needed,” said Bob Genter, SAIC executive vice president and general manager.

The key to organizing government data for faster decision-making
https://fedscoop.com/key-data-management-faster-decision-making-government-leaders/ (Wed, 10 Jun 2020)
Data experts highlight the essential components agencies need to master their data in a multicloud environment for more effective analysis and decision-making.

The Federal Data Strategy provides agency officials a framework to get more from their data. But agencies will need robust integration tools to master, not just manage, that information, say experts in a new report.

That includes automated tools to fully identify and catalog government data, as part of a data management strategy, so that agency and program leaders have greater assurance about the quality of the information they rely on to make decisions.


This challenge has been strikingly clear as the nation continues to adjust to the impact of the COVID-19 pandemic, says Michael Anderson, chief strategist for public sector with Informatica.

“When you look at all the predictions, all the analytics going into decisions on whether to shut down [businesses], when and for how long, it all depends on having clean, timely data, run through a decision model or an AI tool. If an organization is not set up and prepared to do that before a crisis hits, they’ll run into some of the problems many are having now,” he explains in a new report, produced by FedScoop and underwritten by Informatica.

While the government’s continuing migration to cloud services has given many agencies newfound capabilities, officials are still finding it difficult to locate, share and analyze reliable data quickly in order to make critical decisions affecting their constituents, says the report.

Cloud experts like Susie Adams, chief technology officer at Microsoft Federal, have seen how widely-distributed pools of information make it challenging for federal agencies to assemble their data in order to migrate workloads to the cloud.

She shares in the report how the need to find and collect datasets hinders the ability of agencies to take fuller advantage of high-powered cloud data analytic tools.

“When agencies start to investigate big data, artificial intelligence, machine learning and data analytics technologies to analyze very large data sets, one of the biggest challenges agencies have is that the datasets are distributed and stored in multiple disparate locations,” says Adams.

The report highlights two foundational competencies agencies need to establish in order to master their data.

The first, and most important, part of an overall data management program is having data governance in place. This helps an agency establish the ground rules for defining data and determining systems requirements and processes to ensure data quality. A key requirement for data governance, says Anderson, is having comprehensive data glossaries that standardize the formatting and meaning of data.

The second foundational component is having a robust and automated cataloging tool to properly identify, tag and process your data at scale, says Adams.

“Once your data has been properly cataloged, getting it migrated and then standing it up in the cloud can be pretty straightforward,” Adams shares.

Anderson compares the data cataloging challenge to finding a resource at the Library of Congress. “If you don’t catalog the books in a comprehensive, [automated] way — that takes advantage of embedded artificial intelligence and that will help you put in a data query and identify related datasets — you’ll likely overlook all kinds of meaningful information,” he says in the report.
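The report doesn't show what a catalog entry looks like, but a generic sketch of the idea, with invented fields rather than Informatica's or Microsoft's actual metadata models, is to tag each dataset with standardized glossary terms so that a single query surfaces related datasets across silos:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One dataset's metadata record (illustrative fields only)."""
    name: str
    location: str                 # where the data physically lives
    steward: str                  # who answers questions about it
    glossary_terms: set = field(default_factory=set)  # standardized business terms

catalog = [
    CatalogEntry("census_responses_2020", "s3://agency-lake/census/", "Data Office",
                 {"household", "response rate"}),
    CatalogEntry("field_ops_visits", "edw.field_ops.visits", "Field Operations",
                 {"household", "enumerator"}),
]

def find_related(term):
    # A query on a shared glossary term surfaces datasets across silos,
    # much like a well-cataloged library surfaces related books.
    return [entry.name for entry in catalog if term in entry.glossary_terms]

print(find_related("household"))
# ['census_responses_2020', 'field_ops_visits']
```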

“One of the reasons all of the leading cloud providers, including Microsoft, work with Informatica is the comprehensive array of data management tools that Informatica offers. Informatica’s experience working with large government enterprises for over two decades has also helped the company keep innovating,” says the report.

No fewer than five of Informatica’s enterprise management solutions are recognized as leaders in Gartner’s Magic Quadrant Reports.

Read more about data management tools that help agency leaders master data.

This article was produced by FedScoop and sponsored by Informatica.

How agencies can use data to maximize services while teleworking
https://fedscoop.com/data-management-to-maximize-agency-services-and-telework/ (Thu, 21 May 2020)
By using the aggregated data around service availability, employee performance and security, agencies can respond effectively in times of crisis, says a Splunk white paper.

As government workforces continue to operate away from the office, agency decision-makers need to leverage their data to help maximize services, employee productivity and security, according to a new white paper from Splunk.

“Splunk Solutions for COVID-19 Response” calls on agencies to take advantage of data management solutions to improve how government is responding to the coronavirus pandemic, as well as position themselves for success in a post-pandemic world.


“Splunk is helping organizations leverage their data during this crisis so they can respond in ways that can help them thwart the pandemic’s ill effects,” the white paper reads. “As a trusted provider of security, IT monitoring and mission analytics, our solutions are ideally suited to aggregate disparate data from any source, regardless of structure, in real-time and at scale.”

Through the use of the Splunk “Data-to-Everything” platform, agencies can gain real-time insights on a large variety of their data, including network availability and uptime, VPN connections and usage, and time-to-resolution for service requests.
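As a generic illustration of that kind of aggregation, written in plain Python over invented log records rather than Splunk's SPL query language, an agency might roll raw VPN and help-desk events up into the metrics named above:

```python
from datetime import datetime
from statistics import mean

# Invented records standing in for VPN logs and help-desk tickets.
vpn_sessions = [
    {"user": "a.chen", "minutes": 412},
    {"user": "d.okafor", "minutes": 388},
]
tickets = [
    {"opened": datetime(2020, 5, 18, 9, 0), "resolved": datetime(2020, 5, 18, 11, 30)},
    {"opened": datetime(2020, 5, 18, 13, 0), "resolved": datetime(2020, 5, 19, 9, 0)},
]

# VPN usage: total connected hours across the remote workforce sample.
vpn_hours = sum(s["minutes"] for s in vpn_sessions) / 60
print(f"VPN usage: {vpn_hours:.1f} hours")

# Time-to-resolution: average hours from ticket open to ticket close.
ttr_hours = mean((t["resolved"] - t["opened"]).total_seconds() / 3600 for t in tickets)
print(f"Average time-to-resolution: {ttr_hours:.1f} hours")
```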

As agencies deal with a continuing surge in remote work, leadership also needs to identify how to facilitate productivity among personnel. And beyond that productivity, agencies need to ensure the communication and work being done remotely is happening in a secure environment.

The report details a curated list of data management solutions to help facilitate that shift, such as:

  • Remote Work Insights Autobahn – delivers real-time analysis of connected devices and remote systems
  • VictorOps – an automated incident response management tool
  • Phantom – an orchestration and automation platform

This suite of solutions will help agencies onboard key data sources to monitor performance indicators, identify emerging issues and perform deep root cause analysis.

Agency leaders need to streamline their security posture, mitigate risk and expose hidden security and operational gaps that can make systems vulnerable. Splunk Security Essentials is a free app aimed at making security simpler, allowing users to validate data sources, capabilities, as well as to test and implement detections mapped to cybersecurity frameworks.

Going forward, to make these efforts successful, migration to a secure cloud environment is going to be key, the white paper says.

“[Cloud] is purpose-built to endorse flexibility and deliver secure access,” the white paper says. “As agencies migrate to cloud and hybrid locales, end-to-end operational visibility is essential before, during and after the transition to maintain insights into performance and address concerns related to infrastructure and application visibility.”

Using those data-driven insights into service effectiveness and employee productivity, agencies can map out how their initiatives will deliver on intended outcomes even beyond the current response to the coronavirus pandemic. In addition, using that data, leaders are able to make confident decisions while managing risk long-term.

Learn more about how your agency can gain additional insights into data and improve response in times of crisis.

This article was produced by FedScoop and StateScoop for, and sponsored by, Splunk.
