data analytics Archives | FedScoop
https://fedscoop.com/tag/data-analytics/

FedScoop delivers up-to-the-minute breaking government tech news and is the government IT community's platform for education and collaboration through news, events, radio and TV. FedScoop engages top leaders from the White House, federal agencies, academia and the tech industry both online and in person to discuss ways technology can improve government, exchange best practices and identify how to achieve common goals.

Mud, sweat and data: The hard work of democratizing data at scale
https://fedscoop.com/mud-sweat-and-data-the-hard-work-of-democratizing-data-at-scale/
Thu, 05 Jan 2023 01:30:00 +0000
Effective decision-making starts with managing and analyzing surges of data across agency leaders, teams, and missions.

Rick “Ozzie” Nelson is the Senior Vice President and General Manager, Public Sector at MicroStrategy. He oversees public sector data analytics and business intelligence solutions that enable inventory and asset management, financial and workforce planning, and mission and operational readiness.

Rick “Ozzie” Nelson, Senior Vice President and General Manager, Public Sector, MicroStrategy

Data is the backbone of every business decision today. However, organization leaders need to ask themselves two key questions: Are the right people getting the right information to make decisions? And are they empowered with the knowledge needed to discern critical insights from their data and make those insights actionable?

Organizations are still struggling with these objectives. According to a survey by Ventana Research, most organizations (72%) estimated that half or less of their workforce uses the available analytics tools when needed. That’s despite advances in business intelligence platforms that make gathering, analyzing and distilling information more accessible than ever to today’s workforce. The stakes are only getting higher when it comes to democratizing enterprise data. By 2025, data creation is forecast to surpass 180 zettabytes, more than double the 2020 amount. And the business intelligence market—which helps transform data into actionable insights—is experiencing similarly explosive growth, with annual sales projected to reach $43 billion by 2028.

Keeping pace with all that data and up-and-coming analytics tools is a strategic imperative for avoiding disruptions to operations and making better-informed tactical business decisions. For public sector leaders, taking a modern approach to business intelligence translates into better-executed missions. That includes adopting BI platforms that can:

  • facilitate faster decisions with action-oriented workflows,
  • offer the ability to control multi-source data at an enterprise scale,
  • improve agility with reusable object models,
  • and provide greater insight accessibility.

Broadening BI’s power

But the value and promise of business intelligence can only benefit organizations if the right users have access to the data and analytics processes. That means not just analysts and C-suite decision-makers, but also those tasked with managing and monitoring the front-line tasks that drive an organization’s mission.

One challenge common to private as well as public sector organizations is the need to break down data silos. In the past, the primary users of BI tools and applications were the IT departments. But as BI tools have become more interactive, intuitive, and user-friendly, new levels of data integration and collaboration between IT and business owners have become possible. Those advances are also fueling greater data democratization by unearthing insights historically trapped in data silos.

The experts at Ventana Research write, “Adopting user-friendly techniques such as natural language processing and augmented intelligence help deliver insights that are easier to understand. And by generating insights automatically and delivering them in ways that are easily accessible and understood by the workforce, it will increase the benefits of the data an organization collects and processes.”

Access to timely and reliable data and analytics is critical to the success of any enterprise—whether it’s in finance, contracting, operations, or logistics and supply chain. Organizations can no longer afford to cultivate an inner circle of data experts; instead, they need to cultivate expertise throughout their entire workforce.  

Shortening the hiring cycle

We’ve seen how the use of real-time data and available AI tools can transform decision-making in a variety of applications. One example, with wide applicability, involves federal hiring practices. Hiring talent for mission-critical positions can take many months, and vacancies can overwhelm staff and jeopardize mission delivery.

That was the case for a federal law enforcement agency, which at one point faced more than 1,000 vacancies across different job functions. So the agency’s Human Resources department leveraged MicroStrategy’s BI platform and an ML prediction model that forecast vacancies and how long it would take to bring new employees on board. Supervisors were then given ownership of the data to forecast staffing plans, along with analytic tools to identify the factors driving better decisions. Within a year, vacancies were reduced by 80%.
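
The article doesn’t detail how that prediction model was built, but the forecasting step itself is easy to picture. Below is a minimal sketch of this kind of onboarding-time forecast using scikit-learn; the file, columns and features are hypothetical stand-ins, not the agency’s actual setup.

```python
# Minimal sketch of an onboarding-time forecast (illustrative only;
# not the agency's actual model). File and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

hires = pd.read_csv("hiring_history.csv")  # one row per previously filled vacancy

# One-hot encode categorical features describing each position.
X = pd.get_dummies(hires[["job_series", "grade", "duty_location", "clearance_level"]])
y = hires["days_to_onboard"]  # days from job posting to start date

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = GradientBoostingRegressor().fit(X_train, y_train)
errors = abs(model.predict(X_test) - y_test)
print(f"Mean forecast error: {errors.mean():.0f} days")

# Supervisors could then score open requisitions to plan staffing ahead of need:
# expected_days = model.predict(features_for_open_requisitions)
```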

Thumbs-up all around

In November 2022, MicroStrategy obtained FedRAMP certification, which means more opportunities for agencies to benefit from our cloud-based analytics platform. The certification will allow us to continue helping government customers make data-driven decisions on a larger scale by leveraging the latest offerings of BI platforms and tools.

But the real benefit comes when organization leaders put data and analytics tools into the hands of everyone in the workforce, from the office to the field, and teach them the best data practices for improved mission capabilities. One federal agency said in a MicroWorld public sector webinar that “[data] has become so ingrained in our business operations that we now have product managers to serve as the bridge between the technology, the training, and the people leveraging that data to ensure that it’s used strategically and delivers the best business outcomes.”

Another example involved managing the flow of daily reports at the Transportation Security Administration. Using MicroStrategy’s BI platform, TSA was able to deliver more than 45,000 reports per day to more than 40,000 users. “It supports a full suite of reporting styles, and we find that’s very helpful in being able to provide airport operations data where and when it’s needed,” a TSA executive said in a recent webinar. “There’s just so much information in the tool—we have recognized over $18 million in cost savings and avoidance each year using it.”

On top of FedRAMP certification, MicroStrategy’s enterprise analytics platform earned top marks from Gartner analysts this past May for its outstanding reporting, security, governance and catalogs, and enterprise analytics capabilities, scoring well above the market average in all four areas. Analysts noted MicroStrategy’s governance capability was powerful for telemetry, usage analytics and facilitating the promotion of certified content; for security, MicroStrategy “offers best-in-class auditing, authentication and authorization.”

Learn more about how we can help your organization unlock the business value of your data at World23. This four-day conference features keynote events, 80+ sessions on enterprise and embedded analytics, and education courses for platform users.

How data analytics helps law enforcement shave time off investigations
https://statescoop.com/how-data-analytics-helped-a-california-police-department-shave-a-year-off-an-investigation
Fri, 28 Oct 2022 23:31:00 +0000
Law enforcement agencies are discovering how Splunk’s data analytics platform can fast-track digital investigations, according to a new report.

Census Bureau shares analytics insights, not data, in pilot with IRS
https://fedscoop.com/census-bureau-irs-analytivs-pilot/
Tue, 25 Oct 2022 02:13:34 +0000
The Census Bureau didn’t want to share publicly provided survey data with the IRS, lest it be used to collect taxes.

The Census Bureau found a way around sharing its data with the Treasury Department while still helping ensure underrepresented groups receive timely tax refunds: sharing its analytics insights instead.

As part of an ongoing pilot, Treasury had the IRS provide the bureau with data it used to build a model of race and ethnicity at the micro and person level. The bureau’s model was then matched with other IRS data to uncover biases.

The Census Bureau didn’t want to share publicly provided survey data with the IRS, lest it be used to collect taxes, but recognized that tax data lacks the race and ethnicity codes the bureau uses.

“We’re not sharing the data,” said Ron Jarmin, deputy director of the Census Bureau, at the ACT-IAC Imagine Nation ELC 2022 on Monday. “But we’re sharing insights from the model that’s based on our data, that makes their data model much more powerful.”

Not only can Treasury verify equitable distribution in the timing of tax refunds, but it can compare the timing of check mail deliveries with that of direct deposits.
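
Neither agency has published the pilot’s code, but the pattern of sharing insights rather than data can be sketched in a few lines: one side contributes only model predictions, while the records the model was trained on never leave the building. Everything below is hypothetical and illustrative, not the actual Census or IRS implementation.

```python
# Illustrative sketch of "share insights, not data" (hypothetical names;
# not the actual Census Bureau/IRS pilot).
import pandas as pd

def predict_group(tax_ids: pd.Series) -> pd.Series:
    """Census-side stand-in model. Only its predictions cross the agency
    boundary; the protected survey records it was trained on never do."""
    return tax_ids.map(lambda t: "group_a" if t % 2 else "group_b")

# IRS-side refund records, which carry no race or ethnicity codes.
refunds = pd.DataFrame({
    "tax_id": [101, 102, 103, 104],
    "delivery": ["mailed_check", "direct_deposit", "mailed_check", "direct_deposit"],
    "days_to_refund": [28, 9, 31, 11],
})

# Enrich with the model's insight, then compare refund timing for equity
# gaps, including mailed checks versus direct deposits.
refunds["group"] = predict_group(refunds["tax_id"])
print(refunds.groupby(["group", "delivery"])["days_to_refund"].mean())
```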

Meanwhile the IRS and other agencies can be confident the Census Bureau won’t release the findings, Jarmin said.

“We’re hoping that this is something that we’ll be able to do with other state, federal and local government agencies to help them,” he added.

The Census Bureau has used administrative data from other agencies for decades, but now it’s looking to use unstructured data. For instance, the bureau is trying to use transaction-level data from retailers for more timely, granular sales statistics to improve its own price analyses, Jarmin said.

“If we can do more accurate statistics using the sort of unstructured, transaction-level data that gets generated every time you purchase something, we should be able to produce far better statistics,” Jarmin said. “And maybe stop doing surveys every month of retailers across the country.”

DOD to roll out new online marketplace to speedily buy AI-aligned tech
https://fedscoop.com/dod-to-roll-out-new-online-marketplace-to-speedily-buy-ai-aligned-tech/
Fri, 02 Sep 2022 18:16:08 +0000
It's envisioned to serve as the Pentagon's "digital environment of post-competition, readily awardable, technology solutions."

The Pentagon’s Chief Digital and Artificial Intelligence Office (CDAO) is preparing to launch a new one-stop online “marketplace” to solicit, evaluate and curate technologies specifically associated with AI, machine learning, data and analytics — and also enable Defense Department components to rapidly buy those digital capabilities, according to a recently published special notice. 

With its partners in the Army Contracting Command-Rock Island and the Indiana Innovation Institute (IN3), the CDAO is targeting the first quarter of fiscal 2023 to “go-live” with the minimum viable product of this new “Tradewind Solutions Marketplace.”

Between now and Sept. 30, officials involved are crowdsourcing suggestions from industry, academia and government agencies on the concept and framework underpinning that hub and how such organizations could help “shape” it.

After that date, some comments received may be shared publicly. But those behind the emerging vendor space also plan to continue to collect feedback and engage interested parties throughout the existence of the Marketplace initiative.

In early 2021, DOD’s former Joint Artificial Intelligence Center (JAIC) — which was recently absorbed into the CDAO — and the Army awarded an Other Transaction agreement to IN3 to create a business process and online environment called “Tradewind” that would drive more efficient AI acquisitions for the U.S. military. Last month, FedScoop reported that the partners recently updated the tradewindai.com website and that it is being tested as a modern channel for announcing new AI-aligned challenge competitions.

On Aug. 26, the Tradewind Solutions Marketplace announcement was posted there and on the SAM.gov federal government contracting site. Both link to the Tradewind Exchange Challenge Summary landing page where responses are to be submitted.

Specifically, officials want feedback on how they might best provide a venue where defense and military insiders can search for the technologies of interest, and a single location to interact with external organizations that can deliver them through an established rapid contracting pathway.

A 10-page “teaser” draft of the marketplace open call, attached to the notice, also details the evolving (and subject-to-change) notions informing the making of this new online shop, as well as some information about how it will operate.

“The Solutions Marketplace serves industry and academic organizations by providing a forum to showcase relevant research, products, and services to prospective government customers, and serves DOD by providing a forum to access data, analytics, digital and AI/ML solutions and rapidly ingest game changing technology solutions,” the document said.

Envisioned as “a digital environment of competed video pitches,” the marketplace will be designed as a venue for customer organizations “to search, view, review, compare, contrast, contact, negotiate, and procure data, analytics, digital and AI/ML” technologies.

The overarching idea is that once video pitches of capabilities pass a deep assessment to ensure compliance with federal requirements and are approved for the marketplace, they will be made available for funding via Other Transaction agreements or procurement contracts.

“Thus, the Tradewind Solutions Marketplace serves as the DOD’s digital environment of post-competition, readily awardable, technology solutions,” the document said.

Video solution pitches will have to address one or more topics on a list of strategic focus areas that will also change over time based on the Pentagon’s needs.

Initially, those areas are: improving situational awareness and decision-making, increasing safety of operating equipment, implementing predictive maintenance and supply, streamlining business processes, assuring cybersecurity and discovering Blue Sky technology applications.

The latter essentially refers to future-facing domains where “real-world” applications are not immediately apparent. The CDAO’s press office did not provide further information by FedScoop’s deadline about what capabilities those involved want in that case.

Transportation Command developing new dashboard for better data fusion
https://fedscoop.com/transportation-command-developing-new-dashboard-for-better-data-fusion/
Fri, 19 Aug 2022 01:11:48 +0000
The Global Mobility Nodal Posture Dashboard is expected to provide a quick, real-time view of Transcom's workloads worldwide.

SCOTT AIR FORCE BASE, Ill. — With support and resources from the Pentagon’s nascent Chief Digital and Artificial Intelligence Office (CDAO), U.S. Transportation Command is developing a new dashboard to advance data-informed decision-making and provide a better common operating picture.

Transcom is a functional combatant command responsible for the Defense Department’s integrated global mobility operations via land, air and sea in times of peace and conflict. The CDAO, established in December by Deputy Secretary of Defense Kathleen Hicks, is meant to enable a stronger foundation for data analytics and AI-enabled capabilities to be developed and deployed at scale across the department.

During a visit to Transcom on Thursday, Hicks heard firsthand how information gleaned from the CDAO is enabling members of the command to better apply data and predictive analytics when carrying out its missions.

“This is about an end-to-end movement from a depot on a train or a truck to an airport or seaport to an airport or seaport — linking up with a theater sustainment command,” Transcom Commander Gen. Jacqueline Van Ovost said in a briefing with Hicks.

Hicks said: “This is where the data fusion has a significant amount of promise to give everybody a common picture book to be able to see ourselves and sort of plan ahead with what we see.”

Officials at Transcom are now developing a new tool called the Global Mobility Nodal Posture Dashboard to provide a quick, real-time view of the organization’s workloads worldwide.

“This is an upgrade from our current [assessment tool], which is only able to be updated weekly,” Brig. Gen. Charles Bolton, chief of the command’s global operations center, said during a briefing. “And the new dashboard will provide details and enable additional analysis compared to the current system.”

Additionally, the dashboard will be able to drill down into targeted combatant commands to focus on what Transcom is providing to them.

“You could drill down to a specific airport or seaport to see what’s going on there and display the [cumulative] efforts of individual force movements,” Bolton said.

He and his team are working with CDAO components to gain access to datasets necessary to build their envisioned solution. Next steps will involve curating and fully making sense of all that data to provide the best, overarching views via the dashboard.

At Transcom’s global operations center, officials highlighted one of the key transportation hubs being used to send weapons from the United States to Europe for Ukrainian forces — and they displayed the posture, capacity and throughput at that individual node. New data analytics capabilities are expected to provide additional tools to support planning efforts for mobility operations.

“This rapid notification and nodal flexibility will be critical during contested logistics environments. While these functions and datasets are still in the works, due to the iterative development of the dashboard, we believe it will be extremely beneficial to Transcom and … our decision-makers in the future,” Bolton said.

AI task force for Navy surface fleet devising comprehensive data catalog
https://fedscoop.com/ai-task-force-for-navy-surface-fleet-devising-comprehensive-data-catalog/
Wed, 03 Aug 2022 16:37:54 +0000
It’s one part of a broad, federated model the Naval Surface Force is applying to accelerate AI adoption.

A Navy task force formed to operationalize artificial intelligence and machine learning across the surface fleet is steering efforts to fix data and lay the foundation for associated emerging technology applications in the near future. 

Effective AI depends on data that is “clean,” or cohesive enough to build algorithms on. Naval Surface Force components, which equip and staff warships ahead of their deployments to respective fleet commands for military operations, create and use heaps of data — but currently, it’s all pretty messy from an organizational perspective.

“Our data landscape is so vast and complex. There’s no common data ecosystem, no data catalog, and not enough clean data,” Task Force Hopper Director Capt. Pete Kim told FedScoop in an interview July 29.

Kim has led the task force since it was launched last summer to drive AI capabilities across the surface force. He and his team have made progress shaping a nascent approach, strategy and implementation plan to guide that wide-ranging effort. As those documents are now being prepared for release, the task force is also working to engineer and refine a digital and conceptual hub that makes sense of the organizations’ multitudes of data and helps personnel better analyze and apply it for AI and ML.

“As you can imagine, it’s pretty challenging to get this infrastructure right,” Kim said. “I think it’s because of the nature of our different security classifications, roles and environments. It’s not as easy as, like, getting an app on your iPhone and doing the updates quickly.”

‘Cracking that code’

Kim now heads both the Surface Analytics Group (SAG) and Task Force Hopper. The experience has been eye-opening.

Capt. Pete Kim, then commanding officer of Ticonderoga-class guided-missile cruiser USS Princeton (CG 59), uses the ship’s general announcing system to speak to the crew, Aug. 14, 2020. (U.S. Navy photo by Mass Communication Specialist 2nd Class Logan C. Kellums)

“I’ve always been in the operational fleet — so, the one providing the data — and I didn’t realize how much data we have in the Navy that is not exploited,” he explained. “I think we tend to look for the data that we need to answer the mail and things like that, but until the last several years, I don’t think we’ve had the capability to really process big data. And we’re doing that now. So, that’s probably the coolest part.”

Still, there’s a long way to go before the service’s ambitious aims of applying AI and ML on a large scale are completely realized.

An initial priority for Task Force Hopper is to help the surface force pinpoint and clean data, so the separate parts of the sprawling enterprise can collectively take full advantage of it. To help demonstrate the vast scope of information, Kim noted that the SAG concentrates on readiness-related data.

“The spectrum is having data from authoritative databases, where it’s very structured data and we’re pulling in all the unique datasets that we need — all the way down to being on someone’s desktop or on a shared drive hidden away somewhere that you’ve got to find the right person or the manager to get to that data,” he said.

In addition to challenges around data availability, quality and governance, Kim noted that technology-centered work in the Navy has traditionally been organized and structured based on platforms and supporting program offices. But AI development cuts across many different stovepipes and organizations. 

“That’s why this federated model is so key in cracking that code,” he explained. 

That nascent approach he alluded to was recently conceptualized by the task force and will be detailed in a soon-to-be-released data and AI strategy and implementation plan. The overarching idea is to have more centralized data governance and a one-stop data catalog — combined with “decentralized analytic and AI development nodes at different places in the enterprise,” where personnel know and use data best.

Each node will focus on certain categories associated with artificial intelligence and machine learning, like maintenance or lethality. The SAG, for instance, is considered an AI node focused on readiness.

“I think every node is going to be a little bit different, and that really depends on the problem set, the use case. And then, again, what’s the state of the data?” Kim said.

If nodes have high-quality, mature datasets, they’ll likely be developing AI models pretty quickly. But if they start near or from square one, they’ll probably have to spend more time on data collection, cleaning and labeling in the first portion of the journey.
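
That early stretch of work is mundane but concrete. The sketch below shows the flavor of a first cleaning pass using pandas; the dataset and column names are hypothetical, not the surface force’s actual data.

```python
# Illustrative first-pass cleaning of the kind a new AI node might run
# (hypothetical file and columns, not actual Navy data).
import pandas as pd

readings = pd.read_csv("equipment_readings.csv")

# Normalize inconsistent labels collected from desktops and shared drives.
readings["hull_number"] = readings["hull_number"].str.strip().str.upper()

# Drop exact duplicates introduced by copies of the same spreadsheet.
readings = readings.drop_duplicates()

# Flag rows missing fields that any downstream model would need.
required = ["hull_number", "sensor_id", "timestamp", "value"]
readings["usable"] = readings[required].notna().all(axis=1)

print(f"{readings['usable'].mean():.0%} of rows are model-ready")
```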

“I know that data management is not the sexiest topic, but we do believe this is one of the significant leaps to accelerating AI and ML in a large organization,” Kim added.

Entering a new era

Task Force Hopper is named in homage to the trailblazing computing pioneer and former naval officer Grace Hopper, who reached the rank of rear admiral (lower half) before her retirement.

A group of key AI and data stakeholders across the surface force — one of the Navy’s largest enterprises — has been meeting on a biweekly basis over the past year or so. Kim said they’ve kicked off “that data governance process” and are identifying many datasets for their respective realms to prioritize. 

Crafting clearly defined use cases for the surface force’s many sources of data is also presently top of mind for Task Force Hopper.

“When it comes to analytics and AI, we’re kind of entering a new era where you have to have the operator, the warfighter, or the maintainer involved in every step of the development,” Kim said. “I think this is a departure from the past where we just give requirements to some contractor and then they come back in two years with the product.” 

In his view, the task force and SAG are seeing success from “having the right subject matter experts sitting side by side with data scientists, with AI model developers to produce really valuable products.”

Task Force Hopper has also made headway in working with the Navy’s office of the chief information officer, according to Kim, to apply a platform called Advana-Jupiter as its common development environment. 

“It’s got data-warehousing tools and all the applications you need to visualize the data and create AI models,” he explained. “We’re using that platform as a place to have a single catalog so that if folks are working on a project and they’re looking for certain datasets to move forward, they’re not stalled because they can’t find it or it’s unavailable.”

As one, evolving piece of Advana, the Pentagon’s bigger enterprise data hub, Jupiter will enable surface force members to seamlessly access data — and then build AI and ML algorithms informed by it.

“On the readiness side, we’re looking towards predictive and prescriptive maintenance to sustain our ships and increase reliability at sea,” Kim said.

Another readiness node priority area is condition-based maintenance. “As we start employing unmanned surface vehicles, we’re going to need those types of CBM models to support those vessels at sea, since they won’t have maintenance personnel onboard,” Kim noted.
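
The article doesn’t describe the models themselves, but at its simplest a condition-based maintenance model watches sensor streams for drift away from a learned healthy baseline. Here is a minimal sketch of that idea; the sensors, files and thresholds are hypothetical.

```python
# Minimal condition-based maintenance sketch: flag equipment whose sensor
# readings drift beyond a learned baseline. Data and thresholds hypothetical.
import pandas as pd

history = pd.read_csv("pump_vibration_history.csv")  # known-healthy operation
latest = pd.read_csv("pump_vibration_latest.csv")    # most recent readings

# Learn a per-pump baseline from healthy history.
baseline = history.groupby("pump_id")["vibration_mm_s"].agg(["mean", "std"])

# Score each pump's latest average reading against its own baseline.
current = latest.groupby("pump_id")["vibration_mm_s"].mean()
zscores = (current - baseline["mean"]) / baseline["std"]

# More than 3 standard deviations out earns a maintenance flag, the kind of
# alert an unmanned vessel could send ashore with no maintainers onboard.
for pump_id, z in zscores.items():
    if z > 3:
        print(f"{pump_id}: vibration {z:.1f} sigma above baseline, schedule maintenance")
```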

He added that while Jupiter does not need to host every single dataset, “that’s where we want to catalog it so that if someone’s working on a project, it’s like a menu” where they can see the point of contact and details on the data.

“We’re going to use Advana-Jupiter as that platform where we can kind of integrate different datasets, because as we start building more advanced AI models, it’s not just going to be one sensor data source, it’s going to be multiple things,” Kim said.
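
A catalog entry in that spirit needs little more than what a dataset contains, where it lives and who owns it. The sketch below is a hypothetical illustration of such a menu, not Advana-Jupiter’s actual schema.

```python
# Hypothetical sketch of "catalog, don't necessarily host": each entry is a
# menu item pointing to a dataset, its owner and its details. This is not
# Advana-Jupiter's actual schema.
from dataclasses import dataclass

@dataclass
class CatalogEntry:
    name: str
    description: str
    point_of_contact: str  # who to ask about access
    location: str          # where the data actually lives
    classification: str
    update_cadence: str

catalog = [
    CatalogEntry(
        name="engine_vibration_telemetry",
        description="Shipboard vibration sensor readings, per hull",
        point_of_contact="readiness-node@example.mil",  # placeholder
        location="s3://surface-readiness/telemetry/",   # placeholder
        classification="UNCLASSIFIED",
        update_cadence="hourly",
    ),
    CatalogEntry(
        name="casualty_report_history",
        description="Historical equipment casualty reports for surface ships",
        point_of_contact="readiness-node@example.mil",  # placeholder
        location="jupiter://readiness/casreps",         # placeholder
        classification="CUI",
        update_cadence="daily",
    ),
]

# A project team browses the menu to line up the multiple sources an
# advanced model will need.
for entry in catalog:
    print(f"{entry.name}: {entry.description} (POC: {entry.point_of_contact})")
```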

A key goal for the task force is to help the surface force become AI-ready by 2025. 

“I think with new technology, you always feel like you’re behind. That’s why we’re putting so much brainpower behind this. But as you know, having that high-quality dataset, the tools, the right people for the project — I mean that’s like 80% of the journey. So, if we get that infrastructure part right, the last 10% of producing this widget or what have you is the easy part,” Kim said. “And we can really partner with industry to really leverage the tech that’s out there and develop these unique tools that we need.”

Naval Surface Force plan to accelerate AI adoption expected ‘in the next few weeks’
https://fedscoop.com/naval-surface-force-plan-to-accelerate-ai-adoption-expected-in-the-next-few-weeks/
Fri, 29 Jul 2022 21:54:12 +0000
A Navy official provided FedScoop with a preview of the forward-looking guide.

Naval Surface Force, U.S. Pacific Fleet has reached the final phase of drafting its first-ever, overarching data and artificial intelligence strategy and implementation plan — and aims to share those resources more broadly before this summer ends, a key task force leader told FedScoop on Friday.

Naval Surface Force organizations perform a variety of administrative, maintenance, workforce and operational training functions and help equip and staff Navy warships before they are deployed to the respective fleet commands for military missions. As one of the Navy’s largest enterprises, the organizations capture, produce and rely on massive amounts of data.

About a year ago, the Navy created Task Force Hopper to drive the complex digital infrastructure and cultural transformation needed to deliver AI-enabled capabilities across the surface force.

“We understand that AI is the most powerful decision engine for the readiness and sustainment of our ships and for warfighting — and we wanted to see, as an enterprise, how we can best accelerate this effort. This is about AI adaptation,” Task Force Hopper Director Capt. Pete Kim told FedScoop in an interview.

Kim has served in that role since the task force’s inception, and also leads the Surface Analytics Group. 

To him and his team, the task force represents “a broad enterprise approach about hiring the right digital talent, making sure our teams have the right development platforms for excellent data exploitation and creating new processes for the force.”

As such a sprawling enterprise, the surface force has many projects and initiatives within the Navy and in collaboration with academia and industry, resulting in siloed but duplicative efforts and gaps in transparency, among other issues. 

“We all know that AI is totally dependent on high-quality and accurate data. We believe that data management is the most important discipline for our era, and that’s what we wanted to focus on. So, one of our first initiatives was to draft a data AI strategy and implementation plan for the force to establish that structure,” Kim explained.

That in-the-making document will be unclassified, “but the audience is really the surface enterprise,” he added, suggesting that only a summary may be publicly disseminated.

“Right now, it is in the final stages of drafting. We hope to push that out to the enterprise here in the next few weeks,” the director said.

Broadly, the strategy and accompanying plan will detail why the Naval Surface Force is taking the approach that it is, and include directions for how officials will support the chosen framework.

“It really focuses on making data AI-ready across the board — and so it gets into data management, data governance, digital talent, ensuring that we’ve got clearly defined use cases,” Kim noted.

The strategy anchors on what he termed a federated model, with a number of different supporting organizations.

“It’s about central data governance, with a central data catalog and then having these decentralized analytic and AI development nodes at different places in the enterprise, where people know the data the best,” Kim said. 

While nodes at one military location, or associated with one team, might focus on maintenance, others could concentrate on staffing or other needs. Kim added that the nodes will hone in on different categories of AI, depending “on the use case and what products and models we’re trying to build.”

Offering two examples of nodes aligned with readiness, he pointed to the Surface Analytics Group that he runs, which assists with “the force generation of about 168 warships,” and a separate group that zeroes in on operational safety and risk indicators. 

“Readiness is our focus here at the [Naval Surface Force]. Program offices and warfare centers that are, let’s just say, focused on lethality or warfighting, we expect those organizations to have nodes working on their specific areas and use cases,” Kim said. “Then the follow on here is, again, that common development environment where we as an enterprise have transparency on all the different projects that are going on — and then we can leverage each other’s works.”

Once the strategy and implementation plan are released, next steps will prioritize empowering each of the AI nodes.

“These nodes are going to support all the priority projects. So, it’s going to be about, ‘Hey, do you have the right tools in this development environment? Do you have the right digital talent to move out on this? Are there different datasets that you need that the rest of the enterprise can help you out with?’ So, we’ll really focus in on the select nodes to support those priority projects,” Kim said.

Cryptocurrencies compound federal efforts to curb federal fraud
https://fedscoop.com/cryptocurrencies-compound-federal-efforts-to-curb-federal-fraud%ef%bf%bc/
Fri, 13 May 2022 12:35:00 +0000
FBI, Secret Service, USCIS and National Security Council officials describe the unprecedented scale of federal financial fraud, and data tactics to prosecute it.

The volume and velocity of criminal activity siphoning taxpayer dollars from federal programs, and criminals’ use of cryptocurrency to hide their efforts, have reached stunning levels, especially during the COVID-19 pandemic, a group of high-level federal officials said at a law enforcement and public safety technology forum this week.

“After 30 years of law enforcement, and 20 years of [investigating] complex fraud, I’ve never seen anything of this magnitude,” said National Pandemic Fraud Recovery Coordinator Roy Dotson, special agent in charge at the U.S. Secret Service.

Dotson continued: “Obviously, the magnitude was high, but [we saw] just how pervasive it was when there were YouTube tutorials telling you how to apply for unemployment — if you live in Florida, how you can get it in Washington; or get a [Paycheck Protection Program] loan if you don’t have a business,” he said. “I don’t know how many people I talked to that just said, ‘I thought it was free money.’”

While he credited Congress and the federal government for trying to help Americans by trying to get relief funds out quickly during the pandemic, he stated plainly, “the vetting process wasn’t ready for it.”

Carole House, the White House National Security Council’s director for cybersecurity and secure digital innovation, echoed those observations, citing government estimates of “north of $100 billion” in pandemic fraud. “The scale is devastating,” she said. House, who previously served as senior cyber and emerging tech policy officer at the Treasury Department’s Financial Crimes Enforcement Network (FinCEN), said she was stunned by “the egregious amount of fraud” she saw, with people using federal relief funds to buy million-dollar homes.

“The tactic that interested me the most was the targeting of virtual meetings,” she told an audience of more than 250 federal, state and local law enforcement and public safety officials attending an AFCEA Bethesda forum at the National Press Club. “Criminals were compromising virtual platforms, using deep fakes to basically put up an image of the CEO to get to the person controlling the finances, [directing them] to send some payment to criminally controlled accounts.”

Following the money

Having the tools and manpower to follow the money has grown increasingly important for law enforcement agencies, both because of the dramatic growth in digital evidence that must be collected and because of the rising reliance on cryptocurrencies to launder the flow of money, said Steven D’Antuono, assistant director in charge at the FBI.

Criminals are deceiving people into “putting money in normal bank accounts, and then immediately siphoning it off into crypto, which we can trace eventually, it just takes a lot more work for us to do,” said D’Antuono. “And if anyone knows anything about the financial kill chain, we need to know the information quickly, so that we can go through SWIFT,” the global financial messaging service, to intercept the funds. Criminals are transferring from one coin to another to launder their money, “so there are a lot of challenges trying to trace those illicit funds,” he said.

“Even your most basic investigations today utilize cryptocurrency,” noted Dotson, saying the conversion of cash to cryptocurrency now occurs in even the smallest white-collar crimes.

At the same time, the expanded use of digital analytics, the cloud and new tools for spotting aberrations in financial transactions is giving federal law enforcement officials “a whole new mechanism for [identifying] vulnerabilities that are being exploited,” said House. But it’s also helping law enforcement investigators identify “members for disruption and potentially for attribution and understanding what the transnational criminal networks are popping up in these ecosystems,” she explained.

The fact that the White House is standing up an interagency task force on COVID-19 pandemic relief fraud, and designating a special prosecutor, is a sign of how seriously federal officials view the issue, House said.

She also revealed that the White House is “actively working on” an executive order expected to be published “in the coming weeks” aimed at combatting and preventing “identity theft and fraud in government relief programs and other government benefit programs.”

Speaking to the topic of public safety, Matthew Emrich, associate director of fraud detection and national security at U.S. Citizenship and Immigration Services, noted that fraud also continues to occur on a human dimension, involving “a large number of perpetrators and a large number of victims.” Some see it as victimless, or just another version of white-collar crime, he said, “but it can involve some pretty serious consequences for people.”

Postal inspectors’ digital intelligence team sometimes acted outside of legal authorities, report says
https://fedscoop.com/postal-inspectors-digital-intelligence-team-sometimes-acted-outside-of-legal-authorities-report-says/
Wed, 30 Mar 2022 17:40:37 +0000
USPS's internet analytics team occasionally used open-source intelligence tools beyond postal inspectors' law enforcement authorities, according to a watchdog.

An internet intelligence and analytics support team for postal inspectors overstepped its legal authority in some cases, according to the inspector general for the U.S. Postal Service.

The Analytics Team, known until April 2021 as the Internet Covert Operations Program (iCOP), occasionally used open-source intelligence tools beyond the Postal Inspection Service’s legal authorities, and its record-keeping about some of that activity was inadequate, according to the March 25 report by the Office of the Inspector General for the USPS.

As part of their work assisting postal inspectors, the analysts conducted “proactive searches” for publicly available information online that could help root out postal crimes, the report says, but in some cases they used keywords that did not have a “postal nexus” — that is, “an identified connection to the mail, postal crimes, or the security of Postal Service facilities or personnel.”

Postal inspectors told the IG’s office that the keywords — such as “attack” or “destroy” — were meant to provide broad searches that could then be narrowed to a postal nexus. The IG report says the Postal Service’s Office of Counsel should have been more involved in vetting those search terms. Yahoo News first reported on the existence of iCOP in April 2021.

The IG office said it looked at a sampling of cases in early 2021 to reach its conclusions about the keywords. For other areas, it reviewed information available from October 2018 through June 2021. The report says it reviewed 434 instances where postal inspectors asked for analytical support from the team. Most of those — 72 percent — had a postal nexus.

The IG’s office also said postal inspectors should do more to document the process for requests made of the Analytics Team.

Leaders of the Postal Inspection Service said they “strongly disagree” with the specifics of the report, pointing to examples in federal case law that support its use of the Analytics Team and broadly authorize the kinds of activities cited by the IG’s office.

The IG’s office, in turn, noted that postal inspectors have agreed to many of the report’s recommendations for how the inspector-in-charge for analytics and the Inspection Service’s chief counsel can clarify the process for usage of open-source intelligence and bolster the record-keeping for those tasks.

“Therefore, the OIG considers management’s comments generally responsive to the recommendations in the report,” the IG’s office said.

The report lists several contracts that postal inspectors have with providers of open-source intelligence tools, but redacts the names of the specific companies. The covered activities include:

• Cryptocurrency blockchain analysis.
• Tools for gathering information about internet protocol (IP) addresses.
• Facial recognition tools.
• Monitoring social media for certain keywords.
• Searching social media for information about individuals.

As the IG’s report notes, the Analytics Team is part of the Postal Inspection Service’s Analytics and Cybercrime Program, which “provides investigative, forensic, and analytical support to field divisions and headquarters.”

Postal inspectors are sometimes involved in high-profile cybercrime cases, such as takedowns of dark web markets where customers pay in cryptocurrency for illicit goods that are then shipped through the mail.

Why data analytic platforms hold the key to smarter cloud investments
https://fedscoop.com/why-data-analytic-platforms-hold-the-key-to-smarter-cloud-investments/
Wed, 23 Feb 2022 20:30:49 +0000
Platforms that observe and analyze data across hybrid cloud environments can help agencies better decide what information should remain on-prem or move to the cloud.

As federal IT leaders continue to assess how best to manage their data and applications in the cloud and on-premises, many must still confront a deeper challenge, says a new report: The need to establish an enterprise-wide view and understanding of their data.

Read the full report.

Having a comprehensive federal data strategy involves more than cataloging what data your agency has, which data is most valuable and where it resides. It also requires the ability to gather and analyze operational and security data in real time — and then to discern how various types of data are being put to work over their lifetimes.

“The world is moving to a place where there is too much data coming at us all day, every day,” says Juliana Vida, group vice president and chief strategy adviser at Splunk. A former deputy CIO at the Pentagon and retired U.S. Navy commander, Vida argues that despite the challenges of capitalizing on cloud services, agencies have “no other option” but to move to the cloud.

“There is no other way to manage the volume, velocity, variety, and pace without leveraging cloud technology. So the end state has to be figuring out how and when to leverage these mature data analytics capabilities that are optimized in the cloud,” she says.

But without a foundational data strategy upfront — and the tools to develop and foster that strategy — deciding what to move to the cloud becomes even harder than it already is, she says in the report, “Why data analytic platforms hold the key to smarter cloud investments.”

Vida and others in the report maintain that organizations need to move beyond rationalizing applications and data centers in the name of efficiency. Instead, they need to adopt a platform approach with the capability to gather, unify, analyze and act on data from all types of systems across an organization, including data operating in the cloud.

“The right platforms help you identify which data you’re actually using, which applications you actually need … and take the human effort out of it to figure out what’s important,” she explains.
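
The report doesn’t show what that looks like in practice, but the core of it is usage telemetry: counting who actually touches which datasets before deciding what migrates. The sketch below is a hypothetical illustration; the log format and file names are invented.

```python
# Hypothetical sketch: rank datasets by actual usage before a cloud
# migration, so rarely touched data isn't moved or paid for blindly.
# The log format and file names are invented for illustration.
import pandas as pd

access_log = pd.read_csv("data_access_log.csv",
                         parse_dates=["timestamp"])  # dataset, user, timestamp

cutoff = access_log["timestamp"].max() - pd.Timedelta(days=90)
recent = access_log[access_log["timestamp"] >= cutoff]

usage = recent.groupby("dataset").agg(
    reads=("timestamp", "size"),
    users=("user", "nunique"),
).sort_values("reads", ascending=False)

print("Most-used datasets (candidates to migrate first):")
print(usage.head(10))

idle = set(access_log["dataset"]) - set(recent["dataset"])
print(f"{len(idle)} datasets untouched in 90 days; review before migrating")
```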

Without a fully informed data strategy, organizations run the risk of transferring workloads to the cloud only to lose out on the potential insights and value the cloud can offer, says Geoff Woollacott, a senior strategy consultant and principal analyst at Technology Business Research.

“Cloud solutions alone will not deliver data clarity,” adds Dion Hinchcliffe, vice president and principal analyst at technology research and advisory firm Constellation Research. “In fact, they may create even less clarity because the data may be more dispersed.” That’s also in part because cloud providers only see a portion of a customer’s data.

The report cites findings from a recent Harvard Business Review Analytics Services study that found 66% of executives polled globally say that leveraging real-time data analytics is “very or extremely important to monitoring and gaining insights across cloud services, applications and infrastructure.”

The report, underwritten by Splunk, highlights how Splunk’s Data-to-Everything Platform was instrumental in helping the U.S. Census Bureau unify and analyze data from its 52 systems and 35 operating divisions on a single platform. That effort was part of a sweeping initiative to overhaul the bureau’s legacy systems and build a cloud-enabled IT environment in time for the 2020 census.

The report also highlights how Splunk’s Security Orchestration, Automation and Response (SOAR) system can support hundreds of tools and thousands of unique APIs, enabling IT teams to coordinate complex workflows.

“Platforms aren’t just a solution for putting your data into a cloud,” says Vida. “It’s being able to see across the entire lifecycle of the data and where it’s being used to help inform these decisions about migration and where to place investments — and where to pivot from what we used to do, to what we want to do. It offers end-to-end visibility of the data. And not all platforms do that.”

Vida also describes how establishing real-time observability puts federal agencies in a stronger position to achieve four longer-term benefits that add value to their investment strategies: greater efficiency, resiliency, security and innovation.

Download and read the full report.

 This article was produced by FedScoop and sponsored by Splunk.
