Innovation Edge Use Case Series Archives | FedScoop
https://fedscoop.com/tag/innovation-edge-use-case-series/

How the State Department used AI and machine learning to revolutionize records management
https://fedscoop.com/how-the-state-department-used-ai-and-machine-learning-to-revolutionize-records-management/ | May 16, 2024

A pilot approach helped the State Department streamline the document declassification process and improve the customer experience for FOIA requestors.

In the digital age, government agencies are grappling with unprecedented volumes of data, presenting challenges in effectively managing, accessing and declassifying information.

The State Department is no exception. According to Eric Stein, deputy assistant secretary for the Office of Global Information Services, the department’s eRecords archive system currently contains more than 4 billion artifacts, including emails and cable traffic. “The latter is how we communicate to and from our embassies overseas,” Stein said.

Over time, however, department officials need to determine what can be released to the public and what must stay classified — a time-consuming and labor-intensive process.

Eric Stein, deputy assistant secretary for the Office of Global Information Services, U.S. Department of State

The State Department has turned to cutting-edge technologies like artificial intelligence (AI) and machine learning (ML) to find a more efficient solution. Through three pilot projects, the department has successfully streamlined the document review process for declassification and improved the customer experience when it comes to FOIA (Freedom of Information Act) requests.

An ML-driven declassification effort

At the root of the challenge is Executive Order 13526, which requires that classified records of permanent historical value be automatically declassified after 25 years unless a review determines an exemption. For the State Department, cables are among the most historically significant records produced by the agency. However, current processes and resource levels will not work for reviewing electronic records, including classified emails, created in the early 2000s and beyond, jeopardizing declassification reviews starting in 2025.

Recognizing the need for a more efficient process, the department embarked on a declassification review pilot using ML in October 2022. Stein came up with the pilot idea after participating in an AI Federal Leadership Program supported by major cloud providers, including Microsoft.

For the pilot, the department used cables from 1997 and created a review model based on human decisions from 2020 and 2021 concerning cables marked as confidential and secret in 1995 and 1996. The model uses discriminative AI to score and sort cables into three categories: those it is confident should be declassified, those it is confident should remain classified, and those that need manual review.
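
Conceptually, that kind of confidence-based triage can be sketched in a few lines of code. The thresholds, field names and classifier interface below are illustrative assumptions, not the State Department's actual model or tooling.

```python
# Hypothetical sketch of confidence-based triage for declassification review.
# Thresholds, field names and the classifier itself are illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class Cable:
    cable_id: str
    text: str

def triage(cables: List[Cable], model, vectorizer,
           release_threshold: float = 0.95,
           withhold_threshold: float = 0.05):
    """Sort cables into auto-release, auto-withhold and manual-review buckets."""
    auto_release, auto_withhold, manual_review = [], [], []
    features = vectorizer.transform(c.text for c in cables)
    # Assume a binary classifier where column 1 is P(safe to declassify).
    scores = model.predict_proba(features)[:, 1]
    for cable, score in zip(cables, scores):
        if score >= release_threshold:
            auto_release.append(cable.cable_id)
        elif score <= withhold_threshold:
            auto_withhold.append(cable.cable_id)
        else:
            manual_review.append(cable.cable_id)  # humans decide the hard cases
    return auto_release, auto_withhold, manual_review
```

Only the middle bucket goes to human reviewers, which is where the reported reduction in staff hours comes from.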

According to Stein, for the 1997 pilot group of more than 78,000 cables, the model performed the same as human reviewers 97% to 99% of the time and reduced staff hours by at least 60%.

“We project [this technology] will lead to millions of dollars in cost avoidance over the next several years because instead of asking for more money for human resources or different tools to help with this, we can use this technology,” Stein explained. “And then we can focus our human resources on the higher-level and analytical thinking and some of the tougher decisions, as opposed to what was a very manual process.”

Turning attention to FOIA

Building on the success of the declassification initiative, the State Department embarked on two other pilots, run from June 2023 to February 2024, to enhance its FOIA processes.

Like the cable declassification effort, handling a FOIA request is a highly manual process. According to Stein, sometimes those requests are a single sentence; others run multiple pages. But no matter the length, a staff member must acknowledge the request, advise whether the department will proceed with it, and then manually search different databases for the terms in those requests to locate the relevant information.

Using the lessons learned from the declassification pilot, Stein said State Department staff realized there was an opportunity to streamline certain parts of the FOIA process by simultaneously searching what was already in the department’s public reading room and in the record holdings.

“If that information is already publicly available, we can let the requester know right away,” Stein said. “And if not, if there are similar searches and reviews that have already been conducted by the agency, we can leverage those existing searches, which would result in a significant savings of staff hours and response time.”
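
A toy version of that "check what's already public first" step might look like the sketch below. The index structure and the term-overlap rule are illustrative assumptions, not a description of the department's actual search tooling.

```python
# Hypothetical sketch: compare an incoming FOIA request against documents that
# have already been released before launching a new search of record holdings.
from typing import Dict, List

def find_existing_releases(request_text: str,
                           released_index: Dict[str, str],
                           min_overlap: int = 3) -> List[str]:
    """Return IDs of already-public documents that share enough key terms
    with the request to be worth pointing the requester to."""
    request_terms = set(request_text.lower().split())
    matches = []
    for doc_id, doc_text in released_index.items():
        overlap = request_terms & set(doc_text.lower().split())
        if len(overlap) >= min_overlap:
            matches.append(doc_id)
    return matches
```

If matches come back, the requester can be pointed to the public reading room right away; if not, the search of internal holdings proceeds as before.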

Beyond internal operations, the State Department also sought to improve the customer experience for FOIA requesters by modernizing its public-facing website and search functionalities. Using AI-driven search algorithms and automated request processing, the department aims to “find and direct a customer to existing released documents” and “automate customer engagement early in the request process.”

Lessons learned

Since launching the first pilot in 2022, team members have learned several things. The first is to start small and provide the space and time to become familiar with the technology. “There are always demands and more work to be done, but to have the time to focus and learn is important,” Stein said.

Another lesson is the importance of collaboration. “It’s been helpful to talk across different communities to not only understand how this technology is beneficial but also what concerns are popping up—and discussing those sooner than later,” he said. “The sooner that anyone can start spending some time thinking about AI and machine learning critically, the better.”

A third lesson is recognizing the need to “continuously train a model because you can’t just do this once and then let it go. You have to constantly be reviewing how we’re training the model (in light of) world events and different things,” he said.

These pilots have also shown how this technology will allow State Department staff to better respond to other needs, including FOIA requests. For example, someone may ask for something in a certain way that doesn’t match how it’s described internally.

“This technology allows us to say, ‘Well, they asked for this, but they may have also meant that,’” Stein said. “So, it allows us to make those connections, which may have been missing in the past.”

The State Department’s strategic adoption of AI and ML technologies in records management and transparency initiatives underscores the transformative potential of these tools. By starting small, fostering collaboration and prioritizing user-centric design, the department has paved the way for broader applications of AI and ML to support more efficient and transparent government operations.

The report was produced by Scoop News Group for FedScoop, as part of a series on innovation in government, underwritten by Microsoft Federal.  To learn more about AI for government from Microsoft, sign up here to receive news and updates on how advanced AI can empower your organization.

Streamlining aid delivery: Lessons from SBA’s digital modernization journey
https://fedscoop.com/streamlining-aid-delivery-lessons-from-sbas-digital-modernization-journey/ | April 29, 2024

How the Small Business Administration’s pivot to a cloud-based CRM platform helped it navigate through the pandemic and transform its approach to customer service.

America’s more than 32 million small businesses play an indispensable role in driving the U.S. economy. Small businesses account for 43.5% of gross domestic product, employ 61.7 million workers and generate payrolls topping $2.9 trillion, according to government data.

In March 2020, as COVID-19 emerged as a global threat, it became apparent that millions of small businesses were headed into economic peril. While White House officials and lawmakers moved with unusual speed to enact the Coronavirus Aid, Relief and Economic Security (CARES) Act, the task of administering financial aid to small businesses suddenly fell on the U.S. Small Business Administration (SBA).

Legacy challenge

As an independent cabinet agency with fewer than 3,000 employees, the SBA had, until then, managed small business loan and grant applications using an email-based processing and approval system involving shared mailboxes built on Microsoft Outlook. The agency’s outdated backend infrastructure had never been designed — and was ill-equipped — to handle the overwhelming volume of relief requests flooding in from all 50 states once the CARES Act was enacted. Inboxes and storage capacities hit their daily caps almost immediately. Customers started to receive “undeliverable” messages. And SBA employees were unable to keep up with the skyrocketing workloads.

Brian Quay, SBA Program Manager

SBA’s leadership quickly recognized what many other public and private sector organizations discovered at the onset of the pandemic: to remain effective amid rapidly escalating and fast-changing needs, the agency had to move quickly off its existing systems and adopt a more modern, scalable digital solution.

Transformative solution

SBA officials turned to a cloud-based customer relationship management (CRM) platform, Microsoft Dynamics 365. The platform not only offered the scalability and customization the SBA needed but also allowed the agency to implement a wide range of integrated features, including email automation, auto-routing, metrics recognition, storage optimization, spam prevention, app integration, and auditing capabilities.

More fundamentally, the shift to a modern CRM platform enabled the SBA to transition from a series of manual, labor-intensive processes to a more efficient, automated system that could quickly scale to the volume SBA needed.

Improved outcomes

Adopting a modern, cloud-based CRM platform not only helped SBA overcome a host of technology bottlenecks but also resulted in significant improvements in the SBA’s internal operations and customer service. The platform:

  • Centralized all customer interactions and attached documents into a single contact record, saving a significant amount of time previously spent verifying that all required documents had been received.
  • Categorized requests and automated routing, resulting in timelier responses and fewer requests left in limbo (a simplified routing sketch follows this list).
  • Reduced much of the manual work associated with evaluating requests and eliminated common processing errors, enhancing productivity.
  • Allowed SBA staff to more quickly triage cases and review work origins, notes, updates, and activities that had occurred across multiple teams for faster response.
  • Provided customers with an easier way to submit a standardized inquiry using a convenient web form on my.sba.gov, built on Dynamics 365 Portal, rather than typing out an email. Customers can also schedule appointments through a Microsoft Power Pages portal (appointment.sba.gov); appointments are assigned to SBA staff and fulfilled within the Dynamics 365 Customer Service platform.
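
As a rough illustration of the keyword-based auto-routing a CRM platform can automate, consider the sketch below. The queue names and rules are hypothetical, not SBA's actual configuration.

```python
# Hypothetical sketch of category-based auto-routing for incoming inquiries.
# Queue names and keyword rules are illustrative, not SBA's real configuration.
ROUTING_RULES = {
    "loan_servicing": ["payment", "deferment", "balance"],
    "disaster_assistance": ["disaster", "eidl", "declaration"],
}

def route_inquiry(subject: str, body: str) -> str:
    """Pick a destination queue from simple keyword rules."""
    text = f"{subject} {body}".lower()
    for queue, keywords in ROUTING_RULES.items():
        if any(keyword in text for keyword in keywords):
            return queue
    return "general_inquiry"  # fallback queue for everything else

print(route_inquiry("Disaster loan question",
                    "Is my county covered by the declaration?"))
# -> disaster_assistance
```

In a production CRM the same idea is typically expressed as configurable routing rules rather than code, but the effect is the same: requests land in the right queue without a person reading and forwarding each email.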

By making it easier to integrate apps and implement a knowledge base reference library, the modernization effort also allowed the SBA to consolidate information from various sources and streamline the decision-making process. That effort was further enhanced with the creation of a Tier 1 dashboard for individual users and a Tier 2 dashboard for team leads to track overall caseloads, giving SBA staff the ability to make data-driven decisions faster and adapt to changing circumstances.

Mission modernization

Moving to a scalable, cloud-based CRM platform helped the SBA rally quickly in response to the sudden flood of aid requests. It also dramatically advanced the SBA’s ability to meet its broader mission of serving and supporting small businesses.

In particular, the new platform made it possible for the SBA to manage activities more effectively with — and gain deeper insights about — more than 11 million individuals in its contact list.

“We can come to the campaign tabs [on the dashboard] and see a list of all of the different campaigns that the SBA has created inside of the platform,” explained SBA Program Manager Brian Quay. The software allows SBA staff to roll up all the cases associated with a contact record and even view image files to validate what information has been provided. It also allows SBA staff to see the status and performance of various marketing campaigns and activities.

“We can see…the number of members that were on [a particular] marketing list, how many messages were successfully sent to them, and failures. This is something that has been a huge productivity gain for SBA [staff], who were previously mainly sending those emails out through Outlook without an ability to track success,” Quay said. Altogether, the platform helped SBA create and send more than 50 million templated outreach emails from February to September 2023.

Another dimension of the SBA’s customer service modernization is the implementation of Power BI dashboards natively embedded into Dynamics 365. This allows executives who aren’t trained to use Dynamics to still access the metrics it provides by leveraging Power BI on the web or their mobile devices.  

Within two and a half years, the SBA expanded the platform from four mailboxes to over 200 individual inboxes, used by close to 80 teams with an unprecedented volume of activity. According to recent estimates, the platform now tracks over 20 million cases to date and has resulted in operational cost savings of over $25 million.

Lessons learned

The SBA’s transition from an email-based tracking system to a cloud-based CRM platform yielded several valuable lessons for federal executives considering a similar transformation:

Firstly, the importance of scalability cannot be overstated. In a crisis situation, the ability to quickly scale up operations is crucial, and a flexible digital platform can make all the difference.

Secondly, customization matters. Tailoring the system to the agency’s unique needs ensures maximum efficiency and usability.

Thirdly, integration capabilities are a game-changer. The ability to connect different tools and data sources creates a unified ecosystem, enabling faster decision-making.

Lastly, automation is a key enabler of efficiency. By automating routine tasks, agencies can focus their efforts on high-impact activities and respond swiftly to emerging challenges.

The Small Business Administration’s journey to digital modernization also demonstrates that in a rapidly evolving world, embracing innovative solutions is not just an option but a necessity, empowering organizations to thrive, grow, and support those they serve.

The report was produced by Scoop News Group for FedScoop, as part of a series on innovation in government, underwritten by Microsoft Federal.

How NIH’s National Library of Medicine is testing AI to match patients to clinical trials
https://fedscoop.com/how-nihs-national-library-of-medicine-is-testing-ai-to-match-patients-to-clinical-trials/ | April 15, 2024

A team at the National Institutes of Health’s National Library of Medicine is using large language models and AI to help researchers find candidates for clinical trials.

Few organizations in the world do more to turn biomedical and behavioral research into better health than the National Institutes of Health, with its 27 institutes and centers and more than 18,000 employees.

One of those institutes is the National Library of Medicine (NLM). Considered the NIH’s data hub, NLM’s 200-plus databases and systems serve billions of user sessions every day. From PubMed, the premier biomedical literature database, to resources like Genome and ClinicalTrials.gov, NLM supports a diverse range of users, including researchers, clinicians, information professionals and the general public.

Dianne Babski, Director, User Services and Collection Division, NLM

With so many users coming to its sites looking for a variety of information, NLM is always looking for new ways to enhance its products and services, according to Dianne Babski, Director of the User Services and Collection Division. NLM has been harnessing emerging technologies for many years but was quick to see how generative AI and large language models (LLMs) could potentially make its vast information resources more accessible to improve discovery.

Focus on innovation

“We’ve jumped into the GenAI arena,” Babski said. “Luckily, we work in a very innovative institute, so staff were eager to play with these tools when they became accessible.” Through the Science and Technology Research Infrastructure for Discovery, Experimentation, and Sustainability (STRIDES) initiative, NIH researchers have access to leading cloud services and environments.

For her part, Babski is leading a six-month pilot project across NLM focused on 10 generative AI use cases. The use cases are divided into five categories: product efficiency and usage, customer experience, data and code automation, workflow bias reduction, and research discovery.

Chart: National Library of Medicine GenAI initiatives (NLM)

The participating cloud service providers gave NIH access to a “firewalled, safe environment to play in, we’re not in an open web environment,” Babski explained. As part of this pilot program, NLM is also providing feedback on the user interface one of the providers has been creating for its government enterprise system.

Reducing recruitment challenges in clinical trials

One use case with potentially significant implications focuses on the work in ClinicalTrials.gov. Researchers, clinicians and patients use this NLM database to search for information about clinical research studies worldwide.

While clinical trials are pivotal for advancing medical knowledge and improving patient care, one of the most significant challenges in conducting them is patient recruitment. Identifying suitable candidates who meet specific study criteria is a time-consuming and resource-intensive process for researchers and clinicians, which can hamper the progress of medical research and delay the development of potentially lifesaving treatments.

Recognizing the need to streamline clinical trial matching, NLM created a prototype called TrialGPT. Using an innovative LLM framework, TrialGPT is designed to predict three elements of patient eligibility for clinical trials based on several criteria. It does so by processing information from patient notes to generate detailed explanations of eligibility, which are then aggregated to recommend appropriate clinical trials for patients.
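
In outline, that criterion-by-criterion approach can be sketched as below. The prompt wording, the 0-to-1 scoring convention and the ask_llm helper are assumptions made for illustration; they are not NLM's published implementation.

```python
# Hypothetical sketch in the spirit of TrialGPT: score each eligibility
# criterion against a patient note with an LLM, then aggregate the
# criterion-level judgments to rank trials. ask_llm() is a stand-in for
# whatever model endpoint is actually used.
from typing import Callable, Dict, List, Tuple

def rank_trials(patient_note: str,
                trials: Dict[str, List[str]],           # trial_id -> criteria
                ask_llm: Callable[[str], float]) -> List[Tuple[str, float]]:
    ranked = []
    for trial_id, criteria in trials.items():
        scores = []
        for criterion in criteria:
            prompt = (
                "Patient note:\n"
                f"{patient_note}\n\n"
                f"Criterion: {criterion}\n"
                "Reply with a relevance score between 0 and 1."
            )
            scores.append(ask_llm(prompt))              # per-criterion judgment
        # Aggregate criterion-level scores into one trial-level score.
        ranked.append((trial_id, sum(scores) / max(len(scores), 1)))
    return sorted(ranked, key=lambda item: item[1], reverse=True)
```

The top of the ranked list is what a clinician would review first; in the actual prototype, the model also produces the criterion-level explanations described above rather than bare scores.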

Early results have demonstrated TrialGPT can accurately explain patient-criterion relevance and effectively rank and exclude candidates from clinical trials. However, two challenges were also noted, according to an agency brief: the model’s lack of intrinsic medical knowledge and its limited capacity for medical reasoning.

To address these challenges, the NLM project team plans to augment LLMs with specialized medical knowledge bases and domain-specific tools.

Babski said implementing TrialGPT has the potential to deliver a more efficient and accurate method for matching patients to trials. “While currently only available as a research prototype, we see its potential as a great resource for clinicians to help find patient participants for these different types of trials,” she said.

Lessons learned

As NLM continues to pioneer and experiment with AI-driven use cases like TrialGPT, Babski said several vital recommendations and lessons have emerged. “One of the biggest things I’ve taken away from this is that it’s way more work and complicated than you think it’s going to be,” she said.

For instance, there is a steep learning curve for people to get comfortable with these new tools. But at the same time, that process also allows participants to develop new technical skills, such as running Python code and working in notebook environments.

Effective collaboration and interdisciplinary teamwork are also essential. According to Babski, the pilot program has been successful because NLM was able to not only assemble a “dream team” of domain experts, data scientists, and engineers but also establish a community across NIH—currently more than 500 people strong—that is energized and motivated to share their work and support one another. “Everyone has an interesting use case, and they are rolling up their sleeves and trying to figure out how to work with GenAI to solve real work problems,” she said.

Babski also follows a checklist of goals to be applied to any Generative AI pilot:

  • Experiment and develop best practices for LLMs in a safe (behind the firewall) “playground” environment.
  • Create a proof of concept that applies to the agency’s work.
  • Measure results to ensure utility and safety (e.g. NIST guidelines).
  • Develop workforce skills in generative AI.

For other agencies and organizations looking to explore the potential of AI technologies, Babski shared that it’s essential to embrace a culture of adaptability. “You have to be OK with pivoting halfway through,” she said. “We were trying to do data visualization work, and we just realized that this isn’t the right environment for what we were attempting, so we pivoted the use case.”

Ultimately, NLM’s use cases, including TrialGPT, highlight the transformative impact of GenAI and cloud-based platforms on healthcare innovation. By leveraging these technologies, NLM is likely to improve future healthcare delivery and patient outcomes globally.

Editor’s note: This piece was written by Scoop News Group’s content strategy division.

How cloud modernization transformed OPM cybersecurity operations
https://fedscoop.com/how-cloud-modernization-transformed-opm-cybersecurity-operations/ | February 27, 2024

By shifting to cloud-native solutions, the U.S. Office of Personnel Management has significantly enhanced its underlying security infrastructure to better protect the agency from evolving cyber threats.

Few organizations in the world provide human resource services at the scale of the U.S. Office of Personnel Management (OPM). OPM oversees personnel management services for 2.2 million federal workers — and the retirement benefits for another 2.7 million annuitants, survivors, and family members. Because the agency also manages recruiting, hiring, and benefits for the federal workforce, OPM handles vast amounts of sensitive data, making it a prime target for cyberattacks.

Following a massive data breach in 2015, OPM instituted a comprehensive overhaul of its IT and security practices. However, in the years since, it became increasingly clear that without modernizing its underlying IT infrastructure, many of the remedies OPM put in place were becoming outmoded in the face of ever more sophisticated cyberattacks.

That was especially apparent to Guy Cavallo, who arrived at OPM in the fall of 2020 as principal deputy CIO after leading sweeping IT modernization initiatives at the Small Business Administration (SBA) and before that at the Transportation Security Administration (TSA). He was named OPM’s CIO in July 2021.

Recognizing new cyber challenges

“We looked at the on-premises cyber tools that OPM was running since the breach and saw while they were effective, with today’s advancements in AI and cyber capabilities, they weren’t keeping up with the attack vectors we’re facing today,” said Cavallo in a recent interview. Threat actors had shifted to identity-based attacks using more sophisticated tactics, requiring advanced detection and response solutions.

Guy Cavallo, CIO, OPM

“We knew with AI coming and the Executive Order on Cybersecurity requiring logging to get visibility into your environment, investing in on-premises hardware would be a never-ending battle of running out of storage space,” he concluded.

The cloud was “the ideal elastic storage case for that,” he continued. But it also offered other critical solutions. The cloud was the ideal way to host applications to ensure “that we’re always up to date on patching and versions, leaving that to the cloud vendors to take care of — something that the federal government struggles with,” he said.

Checklist for a better solution

Cavallo wanted to avoid the mistake he had seen other organizations make, trying to weave all kinds of tools into an enterprise security blanket. “It’s incredibly difficult to integrate them and not have them attack each other — or also not have gaps between them,” he said. “I’m a believer that simpler is much better than tying together best-of-breed from multiple vendors.”

James Saunders, CISO, OPM

That drove Cavallo and OPM Chief Information Security Officer James Saunders to pursue a fundamental shift to a cloud-native cybersecurity platform, “making that the heart of our security apparatus,” said Saunders.

After reviewing the options, they elected to move to Microsoft’s Azure cloud-based cybersecurity stack “so that we can take advantage of the edge of cloud, and cloud in general, to collect data logs.” Additionally, it meant “we didn’t have to worry about software patching and ‘Do I have enough disk space?’ It also allows us to springboard into more advanced capabilities such as artificial intelligence,” Saunders said.

Because OPM exchanges data with many federal agencies that rely on different data systems, Cavallo and Saunders also implemented a cloud access security broker (CASB) — a security policy enforcement engine that monitors and manages security activity across multiple domains from a single location. It also “enables our security analysts to be more efficient and identify threats in a more holistic manner,” Saunders explained.

Added benefits

“There is a general misconception that you can only use cloud tools from the host vendor to monitor and protect that environment.  We found that leveraging cyber defenses that span multiple clouds is a better solution for us instead of having multiple different tools performing the same function,” Cavallo added.

Microsoft’s extensive threat intelligence ecosystem and the ability to reduce the number of contracts OPM has to maintain were also critical factors in their decision to move to Azure, Saunders added.

The pay-off

The migration from on-premises infrastructure to the cloud was a complex process involving the retirement of more than 50 servers and the decommissioning of multiple storage areas and SQL databases, according to Saunders. The most challenging aspect, though, was not the technology but managing the transition with the workforce. Extensive training and organizational change management were as critical as the technical migration to the success of the transition.

According to Saunders, the benefits didn’t take long to recognize:

  • Enhanced visibility: OPM now has a more comprehensive view of its security posture, thanks to the centralized platform and increased log collection.
  • Improved threat detection and response: AI-powered tools and Microsoft’s threat intelligence help OPM identify and respond to threats faster and more effectively.
  • Reduced costs and complexity: Cloud-native solutions eliminate the need for buying expensive on-premises hardware and software, while also simplifying management and maintenance.
  • Increased scalability and agility: The cloud platform allows OPM to easily scale its security infrastructure as needed to meet evolving threats and business requirements.

Collectively, those and related cloud benefits are also helping OPM make faster headway in meeting the administration’s zero-trust security goals.

Lessons learned

Perhaps one of the most important benefits is being able to demonstrate to the agency’s leadership the magnitude and nature of today’s threat landscape, and how much better prepared OPM now is to defend against it, according to Cavallo.

“When James and I showed them the visibility that we have from all those logs, it was a drop-the-mic moment for them. We can say we blocked 4,000 attacks in the last hour, but until you actually show them a world map and our adversaries trying to get into OPM, then be able to click and show the real details of it — those threats get lost in the noise,” he said.

“My recommendation at the CIO level is, this is a better mousetrap. But you can’t just expect people to flock to it. You have to go show them why it’s a better mousetrap.”

Among the other lessons Cavallo recommends to fellow IT leaders:

  • Focus on simplicity: Choose a single, integrated security platform to avoid the complexity of managing multiple tools.
  • Invest in training: Ensure your staff is trained and familiar with new cloud-native security tools and processes.
  • Start small and scale gradually: Begin with a pilot project and gradually migrate your security infrastructure to the cloud.
  • Communicate effectively: Clearly explain the benefits of cloud-native security to your stakeholders and address any concerns.

This report was produced by Scoop News Group for FedScoop as part of a series on technology innovation in government, underwritten by Microsoft Federal.

How cloud modernization helps FERC streamline its regulatory processes
https://fedscoop.com/how-cloud-modernization-helps-ferc-streamline-its-regulatory-processes/ | January 29, 2024

A novel tech-challenge approach helped IT leaders at the Federal Energy Regulatory Commission start the overhaul of its legacy applications and improve customer service.

Upgrading and expanding the nation’s electrical grid isn’t just about meeting America’s growing electrical demands; it’s about powering our economy, strengthening national security, and ensuring a cleaner future for generations to come. But because so much of that infrastructure lies in private hands, orchestrating that effort requires extraordinary attention.

That is one of the roles of the Federal Energy Regulatory Commission (FERC), an independent agency within the Department of Energy that regulates the interstate transmission of electricity, natural gas, and oil. FERC also reviews and licenses liquified natural gas terminals, hydropower projects, and pipelines.

Ensuring that the companies building and operating power plants, pipelines and transmission lines adhere to safety standards, comply with environmental laws, and abide by market-based pricing guidelines requires an extensive review and approval process. And because FERC relies on approximately 1,570 employees to perform that work, technology plays a critical role in keeping on top of all those entities’ requests.

The challenge: Legacy technology

Michelle Pfeifer, Director of Solutions Delivery and Engineering, FERC.

FERC’s technology systems, however, like those at many federal agencies, have been hard-pressed to keep up with ongoing and emerging demands. Most of the systems used to manage the agency’s core applications are more than ten years old and stove-piped, according to Michelle Pfeifer, director of solutions delivery and engineering.

Among other challenges, the workload management systems used to process and manage filings from regulated entities operate on outdated, customized platforms, leading to inefficiencies in tracking and managing the significant number of filings the agency must handle, said Pfeifer, who joined FERC four years ago. “We have done some updates, but there are a significant number of requests for refresh or modernization that have not been addressed. Additionally, data had to be entered into multiple systems, compounding workload challenges,” she said.

The search for a better solution

FERC’s IT team recognized the solution required more than a technology refresh. So they decided to launch an “application layer modernization program to address pent-up demand from our customers, address the stovepipe nature of multiple applications, and do it more quickly and flexibly through an agile delivery process. And we definitely wanted a cloud-based solution,” she said. “We also were looking at — instead of custom development, which is what we had — going to more of a low-code, no-code solution that gives us more pre-built capability.”

After evaluating a series of vendor demonstrations and completing the acquisition process, FERC’s IT team selected Microsoft’s Power Platform, a set of low-code tools that help create and automate solutions. An application rationalization review then defined a phased approach to modernizing the agency’s applications. The first phase, which is complete, developed a Virtual Agenda system that supports the Commission voting process on energy matters. FERC is now in the second phase, migrating its workload management and hydro project systems. All the modernized systems run in Microsoft Azure Government Community Cloud (GCC) environments, according to Pfeifer.

Wholesale improvements 

The first phase of modernization efforts, which went live in August, has already led to improvements for FERC employees, according to Pfeifer.

“The biggest improvement areas were greater integration of the workflows within the new system,” she said. Right away, there was less rekeying of data and fewer manual errors. Another significant improvement was “the automated generation of fields or documents that had previously been done manually,” she explained.

The new layer of automated workflow tracking provides more comprehensive visibility into the status of FERC dockets and reviews, which eventually flow up for final decisions by FERC’s five-member board of commissioners. The new system has replaced and consolidated a separate set of Microsoft SharePoint sites used by the chairman and the commissioners’ staff to track projects in circulation before coming up for Commission decisions.

Externally, as part of future phases, regulated entities will find it easier to submit filings and requests, said Pfeifer. She acknowledged there’s more work to be done to improve the overall user experience for FERC’s customers. However, the cloud-based applications are already improving the agency’s ability to maintain the applications and analyze data associated with Commission proceedings — and they put FERC in a stronger position to leverage AI, she said.

Lessons learned

One of the key lessons that helped accelerate FERC’s modernization efforts, according to Pfeifer, was using the acquisition process differently.

“We used some more advanced acquisition techniques — we requested a demo, for instance, as well as did a ‘Tech Challenge’ — which allowed us to see not just a paper document in response to a proposal, but a demo of a solution. That allowed us to work with (different vendors’ teams) to see how they would work together.” The tech challenge also included a tech talent component on top of the demo, “where vendors had to change something (so we could) see how they would go about doing that, what experience they had and what the team was capable of configuring and delivering,” she said.

Another lesson she stressed was the importance of business process mapping and reengineering “so that we could help our customers (define) what they want the processes to do. How do they want the processes to improve? We wanted to model that technically, not model the old processes that they weren’t happy with.”

That would also help the IT team implement the modernization efforts in phases, which was essential to ensuring the transition process went smoothly and minimized disruption to FERC’s mission.

Added benefits

While measuring the impact of modernizing and migrating to cloud services isn’t always straightforward, Pfeifer sees a number of operational benefits.

“Just to keep track of the status of things requires a lot of side spreadsheets and reports that aren’t part of the actual workflow (and will be incorporated into the FERC workload processing). Having a more streamlined workflow process also allows the user base to understand the due dates and ensure they’re meeting them, which once required substantial effort from the program offices to do that within the existing applications,” she explained. 

The other area where Pfeifer sees a lot of benefit is “consistency in how things are defined and managed, and handled across the different offices within FERC,” which, in turn, leads to greater accuracy for decision-making.

Finally, Pfeifer sees these back-end improvements laying the foundation for modernizing the agency’s front-end experience for the regulated entities that rely on FERC, in line with the administration’s executive order on transforming the federal customer experience and service delivery.

“Modernization is hard to achieve because you have to replicate the capabilities of the existing systems — and improve on those capabilities at the same time,” concluded Pfeifer. “That said, sometimes the technical solution is the easier part of the solution.”

This report was produced by Scoop News Group for FedScoop as part of a series on technology innovation in government, underwritten by Microsoft Federal.

How the U.S. Census Bureau leveraged cloud services to modernize security
https://fedscoop.com/how-the-us-census-bureau-leveraged-cloud-services-to-modernize-security/ | October 12, 2023

By transitioning to cloud-native software-as-a-service solutions, the U.S. Census Bureau redefined its approach to log management and laid new foundations for zero trust.

The U.S. Census Bureau is perhaps best known for conducting the nation’s decennial census. Its primary mission, though, is to serve the American people by collecting and analyzing vital statistical data about the population and the economy to guide decision-makers and policymakers at all levels of government, including 90,000 state and local governments and virtually every industry.

It’s a lot of data — and by law, all of it must be kept confidential and protected. That keeps Beau Houser, the bureau’s chief information security officer, and his team of roughly 100 security specialists and developers focused not only on daily security threats but also on many projects to modernize the security of the bureau’s complex IT infrastructure.

When Houser joined the Census Bureau in the fall of 2019, following security stints at the Department of Homeland Security, the Centers for Medicare & Medicaid Services, and the U.S. Small Business Administration, he recognized several challenges faced by many federal agencies that needed immediate attention.

Among other concerns, the bureau needed better visibility into its IT environment to strengthen its ability to detect and respond to cybersecurity threats. It also faced the burden of managing a large number of servers supporting enterprise log management, which required extensive maintenance and resources. Additionally, the bureau’s security practices were centered primarily around compliance, which had become increasingly ineffective at protecting against new and rapidly evolving cyber threats.

Focusing on the challenge

While the Census Bureau had been actively migrating many IT operations to the cloud, Houser determined that one critical area to address was the need to “implement a different approach to enterprise audit and log management.”

U.S. Census Bureau Chief Information Security Officer Beau Houser.

Part of that was driven by new agency mandates issued in an August 2021 White House memo (M-21-31) outlining steps to establish a more mature log management system to detect, investigate and remediate cyber threats on-premises and across increasingly distributed third-party services. Prompted partly by the SolarWinds malware incident, the memo also required agencies to prepare to share incident information with other federal agencies to help the government respond to incidents more quickly.

Another factor was what Houser described in a recent interview as “a big data problem” involving multiple terabytes of data per day. Storing and analyzing that data required maintaining and patching roughly 50 aging servers dedicated to the enterprise logging service. “You’ve got logs coming from tens of thousands of devices — simultaneously feeding logs into a centralized repository. And we saw how critical it is for us to get that right to quickly recognize and respond when something bad happens.”  
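
Stripped to its essentials, that centralization means many sources writing normalized events into a single queryable store. The sketch below is a toy illustration of the idea; the source names and fields are made up, and a real enterprise logging service adds scale, retention and search on top.

```python
# Toy illustration of centralized log aggregation: many sources feed one
# store that can be queried in a single place. Source names and event fields
# are illustrative only.
from collections import defaultdict
from datetime import datetime, timezone
from typing import Dict, List

central_store: List[Dict] = []

def ingest(source: str, event: Dict) -> None:
    """Normalize an event and append it to the central repository."""
    central_store.append({
        "source": source,
        "received_at": datetime.now(timezone.utc).isoformat(),
        **event,
    })

def count_by_source() -> Dict[str, int]:
    """One example of a cross-source query that is hard without centralization."""
    counts: Dict[str, int] = defaultdict(int)
    for event in central_store:
        counts[event["source"]] += 1
    return dict(counts)

ingest("firewall-01", {"action": "deny", "src_ip": "203.0.113.7"})
ingest("vpn-gateway", {"action": "login", "user": "jdoe"})
print(count_by_source())  # {'firewall-01': 1, 'vpn-gateway': 1}
```

At the bureau's volumes, the hard part is doing this for multiple terabytes a day without running and patching the roughly 50 servers the old on-premises service required, which is what pushed the team toward a managed, cloud-native service.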

Transformative solution

Houser knew the bureau needed a cloud-native enterprise logging solution aligned with its ongoing cloud migration strategy. Specifically, he sought a solution that met several critical criteria: It had to be flexible and scalable to manage and aggregate the massive amounts of log data generated by the Census Bureau’s operations during peak periods. It had to provide comprehensive visibility across the bureau’s entire IT environment. It needed to lower operating costs and complexity. Lastly, Houser wanted a software-as-a-service solution that reduced his team’s maintenance activities to allow more time to hunt potential threats proactively.

After a careful evaluation, the Census Bureau transitioned from an on-prem logging service to a cloud-native enterprise logging analytics solution, delivered and maintained as a service by one of the leading federal cloud and enterprise providers.

Improved outcomes

The transition, once complete, started paying dividends almost immediately, according to Houser, by providing:

  • Full integration – “From a log source standpoint, we’ve been able to aggregate all logs from the entire enterprise,” said Houser. That includes logs from on-prem devices, the bureau’s data center, and other cloud services. “So you’re talking about a cloud-to-cloud communication from that standpoint.”
  • Wider visibility – The transition provided a broader window on security data not just for security operations staff but also for operations and maintenance personnel who need this information for troubleshooting errors and communication bottlenecks. The security problems captured in the log files “are expansive,” he said, so it’s important that “there’s a lot of experts dealing with those problems and reviewing the logs to figure out exactly what’s going on. We’ve been able to improve our collaboration pretty significantly.”
  • Greater granularity – Adopting advanced cloud-native solutions increases zero-trust capabilities that “allow you to be very granular with [user] access. It’s helping tremendously,” said Houser.  “Before, if you could read something, you could copy it. Now what we’re seeing is broken down even further, where you can give someone read access and deny them access to copy it.”

Zero-trust implementation

That added granularity also helps the Census Bureau apply conditional or attribute-based access policies versus role-based ones. “More and more cloud service providers are beginning to build those capabilities into their cloud natively,” Houser explained.

“Once you’ve got your authentication and policy engine in the cloud, you can configure those policies to say, ‘You’ve got to have this login with multi-factor. You have to be on this specific device. And you have to be coming from this geographic location.’ So, you open up a whole new set of attributes that you can use for that login process. We’ve seen so many attacks where someone takes over an account, and then they run through a system. If you have the conditional access set up, the account alone won’t let you in.”
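
In miniature, that kind of attribute-based check looks like the sketch below. The attribute names and allowed values are hypothetical, not OPM's actual policy configuration; real conditional-access engines evaluate far richer signals.

```python
# Hypothetical sketch of conditional (attribute-based) access evaluation:
# the account alone is not enough; MFA, device and location all have to match.
from dataclasses import dataclass

@dataclass
class LoginAttempt:
    user: str
    mfa_passed: bool
    device_id: str
    country: str

ENROLLED_DEVICES = {"laptop-0042", "laptop-0108"}   # illustrative values
ALLOWED_COUNTRIES = {"US"}

def evaluate(attempt: LoginAttempt) -> bool:
    """Grant access only when every required attribute checks out."""
    return (attempt.mfa_passed
            and attempt.device_id in ENROLLED_DEVICES
            and attempt.country in ALLOWED_COUNTRIES)

# A stolen password alone fails here unless the attacker also controls an
# enrolled device and connects from an allowed location.
print(evaluate(LoginAttempt("analyst1", True, "laptop-0042", "US")))      # True
print(evaluate(LoginAttempt("analyst1", True, "unknown-device", "XX")))   # False
```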

Another advantage of a cloud-based software-as-a-service that Houser’s team is now working to capitalize on is the ability to configure endpoint products centrally. “So if malware hits a laptop, we can configure the automation to say, ‘Automatically download the forensics package, automatically quarantine the box, automatically do this step, and that step.’  So, you can create logic related to the workflow that the analyst would typically do.”
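
A bare-bones version of that playbook logic might look like the following. Each step is a print-statement stand-in for a call into whatever endpoint platform an agency actually runs, and the alert shape is assumed for illustration.

```python
# Hypothetical sketch of an automated response playbook triggered by a malware
# alert: collect forensics, quarantine the host, open a case. Each function is
# a placeholder for a real endpoint-platform API call.
def collect_forensics(host: str) -> None:
    print(f"[playbook] forensics package requested for {host}")

def quarantine(host: str) -> None:
    print(f"[playbook] {host} isolated from the network")

def open_case(host: str, alert_type: str) -> None:
    print(f"[playbook] case opened: {alert_type} on {host}")

def handle_alert(alert: dict) -> None:
    """Run the same steps an analyst would otherwise perform by hand."""
    if alert.get("type") == "malware":
        host = alert["host"]
        collect_forensics(host)
        quarantine(host)
        open_case(host, "malware")

handle_alert({"type": "malware", "host": "laptop-17"})  # illustrative host name
```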

Lessons learned

In addition to achieving greater security practices and lowering operating costs, Houser believes working with cloud-native solutions to support zero-trust will yield additional benefits.

“As we continue moving down this path, we’re going to be able to really improve the user experience,” on par with the experience consumers routinely encounter engaging with their bank, he said. “There’s a lot of flexibility with zero trust. It sounds rigid when you say zero trust, but it’s very flexible.”

Additionally, Houser sees a longer-term benefit in picking up the tempo of technology deployment.

“The vendors in this space are all very, very capable. But at the end of the day, our IT folks have to maintain whatever we set up.” The challenge organizations increasingly face is “there’s not enough IT expertise — and certainly not enough cyber expertise” to keep up with the pace of change.

Leveraging cloud-native software-as-a-service solutions helps address that and allows new capabilities to be implemented quickly. “We’re always seeing new functions and capabilities creep into the portals we use to access the data. Queries get more optimized, intelligence gets more streamlined and integrated, and you’re able to do more AI and machine learning type activities that allow your analysts to focus on higher-level analysis.”

This report was produced by Scoop News Group for FedScoop as part of a series on technology innovation in government, underwritten by Microsoft Federal.
