executive perspective Archives | FedScoop
https://fedscoop.com/tag/executive-perspective/
FedScoop delivers up-to-the-minute breaking government tech news and is the government IT community’s platform for education and collaboration through news, events, radio and TV. FedScoop engages top leaders from the White House, federal agencies, academia and the tech industry both online and in person to discuss ways technology can improve government, and to exchange best practices and identify how to achieve common goals.

Reframing data management for federal agencies
https://fedscoop.com/reframing-data-management-for-federal-agencies/
Norseman Defense Technologies CTO David Hoon explains why adopting an ‘event-driven’ data processing model offers a superior way to manage the growing volume of data at the edge of agency networks.

David Hoon is the Chief Technology Officer at Norseman Defense Technologies.

In the ever-expanding realm of data management, federal agencies face a pressing need to rethink their approach to effectively processing, analyzing and leveraging vast amounts of data.

As enterprises generate growing volumes of data at the edge of their networks, they face a widening disconnect: as much as 80% of the data lives at the edge of the enterprise, while as much as 80% of the computing takes place in the cloud.

That’s why chief information and data officials within federal agencies must recognize the necessity of adopting a different data processing model. One model gaining increasing attention involves moving more of an enterprise’s computing power to the edge of their network operations — and transitioning from a “transaction-driven” data processing model to an “event-driven” model.

Embrace an ‘Uber-like’ data processing model

Traditionally, data processing has been transaction-driven, where systems respond to individual requests or transactions. However, this model is proving increasingly inadequate in today’s fast-paced, data-intensive distributed environments.

In an event-driven architecture, applications respond to events or triggers, allowing for real-time processing and decision-making.

Uber provides a constructive example: Instead of requesting a car through a central dispatch office—triggering a transaction—a rider uses their Uber app to register a ride request. That request translates into an event notification. Uber’s application watches for and identifies such events continuously and notifies a multitude of available drivers simultaneously. The model results in connecting the most appropriate resource (the nearest available driver) to fulfill the request.

Similarly, an enterprise’s “event-driven” notification approach allows it to process more data events locally, more quickly and more cost-effectively.
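
To make the contrast concrete, here is a minimal sketch of the difference: in a transaction-driven flow the requester calls one system and waits for its answer, while in an event-driven flow the requester publishes an event and every interested consumer reacts. The event names and handlers below are hypothetical and used only for illustration.

```python
# Minimal sketch of an event-driven flow (hypothetical names, illustration only).
# Transaction-driven: the caller asks one system and waits for its answer.
# Event-driven: the caller publishes an event; every interested consumer reacts.
from collections import defaultdict

class EventBus:
    """A toy in-process event bus standing in for a streaming platform."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        # Every subscriber sees the event; the producer never picks the consumer.
        for handler in self._subscribers[event_type]:
            handler(payload)

bus = EventBus()

# Several consumers register interest in the same event type.
def notify_driver(name):
    return lambda event: print(f"{name} sees ride request at {event['location']}")

for driver in ("driver_a", "driver_b", "driver_c"):
    bus.subscribe("ride_requested", notify_driver(driver))

# The producer only records that something happened.
bus.publish("ride_requested", {"rider": "rider_42", "location": "14th & G St"})
```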

Leverage cloud-native data streaming platforms

One such solution making revolutionary headway in today’s data streaming era is Confluent’s cloud-native data streaming platform, built on Apache Kafka. The platform facilitates the seamless movement of data between edge devices and the cloud, enabling agencies to manage data in real time and act on timely insights as events evolve. It also allows IT teams to capitalize on data streaming from the growing array of IoT sensors, mobile devices and endpoint devices generating enterprise data.

Kafka’s capabilities extend beyond traditional transactional systems, allowing agencies to architect applications that are inherently event-driven. By adopting Kafka, agencies can also unlock new possibilities for data processing, analytics, and decision-making at scale.
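As an illustration of what an event stream looks like in practice, the snippet below sketches a producer and a consumer using the confluent-kafka Python client; the broker address, topic name and sensor payload are placeholders rather than details of any agency deployment.

```python
# Sketch only: a sensor event published to and consumed from a Kafka topic
# using the confluent-kafka client. Broker, topic and payload are placeholders.
import json
from confluent_kafka import Producer, Consumer

BROKER = "localhost:9092"          # placeholder broker address
TOPIC = "edge-sensor-events"       # placeholder topic name

# Producer side: an edge device records that an event happened.
producer = Producer({"bootstrap.servers": BROKER})
event = {"sensor_id": "env-17", "reading": 42.7, "status": "threshold_exceeded"}
producer.produce(TOPIC, key=event["sensor_id"], value=json.dumps(event))
producer.flush()  # block until the event is handed to the broker

# Consumer side: an analytics service reacts to events as they arrive.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "edge-analytics",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
try:
    msg = consumer.poll(timeout=10.0)   # wait for the next event
    if msg is not None and msg.error() is None:
        payload = json.loads(msg.value())
        print(f"Processing event from {payload['sensor_id']}: {payload['status']}")
finally:
    consumer.close()
```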

Partner for success

Adopting this more modern approach requires looking at data and analytic flows differently. So, it helps to work with experienced companies like Norseman Defense Technologies, which has played a pivotal role in helping defense and civilian agencies craft the most appropriate implementation strategies. Norseman offers expertise, tools, and platforms to support agencies in their journey toward edge computing and event-driven architecture.

Norseman’s capabilities span from building proofs of concept to deploying production-ready solutions tailored to the unique needs of federal agencies. In addition, thanks to partnerships with major providers like HP, Intel and Microsoft, Norseman is well-equipped to empower agencies with cutting-edge technologies and best practices. For instance, Norseman has two HP Z Workstations running Intel Xeon processors and Microsoft Windows 11 in our lab. These workstations are purpose-built to process large amounts of data for AI/ML/DL workloads.

Ultimately, by deploying more computing power at the edge of their networks and adopting an event-driven analytics architecture, agencies can make better decisions faster and unlock the full potential of their data assets, driving innovation, efficiency and mission success.

And by utilizing cloud-native data streaming platforms and the know-how of experienced industry partners, agencies can better position themselves to capitalize on modern data management practices as data and analytics operate increasingly at the edge. Now is the time for federal officials to embrace the future of data processing and lead with agility and foresight in an increasingly data-driven world.

Learn more about how Norseman Defense Technologies can help your agency embrace a more modern approach to data management at the edge with HP Z Workstations using Intel processors and Microsoft Windows 11. Z Workstations are purpose-built to process large amounts of data for AI and ML workloads. Visit Norseman.com or email info@norseman.com for our REAL INTELLIGENCE about ARTIFICIAL INTELLIGENCE.

Streamlining aid delivery: Lessons from SBA’s digital modernization journey
https://fedscoop.com/streamlining-aid-delivery-lessons-from-sbas-digital-modernization-journey/
How the Small Business Administration’s pivot to a cloud-based CRM platform helped it navigate through the pandemic and transform its approach to customer service.

America’s more than 32 million small businesses play an indispensable role in driving the U.S. economy. Small businesses account for 43.5% of gross domestic product, employ 61.7 million workers and generate payrolls topping $2.9 trillion, according to government data.

In March 2020, as COVID-19 emerged as a global threat, it became apparent that millions of small businesses were headed into economic peril. While White House officials and lawmakers moved with unusual speed to enact the Coronavirus Aid, Relief and Economic Security (CARES) Act, the task of administering financial aid to small businesses suddenly fell on the U.S. Small Business Administration (SBA).

Legacy challenge

As an independent cabinet agency with fewer than 3,000 employees, the SBA had, until then, managed small business loan and grant applications using an email-based processing and approval system involving shared mailboxes built on Microsoft Outlook. The agency’s outdated backend infrastructure had never been designed — and was ill-equipped — to handle the overwhelming volume of relief requests flooding in from all 50 states once the CARES Act was enacted. Inboxes and storage capacities hit their daily caps almost immediately. Customers started to receive “undeliverable” messages. And SBA employees were unable to keep up with the skyrocketing workloads.

Brian Quay, SBA Program Manager

SBA’s leadership quickly recognized what many other public and private sector organizations discovered at the onset of the pandemic: to remain effective in an environment of rapidly escalating and fast-changing needs, they had to move quickly off their existing operating systems and adopt a more modern, scalable digital solution.

Transformative solution

SBA officials turned to a cloud-based customer relationship management (CRM) platform, Microsoft Dynamics 365. The platform not only offered the scalability and customization the SBA needed but also allowed the SBA to implement a wide range of integrated features, including email automation, auto-routing, metrics recognition, storage optimization, spam prevention, app integration, and auditing capabilities.

More fundamentally, the shift to a modern CRM platform enabled the SBA to transition from a series of manual, labor-intensive processes to a more efficient, automated system that could quickly scale to the volume SBA needed.

Improved outcomes

Adopting a modern, cloud-based CRM platform not only helped SBA overcome a host of technology bottlenecks but also resulted in significant improvements in the SBA’s internal operations and customer service. The platform:

  • Centralized all customer interactions and attached documents into a single contact record, saving a significant amount of time previously spent verifying that all required documents had been received.
  • Categorized requests and automated routing, resulting in timelier responses and fewer requests left in limbo.
  • Reduced much of the manual work associated with evaluating requests and eliminated common processing errors, enhancing productivity.
  • Allowed SBA staff to more quickly triage cases and review work origins, notes, updates, and activities that had occurred across multiple teams for faster response.
  • Provided customers with an easier way to submit a standardized inquiry using a convenient web form on my.sba.gov, built on Dynamics 365 Portal, rather than typing out an email. Customers can also schedule appointments through a Microsoft Power Pages Portal (appointment.sba.gov); those appointments are assigned to SBA staff and fulfilled within the Dynamics 365 Customer Service platform.

By making it easier to integrate apps and implement a knowledge base reference library, the modernization effort also allowed the SBA to consolidate information from various sources and streamline the decision-making process. That effort was further enhanced with the creation of a Tier 1 dashboard for individual users and a Tier 2 dashboard for team leads to track overall caseloads, giving SBA staff the ability to make data-driven decisions faster and adapt to changing circumstances.

Mission modernization

Moving to a scalable, cloud-based CRM platform helped the SBA rally quickly in response to the sudden flood of aid requests. It also catapulted the SBA’s ability to meet its broader mission of serving and supporting small businesses.

In particular, the new platform made it possible for the SBA to manage activities more effectively with — and gain deeper insights about — more than 11 million individuals in its contact list.

“We can come to the campaign tabs [on the dashboard] and see a list of all of the different campaigns that the SBA has created inside of the platform,” explained SBA Program Manager Brian Quay. The software allows SBA staff to roll up all the cases associated with a contact record and even view image files to validate what information has been provided. It also allows SBA staff to see the status and performance of various marketing campaigns and activities.

“We can see…the number of members that were on [a particular] marketing list, how many messages were successfully sent to them, and failures. This is something that has been a huge productivity gain for SBA [staff], who were previously mainly sending those emails out through Outlook without an ability to track success,” the official said. Altogether, the platform helped SBA create and send more than 50 million templated outreach emails from February to September 2023.

Another dimension of the SBA’s customer service modernization is the implementation of Power BI dashboards natively embedded into Dynamics 365. This allows executives who aren’t trained to use Dynamics to still access the metrics it provides by leveraging Power BI on the web or their mobile devices.  

Within two and a half years, the SBA expanded the platform from four mailboxes to over 200 individual inboxes, used by close to 80 teams with an unprecedented volume of activity. According to recent estimates, the platform now tracks over 20 million cases to date and has resulted in operational cost savings of over $25 million.

Lessons learned

The SBA’s transition from an email-based tracking system to a cloud-based CRM platform yielded several valuable lessons for federal executives considering a similar transformation:

Firstly, the importance of scalability cannot be overstated. In a crisis situation, the ability to quickly scale up operations is crucial, and a flexible digital platform can make all the difference.

Secondly, customization matters. Tailoring the system to the agency’s unique needs ensures maximum efficiency and usability.

Thirdly, integration capabilities are a game-changer. The ability to connect different tools and data sources creates a unified ecosystem, enabling faster decision-making.

Lastly, automation is a key enabler of efficiency. By automating routine tasks, agencies can focus their efforts on high-impact activities and respond swiftly to emerging challenges.

The Small Business Administration’s journey to digital modernization also demonstrates that in a rapidly evolving world, embracing innovative solutions is not just an option; it’s a necessity to empower organizations to thrive, grow, and support those they serve.

The report was produced by Scoop News Group for FedScoop, as part of a series on innovation in government, underwritten by Microsoft Federal.

Keeping public sector data private and compliant with AI
https://fedscoop.com/keeping-public-sector-data-private-and-compliant-with-ai/
Leaders from the United Nations, Google and industry illuminate how Google Workspace and Gemini help ensure data privacy and uphold data security.

Public sector and commercial enterprises are ingesting ever-growing amounts of data into their enterprise operations. That’s placing greater demands on enterprise IT executives to ensure the requisite data privacy and security controls are in place and functioning effectively.

At the same time, executives are also being asked to integrate smarter tools into their operations to help their employees work more productively. 

At Google Cloud Next ’24, Google Cloud experts Ganesh Chilakapati, director of product management, and Luke Camery, group product manager, were joined by executives from the United Nations Population Fund (UNFPA), UK energy retailer OVO and Air Liquide, a global industrial gases supplier, to discuss how Google Cloud’s generative AI capabilities are helping to achieve those objectives.

How Gemini safeguards your data 

Chilakapati and Camery demonstrated some of Gemini’s and Google Workspace’s signature capabilities, emphasizing features such as client-side encryption and comprehensive security frameworks. They also explained what happens to data inside Gemini.

“What is Gemini doing with all this data? How is it providing these customized and targeted responses that are so helpful? Is it learning and training on all of my enterprise data? No, it’s not. All of the privacy commitments we’ve made over the many decades to Google Workspace customers remain true,” said Chilakapati.

“Your data is your data and strictly stays within the workspace data boundary. Your privacy is protected, your content is not used for any other customers, and all of your existing data protections are automatically applied,” he added.

Your data, your trust boundary, managed by you

“Everything happens within your Google Workspace trust boundary. That means you have the ability to control whether or not Gemini stores not only the user prompts but also the generated responses. It’s completely up to you,” added Camery.

“One of the things we’re most excited to announce is the general availability of AI classification for Google Drive. This is a privacy-preserving customer-specific model that you have the option to train on your own specific corpus using your unique data class taxonomy,” said Camery.  “Leveraging AI classification and the guarantees that we’ve built into Gemini itself, you can have a virtuous cycle where you are leveraging AI while protecting your organization from emerging threats.”

Unparalleled security: 5 key takeaways

Chilakapati and Camery stressed how the platform is designed to offer unparalleled security, built on the robust foundations of Google’s secure cloud infrastructure:

  • Enterprise terms of operation: Gemini operates strictly under enterprise (data processor) terms, not consumer (data controller) terms, even when fetching the latest information from the internet.
  • Client-side encryption extension: Enterprises that have traditionally leveraged client-side encryption capabilities, ensuring that sensitive data remains inaccessible, can extend that protection one step further to guard against access attempts by any unauthorized entity, including other generative AI models.
  • Foundation on secure cloud infrastructure: Gemini is constructed on Google’s secure cloud platform, providing a solid foundation that enhances the overall security posture.
  • Zero-trust architecture: Zero-trust protocols are built in, not bolted on, not just in Google Cloud’s foundation but all the way up the stack to Gemini itself.
  • Sovereign controls integration: Gemini is also seamlessly integrated into an enterprise’s sovereign controls for Google Workspace, ensuring the integrity of data’s digital sovereignty journey, regardless of where you are in the world.

How Gemini AI is boosting productivity for the global workforce

Those features are especially important to customers like Soren Thomassen, director of IT solutions at UNFPA, which operates in 150 countries. Thomassen initially started using Gemini in May of 2023 to make chat functionality available to the fund’s entire user base. He began piloting Gemini Workspace last November.

“As an agency, safety and privacy is paramount. That’s why we were quick at rolling out the Gemini Chatbot because it’s covered by the same rules and the same controls as with Workspace.”

Thomassen also pointed out how Gemini AI is helping UNFPA’s global workforce work more productively.

“Our users have been using it as a superpower writing assistant,” he said. Project managers spend a lot of time writing proposals.  “Instead of starting out with a blank screen…they can at least have a zero-draft that they can start working with. But the feedback that’s on my heart the most was when I hear those who have English as a second language say that Gemini helps them get their ideas across a little bit more clearly. Gemini (helps) everybody write English perfectly. And I think that’s important for a global organization.”

Jeremy Gibbons, Air Liquide’s digital and IT CTO, and Simon Goldsmith, OVO’s enterprise security and platforms lead, echoed Thomassen’s testament to Gemini’s utility. Each attested how the strategic deployment of Gemini within their organizations helped bolster productivity and ensure security. A recurrent theme throughout their conversation was the transformative potential of AI in reimagining work securely.

“I like to think of Workspace as kind of a walled garden of Eden,” said Goldsmith. “We want to give our people a really amazing experience in that garden… and allow them to experiment. But at the same time, within that safe environment, Workspace gives us the ability to, at an enterprise level, do the sensitive detective and corrective control work.”

Learn more about how Google Public Sector can help your organization “Kickstart your generative AI journey.”

This article was produced by Scoop News Group and sponsored by Google Public Sector. Google Public Sector is an underwriter of AI Week.

How DOD and Google Public Sector partnered using AI to fight cancer
https://fedscoop.com/how-dod-and-google-public-sector-partnered-using-ai-to-fight-cancer/
With a goal to help pathologists more accurately diagnose cancer, the Department of Defense and Google Public Sector came together to build an augmented reality microscope.

Approximately $1.7 billion of the Department of Defense’s annual budget is spent on cancer as part of a broader effort to improve military health care for more than 9 million eligible beneficiaries. As healthcare professionals and researchers continue to look for ways to better detect, diagnose and treat cancer, AI has emerged as a formidable ally.

One groundbreaking development in pathology and cancer detection is the augmented reality microscope (ARM). During a session at Google Cloud Next ’24, experts discussed how the ARM is poised to revolutionize cancer diagnosis. The initiative is a collaboration between the Departments of Defense and Veterans Affairs (VA), DOD’s Defense Innovation Unit, Google Public Sector and Jenoptik.

The AI-assisted microscope offers a view not only of how AI is increasing the diagnostic accuracy and efficiency of cancer detection, but also of how AI can operate on edge devices to support medical and other professionals. That allows those professionals to work locally, independent of internet or cloud connectivity, which is becoming increasingly critical as the number of experienced healthcare specialists qualified to perform diagnostic evaluations declines in the U.S.

ARM’s impact also extends beyond individual diagnoses. By digitizing tissue samples and harnessing the power of AI, the microscope eliminates geographical barriers, ensuring that patients everywhere have access to the expertise of top-tier pathologists.

A look at the development process

The genesis of the ARM lies in the recognition of a critical challenge faced by pathologists — the meticulous task of analyzing tissue slides, often numbering in the hundreds, to detect cancerous abnormalities. While traditional microscopes are indispensable, they present inherent limitations in terms of efficiency and accuracy, which are compounded by the sheer volume of data pathologists need to process.

The ARM integrates artificial intelligence (AI) into the diagnostic process. At its core, this device leverages AI algorithms deployed on the edge to analyze digitized tissue samples in real time. This transformative approach enables pathologists to identify potential abnormalities with unprecedented speed and precision, significantly enhancing diagnostic accuracy.

“The job of pathologists is to make sure that what we do is very accurate and that we can identify the disease. We don’t want to make a mistake,” said Dr. Nadeem Zafar, director of pathology and laboratory medicine service at Veterans Affairs Puget Sound. “This is where the technology comes in, and this is why we are so excited about it.”

The development process of the ARM also illustrates the power of collaboration. “Here at Google… we don’t just want to incrementally improve things like cancer diagnosis; we want to do it at scale,” said Scott Frohman, head of defense programs for Google Cloud. “And this project enabled us to think and connect and do something good for humanity.”

Current and future impacts

Central to the microscope’s functionality is its ability to highlight areas of interest detected by AI algorithms, providing pathologists with guidance during the diagnostic process. In addition, combining AI-driven insights with human expertise will empower healthcare professionals to make more informed decisions with greater confidence.

“Why I’m so excited about this technology is that it will bring so many experts to your desktop — while in the workflow, while in the flow of time,” Dr. Zafar said. “This is not something you have to learn. As long as you have the software… it will start giving you the heatmap and help detect cancer. So this is brilliant.”
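In rough outline, the heatmap idea can be sketched like this: an edge-deployed model scores tiles of a digitized slide, and the scores are assembled into a grid that points the pathologist to regions worth a closer look. The code below is an illustrative sketch only; the classifier is a stand-in, and none of the names reflect the ARM’s actual software.

```python
# Illustration only: how tile-level AI scores can become a slide heatmap.
# The "model" here is a stand-in; the ARM's actual models are not public APIs.
import numpy as np

def stub_tumor_probability(tile: np.ndarray) -> float:
    """Placeholder for an edge-deployed classifier scoring one image tile."""
    return float(tile.mean() / 255.0)   # pretend brighter tiles look more suspicious

def slide_heatmap(slide: np.ndarray, tile_size: int = 256) -> np.ndarray:
    """Score each non-overlapping tile and return a grid of probabilities."""
    rows = slide.shape[0] // tile_size
    cols = slide.shape[1] // tile_size
    heat = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            tile = slide[r*tile_size:(r+1)*tile_size, c*tile_size:(c+1)*tile_size]
            heat[r, c] = stub_tumor_probability(tile)
    return heat

# A synthetic grayscale "slide"; a real one would come from the microscope camera.
slide = np.random.randint(0, 256, size=(1024, 1024), dtype=np.uint8)
heat = slide_heatmap(slide)
print("Tiles flagged for review:", int((heat > 0.6).sum()))
```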

In addition, this endeavor’s success underscores the pivotal role of public-private partnerships in driving innovation and advancing healthcare. Through concerted efforts and a shared vision, stakeholders across government, industry, and academia have made the ARM a reality, with tangible benefits for patients and healthcare providers alike.

“We know that we can’t solve these kinds of problems alone. So the partnership that we have with the government has been fantastic for bringing the subject matter expertise, the data, and the commitment to solving this problem with us,” said Frohman. “And it helps us to do the mission that we have at Google — making information available and accessible during cancer and making the human condition better every day.”

Thanks to AI and edge computing, the ARM promises to redefine the standard of care in pathology, offering new hope in the relentless pursuit of a cancer-free future.

Learn more about how Google Public Sector can help your organization “Kickstart your generative AI journey.”

This article was produced by Scoop News Group and sponsored by Google Public Sector. Google Public Sector is an underwriter of AI Week.

Breaking silos worldwide: How Google Cloud is fueling public sector AI, collaboration and innovation
https://fedscoop.com/breaking-silos-worldwide-how-google-cloud-is-fueling-public-sector-ai-collaboration-and-innovation/
Leaders from the United Nations Population Fund, United Nations Office for Project Services and the World Bank share how they leverage Google Workspace and AI to empower collaboration, boost efficiency and drive transformative change.

In an era when technology rapidly reshapes landscapes, public sector agencies are increasingly turning to Google Workspace and AI to transform their operations. These tools enhance efficiency and change how governmental bodies collaborate, access information and serve the public.

During a session on public sector innovation at Google Cloud Next ’24, leaders from the World Bank, the United Nations Population Fund, and the United Nations Office for Project Services shared strategies for boosting productivity by fostering operational consistency and shared understanding. They also stressed the importance of transparency and continuous feedback and focused on the tangible benefits of eased workloads and enhanced efficiencies.

Enhancing collaboration

Public sector agencies operating worldwide are adopting cloud-based collaboration tools to create a more integrated work environment where documents and projects are easily accessible anytime and on any device. This shift not only boosts productivity but also enhances the flexibility of working environments by giving employees more universal access to agency resources and reports from offices around the world. It also helps overcome language barriers by providing translation services. And it helps ensure communications with international partners meet agency standards across different time zones and geographies. 

Justin Waugh, head of platforms, ITG enterprise platforms team at UNOPS, highlighted the transformative impact of Google Workspace in managing extensive infrastructure projects involving frequent account and project turnover. By leveraging Workspace tools like Google Docs and Sheets, UNOPS has streamlined project management and data handling and significantly reduced operational friction while enhancing user experience.

“The key thing to remember is to reduce friction for people using the systems that we’ve got, and we’ve been heavily into the book to do that,” said Waugh.

Waugh’s comments underscored the importance of integrating various Google products within organizational applications. Doing so facilitates more seamless project communication, budgeting and reporting. This strategic integration has allowed UNOPS to maintain standardized procedures across projects, fostering consistency and understanding throughout the organization.

Leveraging AI enterprise search solutions for efficient and confident information access

One of the standout applications of AI within the public sector is improving information access through enterprise search solutions. AI-powered search tools within Google Workspace can easily access vast amounts of data to find relevant documents, emails, and files. This capability is particularly transformative for government agencies, where quickly retrieving and correlating information can influence policy-making and public service delivery.

Raman Pugalumperumal, senior IT officer and lead for AI and ML platforms at the World Bank, discussed how Vertex AI and Google Cloud Search have revolutionized their data management practices. The World Bank, which manages extensive financial and economic analysis datasets, has benefited from the enhanced speed and accuracy these tools provide.

“We can measure things with quantitative information… we’re able to do [certain things] faster, or maybe things which we weren’t able to do — they’re able to do it because of the volume process,” said Pugalumperumal.

Pugalumperumal explained how AI is being used to quicken information retrieval, creating a more responsive and productive environment. This shift towards leveraging AI in its operations has unlocked new avenues for global access and sharing the World Bank’s wealth of knowledge, positioning AI as a pivotal asset in its mission to distribute developmental knowledge.

At UNFPA, IT Director Nayanesh Bhandutia said they’re working on developing an AI-powered search experience product. “We aim to break the data silos. We don’t want our users to worry about the data source when they’re looking for something,” said Bhandutia.

“This will be very time-saving because now the global population is not going through the pain of finding information.”

Maintaining the flow of multilingual work with AI-assisted translation 

Another significant advantage of integrating AI with Google Workspace in the public sector is overcoming language barriers. AI-driven language translation tools embedded within Google Workspace allow government employees from different linguistic backgrounds to collaborate effectively.

Bhandutia also highlighted the transformative role of the Gemini AI interface within Google Workspace. Introduced initially to simplify operations, Gemini has evolved to solve more complex challenges, particularly in multilingual settings. The AI-driven tool has been instrumental in helping staff draft clear and concise communications in English, Spanish, French, and Arabic.

“The introduction of Gemini has solved the [fluency] problem. Our users are getting more confident, and they’re spending less time making revisions, but we want to take it to the next level. We noticed many potentials,” said Bhandutia.

The potential for AI to extend beyond basic translations to fully integrated document management systems is vast. Bhandutia shared ambitious plans to leverage Gemini AI to automate the generation of critical documents, such as requests for proposals and job descriptions, which would reduce administrative overhead and enhance responsiveness.

For example, teams can use AI to translate documents and emails directly within the Google Workspace environment when collaborating on international aid programs or global policy initiatives. This seamless integration of translation services helps maintain the flow of work without the interruptions typically caused by language differences, fostering stronger connections and more cohesive teamwork.

“It is a fantastic stepping stone in the technology sector — [the capability] to deliver what people need…this is an excellent step towards accessibility,” said Waugh.

The future of AI and public sector innovation

The ongoing advancements in AI are expected to introduce more sophisticated tools for predictive analytics, supporting complex decision-making and personalized public services. These developments will not only drive greater efficiency within agencies but also enhance the quality of services provided to the public.

By leveraging these tools, government agencies are enhancing their operational capabilities and setting new standards for accessibility, efficiency, and collaboration in public service.

Reimagining search: How AI and Google Search turbocharges patent examinations at USPTO
https://fedscoop.com/reimagining-search-how-ai-and-google-search-turbocharges-patent-examinations-at-uspto/
U.S. Patent and Trademark Office examiners needed a new approach to sifting through mountains of supporting evidence. Leaders from USPTO, Google and Accenture Federal Services discuss how AI and Google Search are solving the challenge.

One of the many challenges government agencies and their employees face is finding the information they need when they need it, and having confidence that the information is correct and up to date and that they haven’t missed essential data.

While advances in search technology have provided government employees with more powerful search tools, the dramatic growth of multi-modal data in all its forms has made search, and the ability to find the right information in petabytes of datasets, more challenging than ever.

That was the challenge the United States Patent and Trademark Office and its patent examiners were facing, setting the stage for taking a new approach to search, enabled by advanced AI technologies. 

In a strategic partnership with Accenture Federal Services and Google Cloud, USPTO has developed and implemented a comprehensive system to refine its search mechanisms. This initiative has been about upgrading the traditional examination protocols, providing examiners with new, swift and precise search capabilities that respond to the complexity and scale of modern innovations.

At Google Cloud Next ’24, Jonathan Horner, supervisory patent IT specialist at the U.S. Patent and Trademark Office, and Ian Wetherbee, software engineer at Google, joined Anna Hoffman, USPTO lead at Accenture Federal Services, on stage to discuss the agency’s ambitious efforts to leverage AI.

The USPTO’s initiative highlights a broader challenge public agencies face in reviewing mountains of documents, artifacts and existing application decisions—and identifying where else in the world similar work may be underway and what’s verifiable. Until recently, Generative AI platforms had limited ability to provide grounded or verifiably sourced content in real time from the Internet.

“One of the things we recently announced is Grounding with Google Search,” said Katharyn White, head of public sector marketing for Google Cloud in a podcast from Google Cloud Next.

“Grounded means we know the source that the AI is using to come up with the answer. It’s grounded in a data source that you can check and ensure that is giving you the results that you want. And we’re making that easier.” 

For the USPTO, the need for advanced search capabilities meant first tackling its internal data retrieval and analytics capabilities.

Horner detailed the constitutional roots of patent law and the monumental task of examining each application against all human knowledge. “That’s a lot of information to go through… You’re looking for a needle in a stack of other needles,” Horner said, explaining the enormity of their challenge.

Traditionally, patent examiners relied on Boolean search techniques. However, with the exponential increase in information, these tools became increasingly inadequate for maintaining the high standards required for U.S. patents, said Horner. To address this, the USPTO has turned to AI, deploying tools in production that are not only efficient but also explainable, respecting the office’s duty to the public and applicants.

Hoffman discussed the journey starting in 2019 with a small prototype aimed at demonstrating that AI could meet these challenges. She mentioned conducting dozens of interviews and workshops, deploying a modern Google infrastructure and launching a prototype within three months—a pace unheard of in federal government operations. The prototype focused on finding prior art — evidence that an invention might already exist — that examiners might otherwise have missed. The pilot paved the way for production features like showing “more like this” documents, enabling examiners to find similar documents more effectively.

“This feature became used by examiners immediately, which allowed us to run with a much bigger and more robust AI user interface directly similar to the examiner search system called Similarity Search,” Hoffman added. 

Google’s Wetherbee emphasized the necessity of “supporting the full Boolean search syntax as the model’s input.” A robust data collection process involved over a million data points from human raters and a patent corpus of over 170 million documents.

“There are hundreds of millions of citations inside patents. It’s a huge corpus of over two terabytes of text content…We were able to process all of this human-rated data and the patent data using Google infrastructure and turn that into training data to train our models,” said Wetherbee.
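
Conceptually, the retrieval pattern resembles the sketch below: narrow the corpus with a Boolean query, then rank what remains by similarity to a seed document. The documents, the bag-of-words “embedding” and the ranking here are toy stand-ins for illustration, not USPTO’s examiner search system or Google’s models.

```python
# Sketch of "more like this" retrieval: Boolean pre-filter, then similarity ranking.
# Documents and the embedding are toy stand-ins, not USPTO's examiner search system.
import math
from collections import Counter

docs = {
    "D1": "augmented reality display with head tracking sensor",
    "D2": "microscope objective lens with augmented reality overlay",
    "D3": "battery charging circuit for electric vehicles",
}

def boolean_filter(query_terms, corpus):
    """Keep documents containing every required term (an AND query)."""
    return {k: v for k, v in corpus.items() if all(t in v for t in query_terms)}

def embed(text):
    """Toy bag-of-words 'embedding'; production systems use learned vectors."""
    return Counter(text.split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

seed = "augmented reality microscope for pathology"
candidates = boolean_filter(["augmented", "reality"], docs)
ranked = sorted(candidates, key=lambda k: cosine(embed(seed), embed(candidates[k])), reverse=True)
print("Most similar candidates:", ranked)
```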

Horner reiterated that despite technological advancements, the examiner is “still in the driver’s seat. All of these tools are based on an examiner’s ability to guide the AI towards what it is looking for, and that’s very important to us.” It’s a symbiotic relationship where AI extends the reach of human capability rather than replacing it.

Adopting these AI tools signifies a broader shift within the federal landscape—embracing cutting-edge technology to ensure accuracy and efficiency in governmental functions. It also sets an example for other federal agencies considering a similar path toward digital transformation.

Learn more about how Google Public Sector can help your organization “Kickstart your generative AI journey.”

This article was produced by Scoop News Group and sponsored by Google Public Sector. Google Public Sector is an underwriter of AI Week.

How Google Cloud AI and Assured Workloads can enhance public sector security, compliance and service delivery at scale
https://fedscoop.com/how-google-cloud-ai-and-assured-workloads-can-enhance-public-sector-security-compliance-and-service-delivery-at-scale/
Google Cloud’s expanding AI capabilities empower government agencies to better manage complex security, regulatory and data privacy challenges.

The public sector’s IT modernization journey into the cloud is taking a new and revolutionary turn as agency leaders grapple with how to harness AI’s power to help them securely manage the volume and velocity of their workloads.

One challenge that remains at the forefront of those efforts is ensuring that today’s increasingly dynamic and distributed IT environments continue to meet the government’s complex security, regulatory and data privacy compliance rules — while learning how best to capitalize on AI’s potential to serve the public.

Google Cloud’s understanding and recognition of those challenges was widely reflected in a series of sweeping announcements at last week’s Google Cloud Next ’24 that promise new levels of security, flexibility and AI-assisted capabilities for Google Cloud’s public sector customers.

Building AI capabilities within protected workspaces

When it comes to securely managing public sector data, agencies using Google Cloud gain immediate benefits by building on top of its foundational architecture. Because the architecture was built for the cloud and also incorporates a substantial portion of federal security controls, it’s possible to demonstrate security compliance and obtain operating authority in weeks instead of months when folding in applications like Workspace or AI models like Gemini.

Another way agencies can enhance the security of their workloads is by using Google Cloud Assured Workloads, which also has foundational government security compliance assurances built in, according to a panel of technology experts speaking at Google Cloud Next ’24.

The panelists, representing NASA, Palo Alto Networks, SAP and Google Cloud, argued that using zero-trust and compliance-as-code technologies has become essential to creating and maintaining easily reproducible, compliant workload environments. That’s in part because of the diversity of government agency compliance requirements, from FedRAMP to the Department of Defense Impact Level 2, 4 and 5 security controls.

By deploying workloads in pre-certified, software-defined environments set up to limit activity to compliant products and restrict where data can flow and who can access it, agencies can better ensure their workloads meet government requirements.
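
To make the compliance-as-code idea concrete, here is a minimal, hypothetical sketch of a pre-deployment policy check; the policy fields and allowed lists are invented for illustration and are not the Assured Workloads API, where enforcement happens inside the cloud platform itself.

```python
# Illustration of the compliance-as-code idea with invented policy fields.
# Real Assured Workloads enforcement happens inside Google Cloud, not in scripts like this.
ALLOWED_REGIONS = {"us-east4", "us-central1"}          # e.g., a US-only data residency boundary
ALLOWED_SERVICES = {"compute", "storage", "bigquery"}  # a pre-certified product list

def check_workload(workload: dict) -> list:
    """Return a list of policy violations for a proposed workload definition."""
    violations = []
    if workload["region"] not in ALLOWED_REGIONS:
        violations.append(f"region {workload['region']} is outside the approved boundary")
    for svc in workload["services"]:
        if svc not in ALLOWED_SERVICES:
            violations.append(f"service {svc} is not on the compliant-products list")
    if not workload.get("cmek_enabled", False):
        violations.append("customer-managed encryption keys are required")
    return violations

proposed = {"region": "europe-west1", "services": ["compute", "pubsub"], "cmek_enabled": True}
for issue in check_workload(proposed):
    print("BLOCKED:", issue)
```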

“Moving to Assured GCP is not just an upgrade; it’s a transformational leap forward,” said Collin Estes, the CIO of MRI Technologies working at NASA.

He pointed to two benefits: The “ability to generate compliant documentation as both a product of these large language models as well as helping us produce very well-structured definitions of what we’re doing, based on your actual implementations within Google Cloud. It is not a human saying, here’s what we do. It is us generating what we do from our environment. I think that’s going to really change the game in terms of how federal agencies manage risk across these portfolios.”

Among other benefits, the panelists pointed to:

Streamlining software development – Transitioning to Assured GCP allows government bodies to leverage and deploy cutting-edge technologies and methodologies, such as containerization and microservices, with unprecedented ease.

Focusing on the mission – By moving to Assured GCP, organizations can shift their focus from the backend to what truly matters—their mission. This shift represents not just an operational change but a philosophical one, where technology becomes an enabler rather than a hurdle in support of agency missions.

According to Palo Alto Networks Senior Manager Michael Clark, another reason for adopting Assured Workloads is the volume of data and the compute intensity with all this data. “We’re at that critical pivot point. We’ve been using this data to learn new threats and find zero-day threats so that we can enforce zero trust, improve security protection mechanisms, and map into new areas of innovation for threat detection and automated remediation.”

When building a compliant environment, Hunter Downey, SAP’s NVP of Architecture and Product Launch, urged session attendees “to build it within a framework that I can ensure controls are in place, so I can rinse and repeat across 20 to 100 different teams, potentially touching 1,000 or 5,000 developers. If you start with the lowest common denominator, you’re going to fail. The reason why we partnered with GCP Assured Workloads is because you’re able to control the flow of information and messages. The minute the data goes global, it’s a different jurisdiction.”

Among other AI-related developments announced at Google Cloud Next ‘24:

  • Gemini for Google Cloud is a new generation of AI assistants for developers, Google Cloud services and applications that help users work and navigate security challenges more effectively.
  • See more announcements here. 

Learn more about how Google Public Sector can help your organization “Kickstart your AI and security journey.”

This article was produced by Scoop News Group and sponsored by Google Public Sector. Google Public Sector is an underwriter of AI Week.

How NIH’s National Library of Medicine is testing AI to match patients to clinical trials
https://fedscoop.com/how-nihs-national-library-of-medicine-is-testing-ai-to-match-patients-to-clinical-trials/
A team at the National Institutes of Health’s National Library of Medicine is using large language models and AI to help researchers find candidates for clinical trials.

Few organizations in the world do more to turn biomedical and behavioral research into better health than the National Institutes of Health, its 27 institutes and centers and more than 18,000 employees.

One of those institutes is the National Library of Medicine (NLM). Considered the NIH’s data hub, NLM’s 200-plus databases and systems serve billions of user sessions every day. From PubMed, the premier biomedical literature database, to resources like Genome  and ClinicalTrials.gov, NLM supports a diverse range of users, including researchers, clinicians, information professionals and the general public.

Dianne Babski, Director, User Services and Collection Division, NLM

With so many users coming to its sites looking for a variety of information, NLM is always looking for new ways to enhance its products and services, according to Dianne Babski, Director of the User Services and Collection Division. NLM has been harnessing emerging technologies for many years but was quick to see how generative AI and large language models (LLMs) could potentially make its vast information resources more accessible to improve discovery.

Focus on innovation

“We’ve jumped into the GenAI arena,” Babski said. “Luckily, we work in a very innovative institute, so staff were eager to play with these tools when they became accessible.” Through the Science and Technology Research Infrastructure for Discovery, Experimentation, and Sustainability (STRIDES) initiative, NIH researchers have access to leading cloud services and environments.

For her part, Babski is leading a six-month pilot project across NLM focused on 10 GenAI use cases. The use cases are divided into five categories: product efficiency and usage, customer experience, data and code automation, workflow bias reduction, and research discovery.

National Library of Medicine GenAI Initiatives (NLM chart of 10 GenAI use cases)

The participating cloud service providers gave NIH access to a “firewalled, safe environment to play in, we’re not in an open web environment,” Babski explained. As part of this pilot program, NLM is also providing feedback on the user interface it has been creating for one of the providers’ government enterprise systems.

Reducing recruitment challenges in clinical trials

One use case with potentially significant implications focuses on the work in ClinicalTrials.gov. Researchers, clinicians and patients use this NLM database to search for information about clinical research studies worldwide.

While clinical trials are pivotal for advancing medical knowledge and improving patient care, one of the most significant challenges in conducting them is patient recruitment. Identifying suitable candidates who meet specific study criteria is a time-consuming and resource-intensive process for researchers and clinicians, which can hamper the progress of medical research and delay the development of potentially lifesaving treatments.

Recognizing the need to streamline clinical trial matching, NLM created a prototype called TrialGPT. Using an innovative LLM framework, TrialGPT is designed to predict three elements of patient eligibility for clinical trials based on several criteria. It does so by processing information from patient notes to generate detailed explanations of eligibility, which are then aggregated to recommend appropriate clinical trials for patients.
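
Conceptually, the framework can be sketched as follows: a language model judges each eligibility criterion against the patient note, and the per-criterion judgments are aggregated into a trial-level score. The placeholder ask_llm function, the sample note and the weighting scheme below are illustrative assumptions, not NLM’s published implementation.

```python
# Outline sketch of criterion-level matching; `ask_llm` is a placeholder, not a real API.
def ask_llm(prompt: str) -> str:
    """Stand-in for a call to a large language model returning a short label."""
    return "eligible"  # a real model would return eligible / ineligible / unclear plus a rationale

def score_trial(patient_note: str, criteria: list) -> float:
    """Judge each criterion separately, then aggregate into one trial-level score."""
    labels = []
    for criterion in criteria:
        prompt = (
            "Patient note:\n" + patient_note +
            "\nCriterion: " + criterion +
            "\nIs the patient eligible, ineligible, or unclear? Explain briefly."
        )
        labels.append(ask_llm(prompt))
    weights = {"eligible": 1.0, "unclear": 0.5, "ineligible": 0.0}
    return sum(weights.get(label, 0.5) for label in labels) / len(labels)

note = "62-year-old with stage II colon cancer, no prior chemotherapy, eGFR 75."
trial_criteria = ["Age 18 or older", "Histologically confirmed colon cancer", "No prior systemic therapy"]
print(f"Aggregate eligibility score: {score_trial(note, trial_criteria):.2f}")
```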

Early results have demonstrated TrialGPT can accurately explain patient-criterion relevance and effectively rank and exclude candidates from clinical trials. However, two challenges were also noted, according to an agency brief: the model’s lack of intrinsic medical knowledge and its limited capacity for medical reasoning.

To address these challenges, the NLM project team plans to augment LLMs with specialized medical knowledge bases and domain-specific tools.

Babski said implementing TrialGPT has the potential to deliver a more efficient and accurate method for matching patients to trials. “While currently only available as a research prototype, we see its potential as a great resource for clinicians to help find patient participants for these different types of trials,” she said.

Lessons learned

As NLM continues to pioneer and experiment with AI-driven use cases like TrialGPT, Babski said several vital recommendations and lessons have emerged. “One of the biggest things I’ve taken away from this is that it’s way more work and complicated than you think it’s going to be,” she said.

For instance, there is a steep learning curve for people to get comfortable with these new tools. But at the same time, that process also allows participants to develop new technical skills, such as running Python code and working in notebook environments.

Effective collaboration and interdisciplinary teamwork are also essential. According to Babski, the pilot program has been successful because NLM was able to not only assemble a “dream team” of domain experts, data scientists, and engineers but also establish a community across NIH—currently more than 500 people strong—that is energized and motivated to share their work and support one another. “Everyone has an interesting use case, and they are rolling up their sleeves and trying to figure out how to work with GenAI to solve real work problems,” she said.

Babski also follows a checklist of goals to be applied to any Generative AI pilot:

  • Experiment and develop best practices for LLMs in a safe (behind the firewall) “playground” environment.
  • Create a proof of concept that applies to the agency’s work.
  • Measure results to ensure utility and safety (e.g. NIST guidelines).
  • Develop workforce skills in generative AI.

For other agencies and organizations looking to explore the potential of AI technologies, Babski shared that it’s essential to embrace a culture of adaptability. “You have to be OK with pivoting halfway through,” she said. “We were trying to do data visualization work, and we just realized that this isn’t the right environment for what we were attempting, so we pivoted the use case.”

Ultimately, NLM’s use cases, including TrialGPT, highlight the transformative impact of GenAI and cloud-based platforms on healthcare innovation. By leveraging these technologies, NLM is likely to improve future healthcare delivery and patient outcomes globally.

Editor’s note: This piece was written by Scoop News Group’s content strategy division.

How cloud modernization transformed OPM cybersecurity operations
https://fedscoop.com/how-cloud-modernization-transformed-opm-cybersecurity-operations/
By shifting to cloud-native solutions, the U.S. Office of Personnel Management has significantly enhanced its underlying security infrastructure to better protect the agency from evolving cyber threats.

Few organizations in the world provide human resource services at the scale of the U.S. Office of Personnel Management (OPM). OPM oversees personnel management services for 2.2 million federal workers — and the retirement benefits for another 2.7 million annuitants, survivors, and family members. Because the agency also manages the federal workforce’s recruiting, hiring, and benefits management, OPM is responsible for handling vast amounts of sensitive data, making it a prime target for cyberattacks. 

Following a massive data breach in 2015, OPM instituted a comprehensive overhaul of its IT and security practices. However, in the years since, it became increasingly clear that without modernizing its underlying IT infrastructure, many of the remedies OPM put in place were becoming outmoded in the face of ever more sophisticated cyberattacks.

That was especially apparent to Guy Cavallo, who arrived at OPM in the fall of 2020 as principal deputy CIO after leading sweeping IT modernization initiatives at the Small Business Administration (SBA) and before that at the Transportation Security Administration (TSA). He was named OPM’s CIO in July 2021.

Recognizing new cyber challenges

“We looked at the on-premises cyber tools that OPM was running since the breach and saw while they were effective, with today’s advancements in AI and cyber capabilities, they weren’t keeping up with the attack vectors we’re facing today,” said Cavallo in a recent interview. Threat actors had shifted to identity-based attacks using more sophisticated tactics, requiring advanced detection and response solutions.

Guy Cavallo, CIO, OPM

“We knew with AI coming and the Executive Order on Cybersecurity requiring logging to get visibility into your environment, investing in on-premises hardware would be a never-ending battle of running out of storage space,” he concluded.

The cloud was “the ideal elastic storage case for that,” he continued. But it also offered other critical solutions. The cloud was the ideal way to host applications to ensure “that we’re always up to date on patching and versions, leaving that to the cloud vendors to take care of — something that the federal government struggles with,” he said.

Checklist for a better solution

Cavallo wanted to avoid the mistake he had seen other organizations make, trying to weave all kinds of tools into an enterprise security blanket. “It’s incredibly difficult to integrate them and not have them attack each other — or also not have gaps between them,” he said. “I’m a believer that simpler is much better than tying together best-of-breed from multiple vendors.”

James Saunders, CISO, OPM

That drove Cavallo and OPM Chief Information Security Officer James Saunders to pursue a fundamental shift to a cloud-native cybersecurity platform and “making that the heart of our security apparatus,” said Saunders.  

After reviewing the options, they elected to move to Microsoft’s Azure cloud-based cybersecurity stack “so that we can take advantage of the edge of cloud, and cloud in general, to collect data logs.” Additionally, it would mean “We didn’t have to worry about software patching and ‘Do I have enough disk space?’ It also allows us to springboard into more advanced capabilities such as artificial intelligence,” Saunders said.

Because OPM exchanges data with many federal agencies that rely on different data systems, Cavallo and Saunders also implemented a cloud access security broker (CASB) — a security policy enforcement engine that monitors and manages security activity across multiple domains from a single location. It also “enables our security analysts to be more efficient and identify threats in a more holistic manner,” Saunders explained.
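The idea behind a CASB is easier to see in miniature: a single decision function applies one set of policies to access events coming from many different cloud domains. The Python sketch below is purely hypothetical; the app names, event fields and rules are invented for illustration and do not describe OPM’s actual policies or any vendor’s product.

from dataclasses import dataclass

@dataclass
class AccessEvent:
    user: str
    app: str              # cloud application being accessed
    device_managed: bool  # whether the endpoint is agency-managed
    action: str           # e.g. "view", "download", "upload"

# Illustrative allow-list of sanctioned cloud applications.
SANCTIONED_APPS = {"mail", "docs", "hr-portal"}

def evaluate(event: AccessEvent) -> str:
    """Apply one shared policy set to an event from any cloud domain."""
    if event.app not in SANCTIONED_APPS:
        return "BLOCK: unsanctioned application"
    if event.action == "download" and not event.device_managed:
        return "BLOCK: download to unmanaged device"
    return "ALLOW"

if __name__ == "__main__":
    events = [
        AccessEvent("analyst1", "docs", device_managed=True, action="download"),
        AccessEvent("analyst2", "file-share-x", device_managed=False, action="upload"),
    ]
    for event in events:
        print(event.user, event.app, "->", evaluate(event))

The value lies less in any individual rule than in having one enforcement point, and one audit trail, for every cloud service the agency touches.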

Added benefits

“There is a general misconception that you can only use cloud tools from the host vendor to monitor and protect that environment. We found that leveraging cyber defenses that span multiple clouds is a better solution for us instead of having multiple different tools performing the same function,” Cavallo added.

Microsoft’s extensive threat intelligence ecosystem and the ability to reduce the number of contracts OPM has to maintain were also critical factors in their decision to move to Azure, Saunders added.

The pay-off

The migration from on-premises infrastructure to the cloud was a complex process involving the retirement of more than 50 servers and the decommissioning of multiple storage areas and SQL databases, according to Saunders. The most challenging aspect, though, was not the technology but managing the transition with the workforce. Extensive training and organizational change management were as critical as the technical migration to the success of the transition.

According to Saunders, the benefits didn’t take long to become apparent:

  • Enhanced visibility: OPM now has a more comprehensive view of its security posture, thanks to the centralized platform and increased log collection.
  • Improved threat detection and response: AI-powered tools and Microsoft’s threat intelligence help OPM identify and respond to threats faster and more effectively.
  • Reduced costs and complexity: Cloud-native solutions eliminate the need to buy expensive on-premises hardware and software, while also simplifying management and maintenance.
  • Increased scalability and agility: The cloud platform allows OPM to easily scale its security infrastructure as needed to meet evolving threats and business requirements.

Collectively, those and related cloud benefits are also helping OPM make faster headway in meeting the administration’s zero-trust security goals.

Lessons learned

Perhaps one of the most important benefits, according to Cavallo, is being able to demonstrate to the agency’s leadership the magnitude and nature of today’s threat landscape, and how much better prepared OPM now is to defend against it.

“When James and I showed them the visibility that we have from all those logs, it was a drop-the-mic moment for them. We can say we blocked 4,000 attacks in the last hour, but until you actually show them a world map and our adversaries trying to get into OPM, then be able to click and show the real details of it — those threats get lost in the noise,” he said.

“My recommendation at the CIO level is, this is a better mousetrap. But you can’t just expect people to flock to it. You have to go show them why it’s a better mousetrap.”

Among the other lessons Cavallo recommends to fellow IT leaders:

  • Focus on simplicity: Choose a single, integrated security platform to avoid the complexity of managing multiple tools.
  • Invest in training: Ensure your staff is trained and familiar with new cloud-native security tools and processes.
  • Start small and scale gradually: Begin with a pilot project and gradually migrate your security infrastructure to the cloud.
  • Communicate effectively: Clearly explain the benefits of cloud-native security to your stakeholders and address any concerns.

This report was produced by Scoop News Group for FedScoop as part of a series on technology innovation in government, underwritten by Microsoft Federal.

How cloud modernization helps FERC streamline its regulatory processes
https://fedscoop.com/how-cloud-modernization-helps-ferc-streamline-its-regulatory-processes/
Mon, 29 Jan 2024 20:30:00 +0000

A novel tech-challenge approach helped IT leaders at the Federal Energy Regulatory Commission start the overhaul of its legacy applications and improve customer service.

Upgrading and expanding the nation’s electrical grid isn’t just about meeting America’s growing demand for electricity; it’s about powering the economy, strengthening national security, and ensuring a cleaner future for generations to come. But because so much of that infrastructure lies in private hands, orchestrating that effort requires extraordinary coordination.

That is one of the roles of the Federal Energy Regulatory Commission (FERC), an independent agency within the Department of Energy that regulates the interstate transmission of electricity, natural gas, and oil. FERC also reviews and licenses liquefied natural gas terminals, hydropower projects, and pipelines.

Ensuring that the companies building and operating power plants, pipelines and transmission lines adhere to safety standards, comply with environmental laws, and abide by market-based pricing guidelines requires an extensive review and approval process. And because FERC relies on approximately 1,570 employees to perform that work, technology plays a critical role in helping the agency keep on top of all those entities’ requests.

The challenge: Legacy technology

Michelle Pfeifer, Director of Solutions Delivery and Engineering, FERC.

FERC’s technology systems, however, like those at many federal agencies, have been hard-pressed to keep up with ongoing and emerging demands. Most of the systems supporting the agency’s core applications are more than ten years old and stove-piped, according to Michelle Pfeifer, Director of Solutions Delivery and Engineering.

Among other challenges, the workload management systems used to process and manage filings from regulated entities operate on outdated, customized platforms, leading to inefficiencies in tracking and managing the significant number of filings the agency must handle, said Pfeifer, who joined FERC four years ago. “We have done some updates, but there are a significant number of requests for refresh or modernization that have not been addressed. Additionally, data had to be entered into multiple systems, compounding workload challenges,” she said.

The search for a better solution

FERC’s IT team recognized the solution required more than a technology refresh. So they decided to launch an “application layer modernization program to address pent-up demand from our customers, address the stovepipe nature of multiple applications, and do it more quickly and flexibly through an agile delivery process. And we definitely wanted a cloud-based solution,” she said. “We also were looking at — instead of custom development, which is what we had — going to more of a low-code, no-code solution that gives us more pre-built capability.”

After evaluating a series of vendor demonstrations and completing the acquisition process, FERC’s IT team selected Microsoft’s Power Platform, a set of low-code tools that helps create and automate solutions. Following an application rationalization review, FERC defined a phased approach to modernizing its applications. The first phase, now complete, delivered a Virtual Agenda system that supports the Commission’s voting process on energy matters. FERC is now in the second phase, migrating its workload management and hydro project systems. All the modernized systems run in Microsoft Azure Government Community Cloud (GCC) environments, according to Pfeifer.

Wholesale improvements 

The first phase of modernization efforts, which went live in August, has already led to improvements for FERC employees, according to Pfeifer.

“The biggest improvement areas were greater integration of the workflows within the new system,” she said. Right away, there was less rekeying of data and fewer manual errors. Another significant improvement was “the automated generation of fields or documents that had previously been done manually,” she explained.

The new layer of automated workflow tracking provides more comprehensive visibility into the status of FERC dockets and reviews, which eventually flow up for final decisions by FERC’s five-member board of commissioners. The new system has replaced and consolidated a separate set of Microsoft SharePoint sites used by the chairman and the commissioners’ staff to track projects in circulation before coming up for Commission decisions.
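To picture what that consolidation means in data terms, here is a hypothetical Python sketch of a single docket record whose stage, history and due date live in one workflow system rather than in scattered spreadsheets and SharePoint lists. The stage names, docket number and fields are assumptions made for illustration, not FERC’s actual schema.

from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class Stage(Enum):
    INTAKE = "intake"
    STAFF_REVIEW = "staff review"
    IN_CIRCULATION = "in circulation"
    COMMISSION_DECISION = "commission decision"

@dataclass
class Docket:
    docket_id: str
    due_date: date
    stage: Stage = Stage.INTAKE
    history: list = field(default_factory=list)

    def advance(self, new_stage: Stage) -> None:
        """Record each transition so status is visible in one place."""
        self.history.append((self.stage, date.today()))
        self.stage = new_stage

# One shared record replaces per-office spreadsheets and SharePoint lists.
# The docket number below is illustrative only.
docket = Docket("ER24-0001", due_date=date(2024, 9, 30))
docket.advance(Stage.STAFF_REVIEW)
docket.advance(Stage.IN_CIRCULATION)
print(docket.docket_id, docket.stage.value, "due", docket.due_date)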

Externally, as part of future phases, regulated entities will find it easier to submit filings and requests, said Pfeifer. She acknowledged there’s more work to be done to improve the overall user experience for FERC’s customers. However, the cloud-based applications are already improving the agency’s ability to maintain the applications and analyze data associated with Commission proceedings, and they put FERC in a stronger position to leverage AI, she said.

Lessons learned

One of the key lessons that helped accelerate FERC’s modernization efforts, according to Pfeifer, was using the acquisition process differently.

“We used some more advanced acquisition techniques — we requested a demo, for instance, as well as did a ‘Tech Challenge’ — which allowed us to see not just a paper document in response to a proposal, but a demo of a solution. That allowed us to work with (different vendors’ teams) to see how they would work together.” The tech challenge also included a tech talent component on top of the demo, “where vendors had to change something (so we could) see how they would go about doing that, what experience they had and what the team was capable of configuring and delivering,” she said.

Another lesson she stressed was the importance of business process mapping and reengineering “so that we could help our customers (define) what they want the processes to do. How do they want the processes to improve? We wanted to model that technically, not model the old processes that they weren’t happy with.”

That mapping also helped the IT team implement the modernization efforts in phases, which was essential to ensuring the transition process went smoothly and minimized disruption to FERC’s mission.

Added benefits

While measuring the impact of modernizing and migrating to cloud services isn’t always straightforward, Pfeifer sees a number of operational benefits.

“Just to keep track of the status of things requires a lot of side spreadsheets and reports that aren’t part of the actual workflow (and will be incorporated into the FERC workload processing). Having a more streamlined workflow process also allows the user base to understand the due dates and ensure they’re meeting them, which once required substantial effort from the program offices to do that within the existing applications,” she explained. 

“The other area that I see a lot of benefit in is consistency in how things are defined and managed, and handled across the different offices within FERC,” she added. That consistency, in turn, leads to greater accuracy in decision-making.

Finally, Pfeifer sees these back-end improvements laying the foundation for modernizing the agency’s front-end experience for the regulated entities that rely on FERC, in line with the administration’s executive order on transforming the federal customer experience and service delivery.

“Modernization is hard to achieve because you have to replicate the capabilities of the existing systems — and improve on those capabilities at the same time,” concluded Pfeifer. “That said, sometimes the technical solution is the easier part of the solution.”

This report was produced by Scoop News Group for FedScoop as part of a series on technology innovation in government, underwritten by Microsoft Federal.
