Insights Archives | FedScoop
https://fedscoop.com/category/insights/

FedScoop delivers up-to-the-minute breaking government tech news and is the government IT community's platform for education and collaboration through news, events, radio and TV. FedScoop engages top leaders from the White House, federal agencies, academia and the tech industry both online and in person to discuss ways technology can improve government, and to exchange best practices and identify how to achieve common goals.

How the ‘third wave’ of AI is transforming government operations
https://fedscoop.com/how-the-third-wave-of-ai-is-transforming-government-operations/ | May 23, 2024

Microsoft’s William Chappell talks about the progression from ‘bespoke to foundational AI’ and how it drives scientific discovery and enhances cybersecurity across the government.

Over the years, numerous new technologies and tools have transformed the way government work is accomplished. But artificial intelligence is proving to be a real game-changer, revolutionizing the way agencies execute their missions.

During a presentation at AITalks in April 2024, Dr. William Chappell, VP and CTO of Microsoft’s Strategic Missions and Technologies Division, discussed the three waves of AI in government and what the evolution from bespoke to foundational AI means for agency operations.

The three waves of AI

According to Chappell, the first wave of AI in government was characterized by efforts to impart human expertise onto hardware, a painstaking process that relied heavily on scripted instructions.

Then, the advent of GPUs ushered in the second wave, accelerating AI’s capabilities beyond human levels in recognizing events and objects. “However, it was the underpinning hardware advancements that truly propelled AI forward, setting the stage for the third wave,” Chappell said.

The third wave brings a significant shift toward contextual adaptation, where AI models possess a broad understanding rather than being tailored to specific applications. According to Chappell, this shift — predicted by the Defense Advanced Research Projects Agency in 2017 — marks a turning point in AI development. “No longer confined to bespoke models, AI now serves as a foundational tool with myriad applications across diverse domains,” he said.

What the third wave looks like for government

In this third wave, AI can perceive, learn, abstract and reason, paving the way for unprecedented advancements in government services. At the forefront of this transformation are foundational models like Microsoft’s Azure Quantum Elements, which enables scientists to accelerate the discovery of new materials with unparalleled speed and accuracy. By combining high-performance computing (HPC) with AI, researchers can sift through vast datasets and pinpoint promising candidates for further exploration.

Chappell pointed to a recent example where Microsoft’s Azure Quantum team joined forces with the Pacific Northwest National Laboratory to build a better battery by “developing new materials on a timescale that we haven’t seen before.”

In a matter of months, the group was able to screen more than 30 million potential materials digitally, narrow them down to 20 candidates, and then develop a revolutionary battery with 70% less lithium. “In the past, this would have taken years to develop,” Chappell said. “Examples like this will help change some of the biggest challenges that we have as a country.”
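The screening pattern Chappell describes, scoring an enormous candidate pool cheaply and reserving expensive simulation and lab work for a short list, can be sketched in a few lines of Python. The scorer below is a random stand-in; the actual Azure Quantum Elements pipeline pairs HPC physics simulation with trained surrogate models.

```python
# Toy screening funnel: cheaply score a large candidate pool, keep the top 20
# for expensive follow-up. The random scorer is a stand-in for the ML surrogate
# models and HPC simulations used in the actual materials-discovery pipeline.
import heapq
import random

random.seed(0)

def cheap_score(material_id: int) -> float:
    """Stand-in for a fast surrogate model estimating a property of interest."""
    return random.random()

candidate_ids = range(1_000_000)  # scaled down; the real effort screened ~30M
shortlist = heapq.nlargest(20, candidate_ids, key=cheap_score)
print(f"{len(shortlist)} candidates advance to detailed simulation and lab work")
```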

And the impact of AI in this third wave extends beyond scientific endeavors. Microsoft’s Planetary Computer, for instance, harnesses vast datasets from sources such as NASA, NOAA and the European Space Agency. Through natural language interactions, users can effortlessly navigate and glean insights from mountains of data, revolutionizing how information is accessed and utilized. While the demonstration focused on geospatial applications that make satellite data more valuable, it is representative of how any government agency can find answers within the vast amounts of data it possesses.

In addition, AI’s capabilities extend to cybersecurity, where it has become instrumental in identifying vulnerabilities in code that elude human detection. “That shift has happened over the last six months — and that is a very big deal for the U.S. government,” Chappell said. Initiatives like DARPA’s AI Cyber Challenge, which Microsoft supports, illustrate AI’s power in fortifying cyber defenses, offering a glimpse into a future where AI safeguards critical infrastructure.

As agencies navigate this new world of AI, Chappell says collaboration, experimentation, and ethical stewardship will be the guiding beacons, ensuring AI serves as a force for positive change in government operations. “The era of experimentation beckons, where users are empowered to shape AI’s trajectory and leverage its capabilities to fulfill their missions,” he said.

Learn more about how Microsoft can empower your organization with advanced AI.

This article was produced by Scoop News Group and sponsored by Microsoft.

How the State Department used AI and machine learning to revolutionize records management
https://fedscoop.com/how-the-state-department-used-ai-and-machine-learning-to-revolutionize-records-management/ | May 16, 2024

A pilot approach helped the State Department streamline the document declassification process and improve the customer experience for FOIA requestors.

In the digital age, government agencies are grappling with unprecedented volumes of data, presenting challenges in effectively managing, accessing and declassifying information.

The State Department is no exception. According to Eric Stein, deputy assistant secretary for the Office of Global Information Services, the department’s eRecords archive system currently contains more than 4 billion artifacts, which include emails and cable traffic. “The latter is how we communicate to and from our embassies overseas,” Stein said.

Over time, however, department officials must determine what can be released to the public and what stays classified — a time-consuming and labor-intensive process.

Photo: Eric Stein, deputy assistant secretary, Office of Global Information Services, U.S. Department of State

The State Department has turned to cutting-edge technologies like artificial intelligence (AI) and machine learning (ML) to find a more efficient solution. Through three pilot projects, the department has successfully streamlined the document review process for declassification and improved the customer experience when it comes to FOIA (Freedom of Information Act) requests.

An ML-driven declassification effort

At the root of the challenge is Executive Order 13526, which requires that classified records of permanent historical value be automatically declassified after 25 years unless a review determines an exemption. For the State Department, cables are among the most historically significant records produced by the agency. However, current processes and resource levels will not work for reviewing electronic records, including classified emails, created in the early 2000s and beyond, jeopardizing declassification reviews starting in 2025.

Recognizing the need for a more efficient process, the department embarked on a declassification review pilot using ML in October 2022. Stein came up with the pilot idea after participating in an AI Federal Leadership Program supported by major cloud providers, including Microsoft.

For the pilot, the department used cables from 1997 and created a review model based on human decisions from 2020 and 2021 concerning cables marked as confidential and secret in 1995 and 1996. The model uses discriminative AI to score and sort cables into three categories: those it was confident should be declassified, those it was confident shouldn’t be declassified, and those that needed manual review.
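Conceptually, the triage Stein describes is a three-way split on a classifier's confidence score. The sketch below is a hypothetical illustration only: the department's real model, features and thresholds are not public, and the keyword-based scorer is a toy stand-in for a discriminative model trained on prior human review decisions.

```python
# Hypothetical sketch of three-way declassification triage. The scorer is a toy
# stand-in; the real system uses a discriminative model trained on 2020-2021
# human review decisions, with thresholds that are not public.

def score_declassify(cable_text: str) -> float:
    """Toy stand-in: return an estimated probability that release is safe."""
    return 0.99 if "unclassified" in cable_text.lower() else 0.50

def triage(cable_text: str,
           release_at: float = 0.95,
           withhold_at: float = 0.05) -> str:
    p = score_declassify(cable_text)
    if p >= release_at:
        return "declassify"        # model is confident release is safe
    if p <= withhold_at:
        return "keep-classified"   # model is confident it must stay classified
    return "manual-review"         # uncertain cases go to human reviewers

print(triage("UNCLASSIFIED routine administrative traffic"))  # declassify
print(triage("Subject: bilateral negotiation update"))        # manual-review
```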

According to Stein, for the 1997 pilot group of more than 78,000 cables, the model performed the same as human reviewers 97% to 99% of the time and reduced staff hours by at least 60%.

“We project [this technology] will lead to millions of dollars in cost avoidance over the next several years because instead of asking for more money for human resources or different tools to help with this, we can use this technology,” Stein explained. “And then we can focus our human resources on the higher-level and analytical thinking and some of the tougher decisions, as opposed to what was a very manual process.”

Turning attention to FOIA

Building on the success of the declassification initiative, the State Department embarked on two other pilots, running from June 2023 to February 2024, to enhance its FOIA processes.

Like cable declassification, handling a FOIA request is a highly manual process. According to Stein, some requests are a single sentence; others run multiple pages. But no matter the length, a staff member must acknowledge the request, advise whether the department will proceed with it, and then manually search for terms from the request in different databases to locate the relevant information.

Using the lessons learned from the declassification pilot, Stein said State Department staff realized there was an opportunity to streamline certain parts of the FOIA process by simultaneously searching what was already in the department’s public reading room and in the record holdings.

“If that information is already publicly available, we can let the requester know right away,” Stein said. “And if not, if there are similar searches and reviews that have already been conducted by the agency, we can leverage those existing searches, which would result in a significant savings of staff hours and response time.”
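The short-circuit logic Stein outlines, checking what is already public before reusing or launching searches, amounts to a simple routing function. Everything below (the field names and the keyword-overlap test) is a hypothetical simplification of what is, in practice, full-text search across multiple record systems.

```python
# Hypothetical sketch of FOIA request routing: already public > reuse a prior
# search > launch a new search. Keyword overlap stands in for the department's
# actual full-text search across its public reading room and record holdings.

def route_request(terms: set[str],
                  public_docs: list[dict],
                  prior_searches: list[set[str]]) -> tuple[str, list]:
    hits = [d for d in public_docs if terms & set(d["keywords"])]
    if hits:
        return "already-public", hits      # point the requester to them now
    if any(terms <= prior for prior in prior_searches):
        return "reuse-prior-search", []    # leverage an existing review
    return "new-search", []                # fall back to a fresh search

public_docs = [{"id": "doc-1", "keywords": ["visa", "policy", "2019"]}]
prior_searches = [{"embassy", "cable", "visa", "policy"}]
print(route_request({"visa", "policy"}, public_docs, prior_searches))
```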

Beyond internal operations, the State Department also sought to improve the customer experience for FOIA requesters by modernizing its public-facing website and search functionalities. Using AI-driven search algorithms and automated request processing, the department aims to “find and direct a customer to existing released documents” and “automate customer engagement early in the request process.”

Lessons learned

Since launching the first pilot in 2022, team members have learned several things. The first is to start small and provide the space and time to become familiar with the technology. “There are always demands and more work to be done, but to have the time to focus and learn is important,” Stein said.

Another lesson is the importance of collaboration. “It’s been helpful to talk across different communities to not only understand how this technology is beneficial but also what concerns are popping up—and discussing those sooner than later,” he said. “The sooner that anyone can start spending some time thinking about AI and machine learning critically, the better.”

A third lesson is the need to “continuously train a model because you can’t just do this once and then let it go. You have to constantly be reviewing how we’re training the model (in light of) world events and different things,” he said.

These pilots have also shown how the technology will allow State Department staff to better respond to other needs, including FOIA requests. For example, someone may ask for something in a certain way that doesn’t match how it’s referred to internally.

“This technology allows us to say, ‘Well, they asked for this, but they may have also meant that,’” Stein said. “So, it allows us to make those connections, which may have been missing in the past.”

The State Department’s strategic adoption of AI and ML technologies in records management and transparency initiatives underscores the transformative potential of these tools. By starting small, fostering collaboration and prioritizing user-centric design, the department has paved the way for broader applications of AI and ML to support more efficient and transparent government operations.

The report was produced by Scoop News Group for FedScoop, as part of a series on innovation in government, underwritten by Microsoft Federal.  To learn more about AI for government from Microsoft, sign up here to receive news and updates on how advanced AI can empower your organization.

Reframing data management for federal agencies
https://fedscoop.com/reframing-data-management-for-federal-agencies/ | April 30, 2024

Norseman Defense Technologies CTO David Hoon explains why adopting an ‘event-driven’ data processing model offers a superior way to manage the growing volume of data at the edge of agency networks.

David Hoon is the Chief Technology Officer at Norseman Defense Technologies.

In the ever-expanding realm of data management, federal agencies face a pressing need to rethink their approach to effectively processing, analyzing and leveraging vast amounts of data.

As enterprises generate growing volumes of data at the edge of their networks, they face an increasing disconnect: as much as 80% of their data lives at the edge of the enterprise, while as much as 80% of their computing takes place in the cloud.

That’s why chief information and data officials within federal agencies must recognize the necessity of adopting a different data processing model. One model gaining increasing attention involves moving more of an enterprise’s computing power to the edge of its network operations — and transitioning from a “transaction-driven” data processing model to an “event-driven” model.

Embrace an ‘Uber-like’ data processing model

Traditionally, data processing has been transaction-driven, where systems respond to individual requests or transactions. However, this model is proving increasingly inadequate in today’s fast-paced, data-intensive distributed environments.

In an event-driven architecture, applications respond to events or triggers, allowing for real-time processing and decision-making.

Uber provides a constructive example: Instead of requesting a car through a central dispatch office—triggering a transaction—a rider uses their Uber app to register a ride request. That request translates into an event notification. Uber’s application watches for and identifies such events continuously and notifies a multitude of available drivers simultaneously. The model results in connecting the most appropriate resource (the nearest available driver) to fulfill the request.

Similarly, an “event-driven” notification approach allows an enterprise to process data events locally, more quickly and more cost-effectively.

Leverage cloud-native data streaming platforms

One such solution making revolutionary headway in today’s data streaming era is Confluent’s cloud-native data streaming platform, built on Apache Kafka. Kafka facilitates the seamless movement of data between edge devices and the cloud. It enables agencies to manage data in real time, ensuring timely insights and actions based on evolving events. It also allows IT teams to capitalize on data streaming from the growing array of IoT sensors, mobile devices and endpoint devices generating enterprise data.

Kafka’s capabilities extend beyond traditional transactional systems, allowing agencies to architect applications that are inherently event-driven. By adopting Kafka, agencies can also unlock new possibilities for data processing, analytics, and decision-making at scale.
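For illustration, a minimal event-driven exchange with the confluent-kafka Python client looks like the sketch below. The broker address, topic name and payload shape are hypothetical; production deployments add schema management, partitioning strategies and stronger delivery guarantees.

```python
# Minimal event-driven sketch using the confluent-kafka Python client.
# Broker address, topic name and payload are hypothetical placeholders.
import json
from confluent_kafka import Consumer, Producer

TOPIC = "edge-sensor-events"  # hypothetical topic for edge-device events

# Edge device side: publish an event instead of calling a central service.
producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce(TOPIC, key="device-42",
                 value=json.dumps({"reading": 17.3, "unit": "C"}))
producer.flush()

# Consumer side: any interested service reacts to events as they arrive.
consumer = Consumer({"bootstrap.servers": "localhost:9092",
                     "group.id": "edge-analytics",
                     "auto.offset.reset": "earliest"})
consumer.subscribe([TOPIC])
msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print("handling event:", json.loads(msg.value()))  # real-time processing
consumer.close()
```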

Partner for success

Adopting this more modern approach requires looking at data and analytic flows differently. So, it helps to work with experienced companies like Norseman Defense Technologies, which has played a pivotal role in helping defense and civilian agencies craft the most appropriate implementation strategies. Norseman offers expertise, tools, and platforms to support agencies in their journey toward edge computing and event-driven architecture.

Norseman’s capabilities span from building proofs of concept to deploying production-ready solutions tailored to the unique needs of federal agencies. In addition, thanks to partnerships with major providers like HP, Intel and Microsoft, Norseman is well-equipped to empower agencies with cutting-edge technologies and best practices. For instance, Norseman has two HP Z Workstations in its lab running Intel Xeon processors and Windows 11; these workstations are purpose-built to process large amounts of data, including AI, ML and deep learning workloads.

Ultimately, by deploying more computing power at the edge of their networks and adopting an event-driven analytics architecture, agencies can make better decisions faster and unlock the full potential of their data assets, driving innovation, efficiency and mission success.

And by utilizing cloud-native data streaming platforms and the know-how of experienced industry partners, agencies can better position themselves to capitalize on modern data management practices as data and analytics operate increasingly at the edge. Now is the time for federal officials to embrace the future of data processing and lead with agility and foresight in an increasingly data-driven world.

Learn more about how Norseman Defense Technologies can help your agency embrace a more modern approach to data management at the edge with HP Z Workstations using Intel processors and Windows 11. Z Workstations are purpose-built to process large amounts of data for AI and ML workloads. Visit Norseman.com or email info@norseman.com for our REAL INTELLIGENCE about ARTIFICIAL INTELLIGENCE.

Streamlining aid delivery: Lessons from SBA’s digital modernization journey
https://fedscoop.com/streamlining-aid-delivery-lessons-from-sbas-digital-modernization-journey/ | April 29, 2024

How the Small Business Administration’s pivot to a cloud-based CRM platform helped it navigate through the pandemic and transform its approach to customer service.

America’s more than 32 million small businesses play an indispensable role in driving the U.S. economy. Small businesses account for 43.5% of gross domestic product, employ 61.7 million workers and generate payrolls topping $2.9 trillion, according to government data.

In March 2020, as COVID-19 emerged as a global threat, it became apparent that millions of small businesses were headed into economic peril. While White House officials and lawmakers moved with unusual speed to enact the Coronavirus Aid, Relief and Economic Security (CARES) Act, the task of administering financial aid to small businesses suddenly fell on the U.S. Small Business Administration (SBA).

Legacy challenge

As an independent cabinet agency with fewer than 3,000 employees, the SBA had, until then, managed small business loan and grant applications using an email-based processing and approval system involving shared mailboxes built on Microsoft Outlook. The agency’s outdated backend infrastructure had never been designed — and was ill-equipped — to handle the overwhelming volume of relief requests flooding in from all 50 states once the CARES Act was enacted. Inboxes and storage capacities hit their daily caps almost immediately. Customers started to receive “undeliverable” messages. And SBA employees were unable to keep up with the skyrocketing workloads.

Photo: Brian Quay, SBA Program Manager

SBA’s leadership quickly recognized what many other public and private sector organizations discovered at the onset of the pandemic — to remain effective in an environment of rapidly escalating and fast-changing needs, they needed to move quickly off their existing systems and adopt a more modern, scalable digital solution.

Transformative solution

SBA officials turned to a cloud-based customer relationship management (CRM) platform, Microsoft Dynamics 365. The platform not only offered the scalability and customization the SBA needed but also allowed it to implement a wide range of integrated features, including email automation, auto-routing, metrics recognition, storage optimization, spam prevention, app integration and auditing capabilities.

More fundamentally, the shift to a modern CRM platform enabled the SBA to transition from a series of manual, labor-intensive processes to a more efficient, automated system that could quickly scale to the volume SBA needed.

Improved outcomes

Adopting a modern, cloud-based CRM platform not only helped SBA overcome a host of technology bottlenecks but also resulted in significant improvements in the SBA’s internal operations and customer service. The platform:

  • Centralized all customer interactions and attached documents into a single contact record, saving a significant amount of time previously spent verifying that all required documents had been received.
  • Categorized requests and automated routing, resulting in timelier responses and fewer requests left in limbo.
  • Reduced much of the manual work associated with evaluating requests and eliminated common processing errors, enhancing productivity.
  • Allowed SBA staff to more quickly triage cases and review work origins, notes, updates, and activities that had occurred across multiple teams for faster response.
  • Provided customers with an easier way to submit a standardized inquiry using a convenient web form on my.sba.gov, built on Dynamics 365 Portal, rather than typing out an email. Customers can also schedule appointments through a Microsoft Power Pages portal (appointment.sba.gov); appointments are assigned to SBA staff and fulfilled within the Dynamics 365 Customer Service platform.

By making it easier to integrate apps and implement a knowledge base reference library, the modernization effort also allowed the SBA to consolidate information from various sources and streamline the decision-making process. That effort was further enhanced with the creation of a Tier 1 dashboard for individual users and a Tier 2 dashboard for team leads to track overall caseloads, giving SBA staff the ability to make data-driven decisions faster and adapt to changing circumstances.

Mission modernization

Moving to a scalable, cloud-based CRM platform helped the SBA rally quickly in response to the sudden flood of aid requests. It also catapulted the SBA’s ability to meet its broader mission of serving and supporting small businesses.

In particular, the new platform made it possible for the SBA to manage activities more effectively with — and gain deeper insights about — more than 11 million individuals in its contact list.

“We can come to the campaign tabs [on the dashboard] and see a list of all of the different campaigns that the SBA has created inside of the platform,” explained SBA Program Manager Brian Quay. The software allows SBA staff to roll up all the cases associated with a contact record and even view image files to validate what information has been provided. It also allows SBA staff to see the status and performance of various marketing campaigns and activities.

“We can see…the number of members that were on [a particular] marketing list, how many messages were successfully sent to them, and failures. This is something that has been a huge productivity gain for SBA [staff], who were previously mainly sending those emails out through Outlook without an ability to track success,” Quay said. Altogether, the platform helped SBA create and send more than 50 million templated outreach emails from February to September 2023.

Another dimension of the SBA’s customer service modernization is the implementation of Power BI dashboards natively embedded into Dynamics 365. This allows executives who aren’t trained to use Dynamics to still access the metrics it provides by leveraging Power BI on the web or their mobile devices.  

Within two and a half years, the SBA expanded the platform from four mailboxes to over 200 individual inboxes, used by close to 80 teams with an unprecedented volume of activity. According to recent estimates, the platform now tracks over 20 million cases to date and has resulted in operational cost savings of over $25 million.

Lessons learned

The SBA’s transition from an email-based tracking system to a cloud-based CRM platform yielded several valuable lessons for federal executives considering a similar transformation:

Firstly, the importance of scalability cannot be overstated. In a crisis situation, the ability to quickly scale up operations is crucial, and a flexible digital platform can make all the difference.

Secondly, customization matters. Tailoring the system to the agency’s unique needs ensures maximum efficiency and usability.

Thirdly, integration capabilities are a game-changer. The ability to connect different tools and data sources creates a unified ecosystem, enabling faster decision-making.

Lastly, automation is a key enabler of efficiency. By automating routine tasks, agencies can focus their efforts on high-impact activities and respond swiftly to emerging challenges.

The Small Business Administration’s journey to digital modernization also demonstrates that in a rapidly evolving world, embracing innovative solutions is not just an option; it’s a necessity that empowers organizations to thrive, grow and support those they serve.

The report was produced by Scoop News Group for FedScoop, as part of a series on innovation in government, underwritten by Microsoft Federal.

Keeping public sector data private and compliant with AI
https://fedscoop.com/keeping-public-sector-data-private-and-compliant-with-ai/ | April 18, 2024

Leaders from the United Nations, Google and industry illuminate how Google Workspace and Gemini help ensure data privacy and uphold data security.

Public sector and commercial enterprises are ingesting ever-growing amounts of data into their enterprise operations. That’s placing greater demands on enterprise IT executives to ensure the requisite data privacy and security controls are in place and functioning effectively.

At the same time, executives are also being asked to integrate smarter tools into their operations to help their employees work more productively. 

At Google Cloud Next ’24, Google Cloud experts Ganesh Chilakapati, director of product management, and Luke Camery, group product manager, were joined by executives from the United Nations Population Fund (UNFPA), UK energy retailer OVO and Air Liquide, a global industrial gases supplier, to discuss how Google Cloud’s generative AI capabilities are helping to achieve those objectives.

How Gemini safeguards your data 

Chilakapati and Camery demonstrated some of Gemini’s and Google Workspace’s signature capabilities, emphasizing features such as client-side encryption and comprehensive security frameworks. They also explained what happens to data inside Gemini.

“What is Gemini doing with all this data? How is it providing these customized and targeted responses that are so helpful? Is it learning and training on all of my enterprise data? No, it’s not. All of the privacy commitments we’ve made over the many decades to Google Workspace customers remain true,” said Chilakapati.

“Your data is your data and strictly stays within the workspace data boundary. Your privacy is protected, your content is not used for any other customers, and all of your existing data protections are automatically applied,” he added.

Your data, your trust boundary, managed by you

“Everything happens within your Google Workspace trust boundary. That means you have the ability to control whether or not Gemini stores not only the user prompts but also the generated responses. It’s completely up to you,” added Camery.

“One of the things we’re most excited to announce is the general availability of AI classification for Google Drive. This is a privacy-preserving customer-specific model that you have the option to train on your own specific corpus using your unique data class taxonomy,” said Camery.  “Leveraging AI classification and the guarantees that we’ve built into Gemini itself, you can have a virtuous cycle where you are leveraging AI while protecting your organization from emerging threats.”

Unparalleled security: 5 key takeaways

Chilakapati and Camery stressed how the platform is designed to offer unparalleled security, built on the robust foundations of Google’s secure cloud infrastructure:

·  Enterprise terms of operation: Gemini operates strictly under processor enterprise terms, even when fetching the latest information from the internet, not on consumer controller terms.

·  Client-side encryption extension: Enterprises that have traditionally leveraged client-side encryption to keep sensitive data inaccessible can extend that protection one step further, guarding against access attempts by any unauthorized entity, including other generative AI models (see the sketch after this list).

·  Foundation on secure cloud infrastructure: Gemini is constructed on Google’s secure cloud platform, providing a solid foundation to enhance the overall security posture.

·  Zero-trust architecture: Zero-trust protocols are built in, not bolted on, not just on Google Cloud’s foundation but all the way up the stack to Gemini itself.

·  Sovereign controls integration: Gemini is also seamlessly integrated into an enterprise’s sovereign controls for Google Workspace, ensuring the integrity of data’s digital sovereignty journey, regardless of wherever you are in the world.
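To make the client-side encryption item above concrete: the essential property is that plaintext never leaves the client, so the stored blob is opaque to the provider and to any AI model. The following is a conceptual sketch using the Python cryptography library, not Workspace's actual CSE mechanism, which delegates key management to a customer-controlled key service rather than a local key.

```python
# Conceptual sketch of client-side encryption: plaintext never leaves the
# client. This is NOT Google Workspace CSE's actual implementation, which
# uses a customer-controlled external key service instead of a local key.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # held by the customer, never uploaded
cipher = Fernet(key)

document = b"draft policy memo: sensitive contents"
ciphertext = cipher.encrypt(document)  # all the cloud (or any model) ever sees

# Without the key, the stored blob is opaque to everyone, provider included.
assert cipher.decrypt(ciphertext) == document
print(len(ciphertext), "bytes of opaque ciphertext stored server-side")
```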

How Gemini AI is boosting productivity for the global workforce

Those features are especially important to customers like Soren Thomassen, director of IT solutions at UNFPA, which operates in 150 countries. Thomassen began using Gemini in May 2023 to make chat functionality available to the fund’s entire user base, and he began piloting Gemini for Workspace last November.

“As an agency, safety and privacy is paramount. That’s why we were quick at rolling out the Gemini Chatbot because it’s covered by the same rules and the same controls as with Workspace.”

Thomassen also pointed out how Gemini AI is helping UNFPA’s global workforce work more productively.

“Our users have been using it as a superpower writing assistant,” he said. Project managers spend a lot of time writing proposals.  “Instead of starting out with a blank screen…they can at least have a zero-draft that they can start working with. But the feedback that’s on my heart the most was when I hear those who have English as a second language say that Gemini helps them get their ideas across a little bit more clearly. Gemini (helps) everybody write English perfectly. And I think that’s important for a global organization.”

Jeremy Gibbons, Air Liquide’s digital and IT CTO, and Simon Goldsmith, OVO’s enterprise security and platforms lead, echoed Thomassen’s testament to Gemini’s utility. Each attested to how the strategic deployment of Gemini within their organizations helped bolster productivity and ensure security. A recurrent theme throughout their conversation was the transformative potential of AI in reimagining work securely.

“I like to think of Workspace as kind of a walled garden of Eden,” said Goldsmith. “We want to give our people a really amazing experience in that garden… and allow them to experiment. But at the same time, within that safe environment, Workspace gives us the ability to, at an enterprise level, do the sensitive detective and corrective control work.”

Learn more about how Google Public Sector can help your organization “Kickstart your generative AI journey.”

This article was produced by Scoop News Group and sponsored by Google Public Sector. Google Public Sector is an underwriter of AI Week.

How DOD and Google Public Sector partnered using AI to fight cancer
https://fedscoop.com/how-dod-and-google-public-sector-partnered-using-ai-to-fight-cancer/ | April 17, 2024

With a goal to help pathologists more accurately diagnose cancer, the Department of Defense and Google Public Sector came together to build an augmented reality microscope.

Approximately $1.7 billion of the Department of Defense’s annual budget is spent on cancer as part of a broader effort to improve military health care for more than 9 million eligible beneficiaries. As healthcare professionals and researchers continue to look for ways to better detect, diagnose and treat cancer, AI has emerged as a formidable ally.

One groundbreaking development in pathology and cancer detection is the augmented reality microscope (ARM). During a session at Google Cloud Next ’24, experts discussed how the ARM is poised to revolutionize cancer diagnosis. The initiative is a collaboration between the Departments of Defense and Veterans Affairs (VA), DOD’s Defense Innovation Unit, Google Public Sector and Jenoptik.

The AI-assisted microscope demonstrates not only how AI is increasing the diagnostic accuracy and efficiency of cancer detection, but also how AI can operate on edge devices to support medical and other professionals, allowing them to work locally, independent of internet or cloud connectivity. That’s becoming increasingly critical as the number of experienced healthcare specialists qualified to perform diagnostic evaluations declines in the U.S.

ARM’s impact also extends beyond individual diagnoses. By digitizing tissue samples and harnessing the power of AI, the microscope eliminates geographical barriers, ensuring that patients everywhere have access to the expertise of top-tier pathologists.

A look at the development process

The genesis of the ARM lies in the recognition of a critical challenge faced by pathologists — the meticulous task of analyzing tissue slides, often numbering in the hundreds, to detect cancerous abnormalities. While traditional microscopes are indispensable, they present inherent limitations in terms of efficiency and accuracy, which are compounded by the sheer volume of data pathologists need to process.

The ARM integrates AI directly into the diagnostic process. At its core, the device leverages AI algorithms deployed on the edge to analyze digitized tissue samples in real time. This transformative approach enables pathologists to identify potential abnormalities with unprecedented speed and precision, significantly enhancing diagnostic accuracy.
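As a rough illustration of that real-time analysis, an AI-assisted microscope can be thought of as scoring tiles of a digitized slide and rendering the scores as a heatmap overlay for the pathologist. The tile scorer below is a toy stand-in; the actual ARM runs trained models on dedicated edge hardware.

```python
# Conceptual sketch of tile-based slide scoring rendered as a heatmap overlay.
# The scorer is a toy stand-in; the real ARM runs a trained model on edge
# hardware and projects results into the microscope's field of view.
import numpy as np

def score_tile(tile: np.ndarray) -> float:
    """Toy stand-in: a real system runs a trained classifier on each tile."""
    return float(tile.mean())  # pretend denser tissue scores higher

def heatmap(slide: np.ndarray, tile: int = 64) -> np.ndarray:
    rows, cols = slide.shape[0] // tile, slide.shape[1] // tile
    out = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            out[i, j] = score_tile(slide[i*tile:(i+1)*tile,
                                         j*tile:(j+1)*tile])
    return out  # high-scoring regions get highlighted for the pathologist

slide = np.random.rand(512, 512)  # stand-in for a digitized tissue image
print(heatmap(slide).shape)       # an 8x8 grid of region scores
```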

“The job of pathologists is to make sure that what we do is very accurate and that we can identify the disease. We don’t want to make a mistake,” said Dr. Nadeem Zafar, director of pathology and laboratory medicine service at Veterans Affairs Puget Sound. “This is where the technology comes in, and this is why we are so excited about it.”

The ARM’s development process also illustrates the power of collaboration. “Here at Google… we don’t just want to incrementally improve things like cancer diagnosis; we want to do it at scale,” said Scott Frohman, head of defense programs for Google Cloud. “And this project enabled us to think and connect and do something good for humanity.”

Current and future impacts

Central to the microscope’s functionality is its ability to highlight areas of interest detected by AI algorithms, providing pathologists with guidance during the diagnostic process. In addition, combining AI-driven insights with human expertise will empower healthcare professionals to make more informed decisions with greater confidence.

“Why I’m so excited about this technology is that it will bring so many experts to your desktop — while in the workflow, while in the flow of time,” Dr. Zafar said. “This is not something you have to learn. As long as you have the software… it will start giving you the heatmap and help detect cancer. So this is brilliant.”

In addition, this endeavor’s success underscores the pivotal role of public-private partnerships in driving innovation and advancing healthcare. Through concerted efforts and a shared vision, stakeholders across government, industry, and academia have made the ARM a reality, with tangible benefits for patients and healthcare providers alike.

“We know that we can’t solve these kinds of problems alone. So the partnership that we have with the government has been fantastic for bringing the subject matter expertise, the data, and the commitment to solving this problem with us,” said Frohman. “And it helps us to do the mission that we have at Google — making information available and accessible during cancer and making the human condition better every day.”

Thanks to AI and edge computing, the ARM promises to redefine the standard of care in pathology, offering new hope in the relentless pursuit of a cancer-free future.

Learn more about how Google Public Sector can help your organization “Kickstart your generative AI journey.”

This article was produced by Scoop News Group and sponsored by Google Public Sector. Google Public Sector is an underwriter of AI Week.

Breaking silos worldwide: How Google Cloud is fueling public sector AI, collaboration and innovation
https://fedscoop.com/breaking-silos-worldwide-how-google-cloud-is-fueling-public-sector-ai-collaboration-and-innovation/ | April 16, 2024

Leaders from the United Nations Population Fund, United Nations Office for Project Services and the World Bank share how they leverage Google Workspace and AI to empower collaboration, boost efficiency and drive transformative change.

In an era when technology rapidly reshapes landscapes, public sector agencies are increasingly turning to Google Workspace and AI to transform their operations. These tools enhance efficiency and change how governmental bodies collaborate, access information and serve the public.

During a session on public sector innovation at Google Cloud Next ’24, leaders from the World Bank, the United Nations Population Fund, and the United Nations Office for Project Services shared strategies for boosting productivity by fostering operational consistency and shared understanding. They also stressed the importance of transparency and continuous feedback and focused on the tangible benefits of eased workloads and enhanced efficiencies.

Enhancing collaboration

Public sector agencies operating worldwide are adopting cloud-based collaboration tools to create a more integrated work environment where documents and projects are easily accessible anytime and on any device. This shift not only boosts productivity but also enhances the flexibility of working environments by giving employees more universal access to agency resources and reports from offices around the world. It also helps overcome language barriers by providing translation services. And it helps ensure communications with international partners meet agency standards across different time zones and geographies. 

Justin Waugh, head of platforms, ITG enterprise platforms team at UNOPS, highlighted the transformative impact of Google Workspace in managing extensive infrastructure projects involving frequent account and project turnover. By leveraging Workspace tools like Google Docs and Sheets, UNOPS has streamlined project management and data handling and significantly reduced operational friction while enhancing user experience.

“The key thing to remember is to reduce friction for people using the systems that we’ve got, and we’ve been heavily into the book to do that,” said Waugh.

Waugh’s comments underscored the importance of integrating various Google products within organizational applications. Doing so facilitates more seamless project communication, budgeting and reporting. This strategic integration has allowed UNOPS to maintain standardized procedures across projects, fostering consistency and understanding throughout the organization.

Leveraging AI enterprise search solutions for efficient and confident information access

One of the standout applications of AI within the public sector is improving information access through enterprise search solutions. AI-powered search tools within Google Workspace can easily access vast amounts of data to find relevant documents, emails, and files. This capability is particularly transformative for government agencies, where quickly retrieving and correlating information can influence policy-making and public service delivery.

Raman Pugalumperumal, senior IT officer and lead for AI and ML platforms at the World Bank, discussed how Vertex AI and Google Cloud Search have revolutionized their data management practices. The World Bank, which manages extensive financial and economic analysis datasets, has benefited from the enhanced speed and accuracy these tools provide.

“We can measure things with quantitative information… we’re able to do [certain things] faster, or maybe things which we weren’t able to do — they’re able to do it because of the volume process,” said Pugalumperumal.

Pugalumperumal explained how AI is being used to speed information retrieval, creating a more responsive and productive environment. This shift toward leveraging AI in its operations has unlocked new avenues for globally accessing and sharing the World Bank’s wealth of knowledge, positioning AI as a pivotal asset in its mission to distribute developmental knowledge.

At UNFPA, IT Director Nayanesh Bhandutia said the agency is developing an AI-powered search experience product. “We aim to break the data silos. We don’t want our users to worry about the data source when they’re looking for something,” said Bhandutia.

“This will be very time-saving because now the global population is not going through the pain of finding information.”

Maintaining the flow of multilingual work with AI-assisted translation 

Another significant advantage of integrating AI with Google Workspace in the public sector is overcoming language barriers. AI-driven language translation tools embedded within Google Workspace allow government employees from different linguistic backgrounds to collaborate effectively.

UNFPA’s Bhandutia also highlighted the transformative role of the Gemini AI interface within Google Workspace. Introduced initially to simplify operations, Gemini has evolved to solve more complex challenges, particularly in multilingual settings. The AI-driven tool has been instrumental in helping staff draft clear and concise communications in English, Spanish, French and Arabic.

“The introduction of Gemini has solved the [fluency] problem. Our users are getting more confident, and they’re spending less time making revisions, but we want to take it to the next level. We noticed many potentials,” said Bhandutia.

The potential for AI to extend beyond basic translations to fully integrated document management systems is vast. Bhandutia shared ambitious plans to leverage Gemini AI to automate the generation of critical documents, such as requests for proposals and job descriptions, which would reduce administrative overhead and enhance responsiveness.

For example, teams can use AI to translate documents and emails directly within the Google Workspace environment when collaborating on international aid programs or global policy initiatives. This seamless integration of translation services helps maintain the flow of work without the interruptions typically caused by language differences, fostering stronger connections and more cohesive teamwork.
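For teams reaching Gemini through Google Cloud rather than the built-in Workspace features, a translation request looks roughly like the sketch below using the Vertex AI Python SDK. The project ID and model name are placeholders, and this is an API-level approximation, not the in-Workspace integration UNFPA describes.

```python
# Hedged sketch: requesting a translation from Gemini via the Vertex AI SDK.
# Project ID and model name are placeholders; the in-Workspace Gemini features
# described above are a product integration, not this API call.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="your-project-id", location="us-central1")
model = GenerativeModel("gemini-1.5-pro")  # placeholder model name

prompt = (
    "Translate the following update into French and Arabic, keeping a "
    "formal tone:\n\nThe quarterly field report is due on Friday."
)
response = model.generate_content(prompt)
print(response.text)
```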

“It is a fantastic stepping stone in the technology sector — [the capability] to deliver what people need…this is an excellent step towards accessibility,” said Waugh.

The future of AI and public sector innovation

The ongoing advancements in AI are expected to introduce more sophisticated tools for predictive analytics, supporting complex decision-making and personalized public services. These developments will not only drive greater efficiency within agencies but also enhance the quality of services provided to the public.

By leveraging these tools, government agencies are enhancing their operational capabilities and setting new standards for accessibility, efficiency, and collaboration in public service.

Reimagining search: How AI and Google Search turbocharge patent examinations at USPTO
https://fedscoop.com/reimagining-search-how-ai-and-google-search-turbocharges-patent-examinations-at-uspto/ | April 16, 2024

U.S. Patent and Trademark Office examiners needed a new approach to sifting through mountains of supporting evidence. Leaders from USPTO, Google and Accenture Federal Services discuss how AI and Google Search are solving the challenge.

One of the many challenges government agencies and their employees face is finding the information they need, when they need it, and having confidence that the information is correct and up to date, and that they haven’t missed essential data.

While advances in search technology have provided government employees with more powerful search tools, the dramatic growth of multi-modal data in all its forms has made finding the right information in petabytes of datasets more challenging than ever.

That was the challenge the United States Patent and Trademark Office and its patent examiners were facing, setting the stage for taking a new approach to search, enabled by advanced AI technologies. 

In a strategic partnership with Accenture Federal Services and Google Cloud, USPTO has developed and implemented a comprehensive system to refine its search mechanisms. This initiative has been about upgrading the traditional examination protocols, providing examiners with new, swift and precise search capabilities that respond to the complexity and scale of modern innovations.

At Google Cloud Next ’24, Jonathan Horner, supervisory patent IT specialist at the U.S. Patent and Trademark Office, and Ian Wetherbee, software engineer at Google, joined Anna Hoffman, USPTO lead at Accenture Federal Services, on stage to discuss the agency’s ambitious efforts to leverage AI.

The USPTO’s initiative highlights a broader challenge public agencies face in reviewing mountains of documents, artifacts and existing application decisions—and identifying where else in the world similar work may be underway and what’s verifiable. Until recently, generative AI platforms had limited ability to provide grounded or verifiably sourced content in real time from the internet.

“One of the things we recently announced is Grounding with Google Search,” said Katharyn White, head of public sector marketing for Google Cloud in a podcast from Google Cloud Next.

“Grounded means we know the source that the AI is using to come up with the answer. It’s grounded in a data source that you can check and ensure that is giving you the results that you want. And we’re making that easier.” 

For the USPTO, the need for advanced search capabilities meant first tackling its internal data retrieval and analytics capabilities.

Horner detailed the constitutional roots of patent law and the monumental task of examining each application against all human knowledge. “That’s a lot of information to go through… You’re looking for a needle in a stack of other needles,” Horner said, explaining the enormity of their challenge.

Traditionally, patent examiners relied on Boolean search techniques. However, with the exponential increase in information, these tools became increasingly inadequate for maintaining the high standards required for U.S. patents, said Horner. To address this, the USPTO has turned to AI, deploying tools in production that are not only efficient but also explainable, respecting the office’s duty to the public and applicants.

Hoffman discussed the journey, which started in 2019 with a small prototype aimed at demonstrating that AI could meet these challenges. She mentioned conducting dozens of interviews and workshops, deploying a modern Google infrastructure and launching a prototype within three months—a pace unheard of in federal government operations. The prototype focused on finding prior art — evidence that an invention might already exist — that examiners might otherwise have missed. The pilot paved the way for production features like showing “more like this” documents, enabling examiners to find similar documents more effectively.

“This feature became used by examiners immediately, which allowed us to run with a much bigger and more robust AI user interface directly similar to the examiner search system called Similarity Search,” Hoffman added. 

Google’s Wetherbee emphasized the necessity of “supporting the full Boolean search syntax as the model’s input.” A robust data collection process involved over a million data points from human raters and a patent corpus of over 170 million documents.

“There are hundreds of millions of citations inside patents. It’s a huge corpus of over two terabytes of text content…We were able to process all of this human-rated data and the patent data using Google infrastructure and turn that into training data to train our models,” said Wetherbee.
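The “more like this” retrieval pattern Hoffman mentions can be illustrated at toy scale with TF-IDF vectors and cosine similarity. USPTO’s Similarity Search is trained on examiner citations over a far larger corpus; this sketch shows only the shape of the ranking step.

```python
# Illustrative "more like this" ranking with TF-IDF and cosine similarity.
# USPTO's Similarity Search uses models trained on examiner citations at a
# vastly larger scale; this only demonstrates the retrieval pattern.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [  # stand-ins for prior-art documents
    "battery electrode with reduced lithium content",
    "solid-state battery using sodium-ion chemistry",
    "image sensor with stacked photodiode layers",
]
query = "low-lithium battery electrode composition"

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(corpus)
query_vector = vectorizer.transform([query])

scores = cosine_similarity(query_vector, doc_vectors)[0]
for score, doc in sorted(zip(scores, corpus), reverse=True):
    print(f"{score:.2f}  {doc}")  # most similar prior art ranked first
```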

Horner reiterated that despite technological advancements, the examiner is “still in the driver’s seat. All of these tools are based on an examiner’s ability to guide the AI towards what it is looking for, and that’s very important to us.” It’s a symbiotic relationship where AI extends the reach of human capability rather than replacing it.

Adopting these AI tools signifies a broader shift within the federal landscape—embracing cutting-edge technology to ensure accuracy and efficiency in governmental functions. It also sets an example for other federal agencies considering a similar path toward digital transformation.

Learn more about how Google Public Sector can help your organization “Kickstart your generative AI journey.”

This article was produced by Scoop News Group and sponsored by Google Public Sector. Google Public Sector is an underwriter of AI Week.

How Google Cloud AI and Assured Workloads can enhance public sector security, compliance and service delivery at scale
https://fedscoop.com/how-google-cloud-ai-and-assured-workloads-can-enhance-public-sector-security-compliance-and-service-delivery-at-scale/ | April 15, 2024

Google Cloud’s expanding AI capabilities empower government agencies to better manage complex security, regulatory and data privacy challenges.

The public sector’s IT modernization journey into the cloud is taking a new and revolutionary turn as agency leaders grapple with how to harness AI’s power to help them securely manage the volume and velocity of their workloads.

One challenge that remains at the forefront of those efforts is ensuring that today’s increasingly dynamic and distributed IT environments continue to meet the government’s complex security, regulatory and data privacy compliance rules — while learning how best to capitalize on AI’s potential to serve the public.

Google Cloud’s understanding and recognition of those challenges was widely reflected in a series of sweeping announcements at last week’s Google Cloud Next ’24 that promise new levels of security, flexibility and AI-assisted capabilities to Google Cloud’s public sector customers.

Building AI capabilities within protected workspaces

When it comes to securely managing public sector data, agencies using Google Cloud gain immediate benefits by building on top of its foundational architecture. Because the architecture was built for the cloud and also incorporates a substantial portion of federal security controls, it’s possible to demonstrate security compliance and obtain operating authority in weeks instead of months when folding in applications like Workspace or AI models like Gemini.

Another way agencies can enhance the security of their workloads is by using Google Cloud Assured Workloads, which also have foundational government security compliance assurances built in, according to a panel of technology experts speaking at Google Cloud Next ’24.

The panelists, representing NASA, Palo Alto Networks, SAP and Google Cloud, argued that using zero-trust and compliance-as-a-code technologies has become essential to creating and maintaining easily reproducible compliant workload environments. That’s in part because of the diversity of government agency compliance requirements, from FedRAMP to the Department of Defense Impact Level 2, 4, and 5 security controls. 

By deploying workloads in pre-certified, software-defined environments set up to limit activity to compliant products and restrict where data can flow and who can access it, agencies can better ensure their workloads meet government requirements.
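Programmatically, standing up one of these pre-certified environments looks roughly like the sketch below, which uses the google-cloud-assured-workloads Python client. The organization ID, billing account and region are placeholders, and the exact field and enum names should be verified against the current API reference.

```python
# Hedged sketch: creating a FedRAMP Moderate boundary with the Assured
# Workloads Python client (google-cloud-assured-workloads). Org ID, billing
# account and region are placeholders; verify fields against the API docs.
from google.cloud import assuredworkloads_v1

client = assuredworkloads_v1.AssuredWorkloadsServiceClient()

workload = assuredworkloads_v1.Workload(
    display_name="fedramp-moderate-boundary",
    compliance_regime=(
        assuredworkloads_v1.Workload.ComplianceRegime.FEDRAMP_MODERATE
    ),
    billing_account="billingAccounts/000000-000000-000000",  # placeholder
)

operation = client.create_workload(
    request=assuredworkloads_v1.CreateWorkloadRequest(
        parent="organizations/123456789/locations/us-central1",  # placeholder
        workload=workload,
    )
)
print(operation.result().name)  # projects created under the compliant boundary
```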

“Moving to Assured GCP is not just an upgrade; it’s a transformational leap forward,” said Collin Estes, the CIO of MRI Technologies working at NASA.

He pointed to two benefits: The “ability to generate compliant documentation as both a product of these large language models as well as helping us produce very well-structured definitions of what we’re doing, based on your actual implementations within Google Cloud. It is not a human saying, here’s what we do. It is us generating what we do from our environment. I think that’s going to really change the game in terms of how federal agencies manage risk across these portfolios.”

Among other benefits, the panelists pointed to:

Streamlining software development – Transitioning to Assured GCP allows government bodies to leverage and deploy cutting-edge technologies and methodologies, such as containerization and microservices, with unprecedented ease.

Focusing on the mission – By moving to Assured GCP, organizations can shift their focus from the backend to what truly matters—their mission. This shift represents not just an operational change but a philosophical one, where technology becomes an enabler rather than a hurdle in support of agency missions.

According to Palo Alto Networks Senior Manager Michael Clark, another reason for adopting Assured Workloads is the sheer volume of data involved and the compute intensity required to process it. “We’re at that critical pivot point. We’ve been using this data to learn new threats and find zero-day threats so that we can enforce zero trust, improve security protection mechanisms, and map into new areas of innovation for threat detection and automated remediation.”

When building a compliant environment, Hunter Downey, SAP’s NVP of Architecture and Product Launch, urged session attendees “to build it within a framework that I can ensure controls are in place, so I can rinse and repeat across 20 to 100 different teams, potentially touching 1,000 or 5,000 developers. If you start with the lowest common denominator, you’re going to fail. The reason why we partnered with GCP Assured Workloads is because you’re able to control the flow of information and messages. The minute the data goes global, it’s a different jurisdiction.”

Among other AI-related developments announced at Google Cloud Next ‘24:

  • Gemini for Google Cloud is a new generation of AI assistants for developers, Google Cloud services and applications that help users work and navigate security challenges more effectively.
  • See more announcements here. 

Learn more about how Google Public Sector can help your organization “Kickstart your AI and security journey.”

This article was produced by Scoop News Group and sponsored by Google Public Sector. Google Public Sector is an underwriter of AI Week.

Federal leaders share strategies for mission-critical modernization
https://fedscoop.com/federal-leaders-share-strategies-for-mission-critical-modernization/ | April 2, 2024

Hear from federal and industry leaders who are at the forefront of integrating advanced technologies to elevate the efficiency and effectiveness of government operations, as shown through a series of initiatives across different agencies.

In an era where technology and automation are transforming various sectors, federal agencies are embarking on ambitious modernization efforts to enhance their operations and mission outcomes.

In a new video series, “Breaking Through for Mission Advantage,” federal and industry leaders share their insights on how the strategic shift to integrated, software-defined IT platforms helped or will help agencies achieve greater mission outcomes.

The Defense Logistics Agency, under the guidance of Chief Information Officer Adarryl Roberts, is spearheading a digital business transformation initiative to improve efficiency across its supply chain operations. This initiative includes various projects designed to leverage technology to improve the distribution, disposition and supply of critical materials ranging from subsistence goods to construction equipment. By embracing automation and developing a citizen developer program, the DLA enables operational innovations, directly impacting the speed and efficiency of its services.

Similarly, the Navy’s integration of a software-defined approach into its combat systems, highlighted by Cindy DeCarlo, director of global government education and national security at Cisco, signifies a shift toward more agile and secure defense operations. The approach facilitates quick adaptation to mission requirements and ensures enhanced security by consistently enforcing advanced security controls across the infrastructure.

Furthermore, the IRS’s technological overhaul, as shared by Kaschit Pandya, acting chief information security officer, illustrates the transformative power of IT investments in public service delivery. Modernization efforts have led to significant improvements in taxpayer services, including faster refunds, reduced call wait times, and heightened cybersecurity measures, showcasing the tangible benefits of embracing digital transformation.

At the General Services Administration, Ann Lewis, director of technology transformation services, highlighted the complexities of infrastructure modernization, including the importance of making hard choices about system features and functionalities during the modernization process. She advocated for a platform approach to overcome the siloed nature of government systems, stressing that it enables better scalability, simplifies user experiences and ultimately leads to more sustainable system costs.

In the DOD, Rob Vietmeyer, chief software officer, and Dave McKeown, deputy CIO, talked about software’s transformative power and cybersecurity modernization. Vietmeyer focused on the agility and operational efficiency gained through cloud computing, illustrating the department’s push toward rapid deployment capabilities and a more innovative operational approach. McKeown outlined key initiatives in cloud adoption, DevSecOps and zero trust architecture to combat advanced threats and ensure system resilience. 

The strategic shift toward integrated, software-defined platforms is not just about enhancing operational efficiency; it represents a fundamental rethinking of how federal agencies operate in the digital age, ensuring they remain agile, secure and capable of meeting the nation’s needs.

Other federal and industry leaders also participated in the video series.

This video series was produced by Scoop News Group for FedScoop and sponsored by Cisco.
