Microsoft Archives | FedScoop
https://fedscoop.com/tag/microsoft/

FedScoop delivers up-to-the-minute breaking government tech news and is the government IT community's platform for education and collaboration through news, events, radio and TV. FedScoop engages top leaders from the White House, federal agencies, academia and the tech industry both online and in person to discuss ways technology can improve government, and to exchange best practices and identify how to achieve common goals.

How the ‘third wave’ of AI is transforming government operations
https://fedscoop.com/how-the-third-wave-of-ai-is-transforming-government-operations/
Thu, 23 May 2024 19:30:00 +0000

Microsoft’s William Chappell talks about the progression from ‘bespoke to foundational AI’ and how it drives scientific discovery and enhances cybersecurity across the government.

Over the years, numerous new technologies and tools have transformed the way government work is accomplished. But artificial intelligence is proving to be a real game-changer, revolutionizing the way agencies execute their missions.

During a presentation at AITalks in April 2024, Dr. William Chappell, VP and CTO of Microsoft’s Strategic Missions and Technologies Division, discussed the three waves of AI in government and what the evolution from bespoke to foundational AI means for agency operations.

The three waves of AI

According to Chappell, the first wave of AI in government was characterized by efforts to impart human expertise onto hardware, a painstaking process that relied heavily on scripted instructions.

Then, the advent of GPUs ushered in the second wave, accelerating AI’s capabilities beyond human levels in recognizing events and objects. “However, it was the underpinning hardware advancements that truly propelled AI forward, setting the stage for the third wave,” Chappell said.

The third wave represents a shift toward contextual adaptation, in which AI models possess a broad understanding rather than being tailored to specific applications. According to Chappell, this shift — predicted by the Defense Advanced Research Projects Agency in 2017 — marks a turning point in AI development. “No longer confined to bespoke models, AI now serves as a foundational tool with myriad applications across diverse domains,” he said.

What the third wave looks like for government

In this third wave, AI can perceive, learn, abstract and reason, paving the way for unprecedented advancements in government services. At the forefront of this transformation are foundational models like Microsoft’s Azure Quantum Elements, which enables scientists to accelerate the discovery of new materials with unparalleled speed and accuracy. By combining high-performance computing (HPC) with AI, researchers can sift through vast datasets and pinpoint promising candidates for further exploration.

Chappell pointed to a recent example where Microsoft’s Azure Quantum team joined forces with the Pacific Northwest National Laboratory to build a better battery by “developing new materials on a timescale that we haven’t seen before.”

In a matter of months, the group was able to screen more than 30 million potential materials digitally, narrow them down to 20 candidates, and then develop a revolutionary battery with 70% less lithium. “In the past, this would have taken years to develop,” Chappell said. “Examples like this will help change some of the biggest challenges that we have as a country.”

And the impact of AI in this third wave extends beyond scientific endeavors. Microsoft’s Planetary Computer, for instance, harnesses vast datasets from sources such as NASA, NOAA and the European Space Agency. Through natural language interactions, users can effortlessly navigate and glean insights from mountains of data, revolutionizing how information is accessed and utilized. While the demonstration focused on geospatial applications that make satellite data more valuable, it illustrates how any government agency can find answers within the vast amounts of data it possesses.

In addition, AI’s capabilities extend to cybersecurity, where it has become instrumental in identifying vulnerabilities in code that elude human detection. “That shift has happened over the last six months — and that is a very big deal for the U.S. government,” Chappell said. Initiatives like DARPA’s AI Cyber Challenge, which Microsoft supports, illustrate AI’s power in fortifying cyber defenses, offering a glimpse into a future where AI safeguards critical infrastructure.

As agencies navigate this new world of AI, Chappell says collaboration, experimentation, and ethical stewardship will be the guiding beacons, ensuring AI serves as a force for positive change in government operations. “The era of experimentation beckons, where users are empowered to shape AI’s trajectory and leverage its capabilities to fulfill their missions,” he said.

Learn more about how Microsoft can empower your organization with advanced AI.

This article was produced by Scoop News Group and sponsored by Microsoft.

Labor Department releases principles on AI and workers, with pledges from Microsoft, Indeed
https://fedscoop.com/labor-department-releases-principles-on-ai-and-workers-with-pledges-from-microsoft-indeed/
Fri, 17 May 2024 14:45:41 +0000

The White House says it “welcomes additional commitments” from tech companies on the principles.

The Biden administration this week released a list of principles meant to govern how workers interact with artificial intelligence. The move comes in response to last year’s AI executive order and will be followed by a new list of best practices expected to be published by the Labor Department. 

The principles focus on values like ensuring responsible use of workers’ data, supporting workers who might need to be upskilled because of artificial intelligence, and committing to transparency when deploying AI. The principles appear to be voluntary and follow another set of non-binding commitments focused on artificial intelligence announced last July that included pledges from companies like OpenAI and Anthropic.

“Workers must be at the heart of our nation’s approach to AI technology development and use,” acting Labor Secretary Julie Su said in a statement. “These principles announced [Thursday] reflect the Biden-Harris administration’s belief that, in addition to complying with existing laws, artificial intelligence should also enhance the quality of work and life for all workers. As employers and developers implement these principles, we are determined to create a future where technology serves the needs of people above all.”

Microsoft and Indeed, the online job search platform, have agreed to these principles, according to a press release shared by the White House. The administration seemed to be courting further support for the principles in a post, noting that it “welcomes additional commitments from other technology companies.”

Notably, the White House recently hosted an event with senior officials from the Labor Department focused on the technology’s impact on workers, according to an IBM executive’s post on LinkedIn.

Neither the White House nor the Department of Labor responded to requests for comment. 

How the State Department used AI and machine learning to revolutionize records management
https://fedscoop.com/how-the-state-department-used-ai-and-machine-learning-to-revolutionize-records-management/
Thu, 16 May 2024 19:34:00 +0000

A pilot approach helped the State Department streamline the document declassification process and improve the customer experience for FOIA requestors.

In the digital age, government agencies are grappling with unprecedented volumes of data, presenting challenges in effectively managing, accessing and declassifying information.

The State Department is no exception. According to Eric Stein, deputy assistant secretary for the Office of Global Information Services, the department’s eRecords archive system currently contains more than 4 billion artifacts, which include emails and cable traffic. “The latter is how we communicate to and from our embassies overseas,” Stein said.

Over time, however, department officials must determine what can be released to the public and what stays classified — a time-consuming and labor-intensive process.

Eric Stein, deputy assistant secretary, Office of Global Information Services, U.S. Department of State

The State Department has turned to cutting-edge technologies like artificial intelligence (AI) and machine learning (ML) to find a more efficient solution. Through three pilot projects, the department has successfully streamlined the document review process for declassification and improved the customer experience when it comes to FOIA (Freedom of Information Act) requests.

An ML-driven declassification effort

At the root of the challenge is Executive Order 13526, which requires that classified records of permanent historical value be automatically declassified after 25 years unless a review determines that an exemption applies. For the State Department, cables are among the most historically significant records produced by the agency. However, current processes and resource levels will not work for reviewing electronic records, including classified emails, created in the early 2000s and beyond, jeopardizing declassification reviews starting in 2025.

Recognizing the need for a more efficient process, the department embarked on a declassification review pilot using ML in October 2022. Stein came up with the pilot idea after participating in an AI Federal Leadership Program supported by major cloud providers, including Microsoft.

For the pilot, the department used cables from 1997 and created a review model based on human decisions from 2020 and 2021 concerning cables marked as confidential and secret in 1995 and 1996. The model uses discriminative AI to score and sort cables into three categories: those it was confident should be declassified, those it was confident shouldn’t be declassified, and those that needed manual review.

According to Stein, for the 1997 pilot group of more than 78,000 cables, the model performed the same as human reviewers 97% to 99% of the time and reduced staff hours by at least 60%.
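
The triage Stein describes amounts to confidence thresholding: the model auto-clears only the cables it is highly confident about in either direction and routes everything else to humans. The sketch below illustrates that pattern; the thresholds, data structures and scoring function are illustrative assumptions, not the department's actual implementation.

```python
# Illustrative sketch of confidence-threshold triage (hypothetical, not the
# State Department's pipeline). score_fn stands in for the trained model.
from dataclasses import dataclass
from typing import Callable, Iterable, List, Tuple

@dataclass
class Cable:
    cable_id: str
    text: str

def triage(
    cables: Iterable[Cable],
    score_fn: Callable[[str], float],
    release_threshold: float = 0.98,   # assumed cutoffs, chosen for illustration
    withhold_threshold: float = 0.02,
) -> Tuple[List[Cable], List[Cable], List[Cable]]:
    """Sort cables into three buckets by the model's declassification score.

    score_fn returns the model's estimated probability that a cable can be
    declassified. Anything the model is not highly confident about in either
    direction is routed to human reviewers.
    """
    release, withhold, manual_review = [], [], []
    for cable in cables:
        p = score_fn(cable.text)
        if p >= release_threshold:
            release.append(cable)        # confident: safe to declassify
        elif p <= withhold_threshold:
            withhold.append(cable)       # confident: keep classified
        else:
            manual_review.append(cable)  # uncertain: send to a human reviewer
    return release, withhold, manual_review
```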

“We project [this technology] will lead to millions of dollars in cost avoidance over the next several years because instead of asking for more money for human resources or different tools to help with this, we can use this technology,” Stein explained. “And then we can focus our human resources on the higher-level and analytical thinking and some of the tougher decisions, as opposed to what was a very manual process.”

Turning attention to FOIA

Building on the success of the declassification initiative, the State Department embarked on two other pilots, running from June 2023 to February 2024, to enhance its FOIA processes.

Like cable declassification efforts, handling a FOIA request is a highly manual process. According to Stein, sometimes those requests are a single sentence; others are multiple pages. But no matter the length, a staff member must acknowledge the request, advise whether the department will proceed with it, and then manually search for terms in those requests in different databases to locate the relevant information.

Using the lessons learned from the declassification pilot, Stein said State Department staff realized there was an opportunity to streamline certain parts of the FOIA process by simultaneously searching what was already in the department’s public reading room and in the record holdings.

“If that information is already publicly available, we can let the requester know right away,” Stein said. “And if not, if there are similar searches and reviews that have already been conducted by the agency, we can leverage those existing searches, which would result in a significant savings of staff hours and response time.”
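
In pseudocode terms, the short-circuit Stein describes checks already-public material first, then previously executed searches, before committing staff to a new records search. Below is a minimal, hypothetical sketch; the toy keyword index and all names are invented for illustration and are not the department's systems.

```python
# Hypothetical sketch: satisfy a FOIA request from the cheapest source first.
class KeywordIndex:
    """Toy in-memory index using lowercase keyword matching."""
    def __init__(self, docs):
        self.docs = docs  # {doc_id: text}

    def search(self, query):
        terms = query.lower().split()
        return [doc_id for doc_id, text in self.docs.items()
                if all(t in text.lower() for t in terms)]

def handle_foia_request(query, public_index, prior_searches):
    """Check the public reading room, then prior searches, then give up."""
    hits = public_index.search(query)
    if hits:
        # Point the requester straight at documents already released.
        return {"status": "already_released", "documents": hits}
    for prior_query, results in prior_searches.items():
        # Crude reuse heuristic: an earlier, broader search covers this one.
        if set(query.lower().split()) <= set(prior_query.lower().split()):
            return {"status": "reuse_prior_search", "documents": results}
    return {"status": "new_search_required"}

reading_room = KeywordIndex({"doc-1": "Released cable about trade policy"})
print(handle_foia_request("trade policy", reading_room, {}))
# -> {'status': 'already_released', 'documents': ['doc-1']}
```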

Beyond internal operations, the State Department also sought to improve the customer experience for FOIA requesters by modernizing its public-facing website and search functionalities. Using AI-driven search algorithms and automated request processing, the department aims to “find and direct a customer to existing released documents” and “automate customer engagement early in the request process.”

Lessons learned

Since launching the first pilot in 2022, team members have learned several things. The first is to start small and provide the space and time to become familiar with the technology. “There are always demands and more work to be done, but to have the time to focus and learn is important,” Stein said.

Another lesson is the importance of collaboration. “It’s been helpful to talk across different communities to not only understand how this technology is beneficial but also what concerns are popping up—and discussing those sooner than later,” he said. “The sooner that anyone can start spending some time thinking about AI and machine learning critically, the better.”

Another lesson is to recognize the need to “continuously train a model because you can’t just do this once and then let it go. You have to constantly be reviewing how we’re training the model (in light of) world events and different things,” he said.

These pilots have also shown how this technology will allow State Department staff to better respond to other needs, including FOIA requests. For example, someone may ask for something in a certain way, but that’s not how it’s talked about internally.

“This technology allows us to say, ‘Well, they asked for this, but they may have also meant that,’” Stein said. “So, it allows us to make those connections, which may have been missing in the past.”

The State Department’s strategic adoption of AI and ML technologies in records management and transparency initiatives underscores the transformative potential of these tools. By starting small, fostering collaboration and prioritizing user-centric design, the department has paved the way for broader applications of AI and ML to support more efficient and transparent government operations.

The report was produced by Scoop News Group for FedScoop, as part of a series on innovation in government, underwritten by Microsoft Federal.  To learn more about AI for government from Microsoft, sign up here to receive news and updates on how advanced AI can empower your organization.

Streamlining aid delivery: Lessons from SBA’s digital modernization journey
https://fedscoop.com/streamlining-aid-delivery-lessons-from-sbas-digital-modernization-journey/
Mon, 29 Apr 2024 19:30:00 +0000

How the Small Business Administration’s pivot to a cloud-based CRM platform helped it navigate through the pandemic and transform its approach to customer service.

America’s more than 32 million small businesses play an indispensable role in driving the U.S. economy. Small businesses account for 43.5% of gross domestic product, employ 61.7 million workers and generate payrolls topping $2.9 trillion, according to government data.

In March 2020, as COVID-19 emerged as a global threat, it became apparent that millions of small businesses were headed into economic peril. While White House officials and lawmakers moved with unusual speed to enact the Coronavirus Aid, Relief and Economic Security (CARES) Act, the task of administering financial aid to small businesses suddenly fell on the U.S. Small Business Administration (SBA).

Legacy challenge

As an independent cabinet agency with fewer than 3,000 employees, the SBA had, until then, managed small business loan and grant applications using an email-based processing and approval system involving shared mailboxes built on Microsoft Outlook. The agency’s outdated backend infrastructure had never been designed — and was ill-equipped — to handle the overwhelming volume of relief requests flooding in from all 50 states once the CARES Act was enacted. Inboxes and storage capacities hit their daily caps almost immediately. Customers started to receive “undeliverable” messages. And SBA employees were unable to keep up with the skyrocketing workloads.

Brian Quay, SBA Program Manager

SBA’s leadership quickly recognized what many other public and private sector organizations discovered at the onset of the pandemic — to remain effective in an environment of rapidly escalating and fast-changing needs, they needed to transition quickly from their existing operating systems to a more modern, scalable digital solution.

Transformative solution

SBA officials turned to a cloud-based customer relationship management (CRM) platform, Microsoft Dynamics 365. The platform not only offered the scalability and customization the SBA needed but also allowed it to implement a wide range of integrated features, including email automation, auto-routing, metrics recognition, storage optimization, spam prevention, app integration, and auditing capabilities.

More fundamentally, the shift to a modern CRM platform enabled the SBA to transition from a series of manual, labor-intensive processes to a more efficient, automated system that could quickly scale to the volume SBA needed.

Improved outcomes

Adopting a modern, cloud-based CRM platform not only helped SBA overcome a host of technology bottlenecks but also resulted in significant improvements in the SBA’s internal operations and customer service. The platform:

  • Centralized all customer interactions and attached documents into a single contact record, saving a significant amount of time previously spent verifying that all required documents had been received.
  • Categorized requests and automated routing, resulting in timelier responses and fewer requests left in limbo (a simplified sketch of this kind of routing logic follows this list).
  • Reduced much of the manual work associated with evaluating requests and eliminated common processing errors, enhancing productivity.
  • Allowed SBA staff to more quickly triage cases and review work origins, notes, updates, and activities that had occurred across multiple teams for faster response.
  • Provided customers with an easier way to submit a standardized inquiry using a convenient web form on my.sba.gov, built on Dynamics 365 Portal, rather than typing out an email. Customers can also schedule appointments through a Microsoft Power Pages portal (appointment.sba.gov); appointments are assigned to SBA staff and fulfilled within the Dynamics 365 Customer Service platform.
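
The auto-categorization and routing noted above boils down to rule-driven queue assignment. Here is a deliberately simplified sketch of that idea; the categories, keywords and queue names are invented for the example, and this is not Dynamics 365 code — the platform implements routing through configurable rules rather than hand-written logic.

```python
# Toy categorize-and-route step (illustrative only; all names are invented).
ROUTING_RULES = {
    "loan_status":  ["loan", "status", "disbursement"],
    "appointments": ["appointment", "schedule", "meeting"],
    "documents":    ["upload", "document", "form"],
}
DEFAULT_QUEUE = "general_intake"

def route_request(subject: str, body: str) -> str:
    """Pick the first queue whose keywords appear in the inquiry text."""
    text = f"{subject} {body}".lower()
    for queue, keywords in ROUTING_RULES.items():
        if any(k in text for k in keywords):
            return queue
    return DEFAULT_QUEUE  # nothing matched: send to general triage

assert route_request("Question", "When is my loan disbursement?") == "loan_status"
```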

By making it easier to integrate apps and implement a knowledge base reference library, the modernization effort also allowed the SBA to consolidate information from various sources and streamline the decision-making process. That effort was further enhanced with the creation of a Tier 1 dashboard for individual users and a Tier 2 dashboard for team leads to track overall caseloads, giving SBA staff the ability to make data-driven decisions faster and adapt to changing circumstances.

Mission modernization

Moving to a scalable, cloud-based CRM platform helped the SBA rally quickly in response to the sudden flood of aid requests. It also significantly advanced the SBA’s ability to meet its broader mission of serving and supporting small businesses.

In particular, the new platform made it possible for the SBA to manage activities more effectively with — and gain deeper insights about — more than 11 million individuals in its contact list.

“We can come to the campaign tabs [on the dashboard] and see a list of all of the different campaigns that the SBA has created inside of the platform,” explained SBA Program Manager Brian Quay. The software allows SBA staff to roll up all the cases associated with a contact record and even view image files to validate what information has been provided. It also allows SBA staff to see the status and performance of various marketing campaigns and activities.

“We can see…the number of members that were on [a particular] marketing list, how many messages were successfully sent to them, and failures. This is something that has been a huge productivity gain for SBA [staff], who were previously mainly sending those emails out through Outlook without an ability to track success,” Quay said. Altogether, the platform helped SBA create and send more than 50 million templated outreach emails from February to September 2023.

Another dimension of the SBA’s customer service modernization is the implementation of Power BI dashboards natively embedded into Dynamics 365. This allows executives who aren’t trained to use Dynamics to still access the metrics it provides by leveraging Power BI on the web or their mobile devices.  

Within two and a half years, the SBA expanded the platform from four mailboxes to over 200 individual inboxes, used by close to 80 teams handling an unprecedented volume of activity. According to recent estimates, the platform has tracked more than 20 million cases to date and has generated operational cost savings of over $25 million.

Lessons learned

The SBA’s transition from an email-based tracking system to a cloud-based CRM platform yielded several valuable lessons for federal executives considering a similar transformation:

Firstly, the importance of scalability cannot be overstated. In a crisis situation, the ability to quickly scale up operations is crucial, and a flexible digital platform can make all the difference.

Secondly, customization matters. Tailoring the system to the agency’s unique needs ensures maximum efficiency and usability.

Thirdly, integration capabilities are a game-changer. The ability to connect different tools and data sources creates a unified ecosystem, enabling faster decision-making.

Lastly, automation is a key enabler of efficiency. By automating routine tasks, agencies can focus their efforts on high-impact activities and respond swiftly to emerging challenges.

The Small Business Administration’s journey to digital modernization also demonstrates that in a rapidly evolving world, embracing innovative solutions is not just an option; it is a necessity that empowers organizations to thrive, grow and support those they serve.

The report was produced by Scoop News Group for FedScoop, as part of a series on innovation in government, underwritten by Microsoft Federal.

How cloud modernization transformed OPM cybersecurity operations
https://fedscoop.com/how-cloud-modernization-transformed-opm-cybersecurity-operations/
Tue, 27 Feb 2024 20:27:00 +0000

By shifting to cloud-native solutions, the U.S. Office of Personnel Management has significantly enhanced its underlying security infrastructure to better protect the agency from evolving cyber threats.

Few organizations in the world provide human resource services at the scale of the U.S. Office of Personnel Management (OPM). OPM oversees personnel management services for 2.2 million federal workers — and the retirement benefits for another 2.7 million annuitants, survivors, and family members. Because the agency also manages the federal workforce’s recruiting, hiring, and benefits management, OPM is responsible for handling vast amounts of sensitive data, making it a prime target for cyberattacks. 

Following a massive data breach in 2015, OPM instituted a comprehensive overhaul of its IT and security practices. However, in the years since, it became increasingly clear that without modernizing its underlying IT infrastructure, many of the remedies OPM put in place were becoming outmoded in the face of ever more sophisticated cyberattacks.

That was especially apparent to Guy Cavallo, who arrived at OPM in the fall of 2020 as principal deputy CIO after leading sweeping IT modernization initiatives at the Small Business Administration (SBA) and before that at the Transportation Security Administration (TSA). He was named OPM’s CIO in July 2021.

Recognizing new cyber challenges

“We looked at the on-premises cyber tools that OPM was running since the breach and saw while they were effective, with today’s advancements in AI and cyber capabilities, they weren’t keeping up with the attack vectors we’re facing today,” said Cavallo in a recent interview. Threat actors had shifted to identity-based attacks using more sophisticated tactics, requiring advanced detection and response solutions.

Guy Cavallo, CIO, OPM

“We knew with AI coming and the Executive Order on Cybersecurity requiring logging to get visibility into your environment, investing in on-premises hardware would be a never-ending battle of running out of storage space,” he concluded.

The cloud was “the ideal elastic storage case for that,” he continued. But it also offered other critical solutions. The cloud was the ideal way to host applications to ensure “that we’re always up to date on patching and versions, leaving that to the cloud vendors to take care of — something that the federal government struggles with,” he said.

Checklist for a better solution

Cavallo wanted to avoid the mistake he had seen other organizations make, trying to weave all kinds of tools into an enterprise security blanket. “It’s incredibly difficult to integrate them and not have them attack each other — or also not have gaps between them,” he said. “I’m a believer that simpler is much better than tying together best-of-breed from multiple vendors.”

James Saunders, CISO, OPM

That drove Cavallo and OPM Chief Information Security Officer James Saunders to pursue a fundamental shift to a cloud-native cybersecurity platform and “making that the heart of our security apparatus,” said Saunders.  

After reviewing the options, they elected to move to Microsoft’s Azure cloud-based cybersecurity stack “so that we can take advantage of the edge of cloud, and cloud in general, to collect data logs.” Additionally, it would mean “We didn’t have to worry about software patching and ‘Do I have enough disk space?’ It also allows us to springboard into more advanced capabilities such as artificial intelligence,” Saunders said.

Because OPM exchanges data with many federal agencies that rely on different data systems, Cavallo and Saunders also implemented a cloud access security broker (CASB) — a security policy enforcement engine that monitors and manages security activity across multiple domains from a single location. It also “enables our security analysts to be more efficient and identify threats in a more holistic manner,” Saunders explained.

Added benefits

“There is a general misconception that you can only use cloud tools from the host vendor to monitor and protect that environment.  We found that leveraging cyber defenses that span multiple clouds is a better solution for us instead of having multiple different tools performing the same function,” Cavallo added.

Microsoft’s extensive threat intelligence ecosystem and the ability to reduce the number of contracts OPM has to maintain were also critical factors in their decision to move to Azure, Saunders added.

The pay-off

The migration from on-premises infrastructure to the cloud was a complex process involving the retirement of more than 50 servers and the decommissioning of multiple storage areas and SQL databases, according to Saunders. The most challenging aspect, though, was not the technology but managing the transition with the workforce. Extensive training and organizational change management were as critical as the technical migration to the success of the transition.

According to Saunders, the benefits didn’t take long to recognize:

  • Enhanced visibility: OPM now has a more comprehensive view of its security posture, thanks to the centralized platform and increased log collection.
  • Improved threat detection and response: AI-powered tools and Microsoft’s threat intelligence help OPM identify and respond to threats faster and more effectively.
  • Reduced costs and complexity: Cloud-native solutions eliminate the need for buying expensive on-premises hardware and software, while also simplifying management and maintenance.
  • Increased scalability and agility: The cloud platform allows OPM to easily scale its security infrastructure as needed to meet evolving threats and business requirements.

Collectively, those and related cloud benefits are also helping OPM make faster headway in meeting the administration’s zero-trust security goals.

Lessons learned

Perhaps one of the most important benefits is being able to demonstrate the magnitude and nature of today’s threat landscape to the agency’s leadership and how OPM is much better prepared to defend against it, according to Cavallo.

“When James and I showed them the visibility that we have from all those logs, it was a drop-the-mic moment for them. We can say we blocked 4,000 attacks in the last hour, but until you actually show them a world map and our adversaries trying to get into OPM, then be able to click and show the real details of it — those threats get lost in the noise,” he said.

“My recommendation at the CIO level is, this is a better mousetrap. But you can’t just expect people to flock to it. You have to go show them why it’s a better mousetrap.”

Among the other lessons Cavallo recommends to fellow IT leaders:

  • Focus on simplicity: Choose a single, integrated security platform to avoid the complexity of managing multiple tools.
  • Invest in training: Ensure your staff is trained and familiar with new cloud-native security tools and processes.
  • Start small and scale gradually: Begin with a pilot project and gradually migrate your security infrastructure to the cloud.
  • Communicate effectively: Clearly explain the benefits of cloud-native security to your stakeholders and address any concerns.

This report was produced by Scoop News Group for FedScoop as part of a series on technology innovation in government, underwritten by Microsoft Federal.

How Azure Orbital and the cloud are expanding our worldview
https://fedscoop.com/how-azure-orbital-and-cloud-are-expanding-our-worldview/
Tue, 06 Feb 2024 20:30:00 +0000

A new report highlights how the convergence of space and cloud technologies contributes to a ‘supernova’ of new space-based Earth-observation capabilities — and benefits for federal and commercial enterprises.

The rapid expansion of low Earth orbit satellite constellations, combined with a growing network of ground-based cloud computing centers, has brought space industrialization to a historic inflection point, according to a new report.

A record 2,897 satellites were launched into orbit around the Earth by more than 50 countries last year, according to Jonathan McDowell, an astronomer and astrophysicist known for documenting space activity. An even greater number are expected to be launched in 2024.

All of that contributes to a supernova of new space-based communications and Earth-observation sensor capabilities, says Stephen Kitay, a former Pentagon deputy assistant secretary for space policy, now senior director of Azure Space at Microsoft.

“A huge transformation is happening in space — and the technology that was never there before — effectively extending the internet and edge computing into space,” Kitay said in the report, produced by Scoop News Group and underwritten by Microsoft.

What’s been missing until recently, he says, is a reliable and secure way to manage and transmit the explosive growth of satellite data being collected in space and the means to automate and manage satellite activities more efficiently.

That’s changing as a new era of secure, scalable cloud computing centers strategically located around the globe is developing to stay connected to all those satellites — along with a new generation of software platforms to manage the devices, applications, and data on board all of them, according to the report.

How federal agencies stand to benefit

The report highlights the rise of hybrid space architecture, which Microsoft helped pioneer under the Azure Space banner launched in 2020. The concept involves “bringing cloud and space technologies together to foster a partner ecosystem,” explained Kitay. That effort has spawned a variety of components, including:

  • Azure Orbital Ground Station – designed to give satellite operators, including government customers, the ability to deliver space data with near-zero latency to Microsoft’s global network of Microsoft and partner ground stations.
  • Azure Orbital Cloud Access – enables a seamless cloud experience anywhere on the planet by combining Microsoft Cloud with low latency satellite and 5G communications.
  • Microsoft Planetary Computer – a multi-petabyte catalog of global open geospatial data with intuitive APIs aimed at helping researchers, scientists, students, and organizations worldwide gain valuable insights from Earth observation data.
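
As a concrete example of those “intuitive APIs,” the Planetary Computer catalog is exposed as a STAC endpoint that can be queried with the open-source pystac-client package, with the planetary-computer package handling request signing. A minimal sketch follows; the collection name, bounding box and cloud-cover filter are arbitrary examples, and exact method names can vary across package versions.

```python
# Sketch: querying the Planetary Computer STAC API for mostly cloud-free
# Sentinel-2 scenes over Seattle. Requires pystac-client and
# planetary-computer (pip install pystac-client planetary-computer).
import planetary_computer
from pystac_client import Client

catalog = Client.open(
    "https://planetarycomputer.microsoft.com/api/stac/v1",
    modifier=planetary_computer.sign_inplace,  # signs asset URLs for access
)

search = catalog.search(
    collections=["sentinel-2-l2a"],         # Sentinel-2 Level-2A imagery
    bbox=[-122.46, 47.48, -122.22, 47.73],  # rough Seattle bounding box
    datetime="2024-01-01/2024-03-31",
    query={"eo:cloud_cover": {"lt": 10}},   # under 10% cloud cover
)

for item in search.items():
    print(item.id, item.properties["eo:cloud_cover"])
```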

At the same time, Microsoft is “bringing our code and our software into space by empowering developers to build applications on the ground in the cloud and then seamlessly deploy them on board spacecraft,” Kitay said.

The report also highlights examples of how federal agencies, such as the U.S. Forest Service, the Environmental Protection Agency, the Department of Agriculture and the Defense Department, among others, stand to gain powerful new insights from Earth observation data to better support their missions.

“Removing the barriers to seamless and secure connectivity from ground to orbit creates entirely new opportunities for federal government customers, including those operating in classified environments,” said Zach Kramer, vice president of the Mission Engineering unit at Microsoft.

“Defense and civilian agencies can leverage this ubiquitous connectivity to develop and deploy new applications, gather and transmit data at the speed of relevance, and gain an information advantage to serve the American people.”

Download the full report.

This article was produced by Scoop News Group for FedScoop and underwritten by Microsoft.


Microsoft makes Azure OpenAI service available in government cloud platform
https://fedscoop.com/openai-service-available-government-cloud/
Tue, 06 Feb 2024 14:00:00 +0000

The service went live on Azure Government Tuesday as the company pursues FedRAMP authorization for high-impact data.

Federal agencies that use Microsoft’s Azure Government service now have access to its Azure OpenAI Service through the cloud platform, permitting use of the tech giant’s AI tools in a more regulated environment.

Candice Ling, senior vice president of Microsoft’s federal government business, announced the launch in a Tuesday blog post, highlighting the data safety measures of the service and its potential uses for productivity and innovation. 

“Azure OpenAI in Azure Government enables agencies with stringent security and compliance requirements to utilize this industry-leading generative AI service at the unclassified level,” Ling’s post said.

The announcement comes as the federal government is increasingly experimenting with and adopting AI technologies. Agencies have reported hundreds of use cases for the technology while also crafting their own internal policies and guidance for use of generative AI tools.

Ling also announced that the company is submitting Azure OpenAI for federal cloud services authorizations that, if approved, would allow higher-impact data to be used with the system. 

Microsoft is submitting the service for authorization for FedRAMP’s “high” baseline, which is reserved for cloud systems using high-impact, sensitive, unclassified data like heath care, financial or law enforcement information. It will also submit the system for authorization for the Department of Defense’s Impact Levels 4 and 5, Ling said. Those data classification levels for DOD include controlled unclassified information, non-controlled unclassified information and non-public, unclassified national security system data.

In an interview with FedScoop, a Microsoft executive said the availability of the technology in Azure Government will bring government customers the capabilities expected from GPT-4 — the fourth generation of OpenAI’s large language models — in “a more highly regulated environment.”

The executive said the company received feedback from government customers who were experimenting with smaller models and open source models but wanted to be able to use the technology on more sensitive workloads.

Over 100 agencies have already deployed the technology in the commercial environment, the executive said, “and the majority of those customers are asking for the same capability in Azure Government.” 

Ling underscored data security measures for Azure OpenAI in the blog, calling it “a fundamental aspect” of the service. 

“This includes ensuring that prompts and proprietary data aren’t used to further train the model,” Ling wrote. “While Azure OpenAI Service can use in-house data as allowed by the agency, inputs and outcomes are not made available to Microsoft or others using the service.”

That means embeddings and training data aren’t available to other customers, nor are they used to train other models or used to improve the company’s or third-party services. 

According to Ling’s blog, the technology is already being used in a tool under development at the National Institutes of Health’s National Library of Medicine. In collaboration with the National Cancer Institute, the agency is working on a large language model-based tool, called TrialGPT, that will match patients with clinical trials.

How cloud modernization helps FERC streamline its regulatory processes
https://fedscoop.com/how-cloud-modernization-helps-ferc-streamline-its-regulatory-processes/
Mon, 29 Jan 2024 20:30:00 +0000

A novel tech-challenge approach helped IT leaders at the Federal Energy Regulatory Commission start the overhaul of its legacy applications and improve customer service.

Upgrading and expanding the nation’s electrical grid isn’t just about meeting America’s growing electricity demands; it’s about powering our economy, strengthening national security and ensuring a cleaner future for generations to come. But because so much of that infrastructure lies in private hands, orchestrating that effort requires extraordinary attention.

That is one of the roles of the Federal Energy Regulatory Commission (FERC), an independent agency within the Department of Energy that regulates the interstate transmission of electricity, natural gas, and oil. FERC also reviews and licenses liquified natural gas terminals, hydropower projects, and pipelines.

Ensuring that the companies building and operating power plants, pipelines and transmission lines adhere to safety standards, comply with environmental laws, and abide by market-based pricing guidelines requires an extensive review and approval process. And because FERC relies on approximately 1,570 employees to perform that work, technology plays a critical role in keeping on top of all those entities’ requests.

The challenge: Legacy technology

Michelle Pfeifer, Director of Solutions Delivery and Engineering, FERC.

FERC’s technology systems, however, like those at many federal agencies, have been hard-pressed to keep up with ongoing and emerging demands. Most of the systems used to manage the core applications the agency depends on are more than ten years old and stovepiped, according to Michelle Pfeifer, Director of Solutions Delivery and Engineering.

Among other challenges, the workload management systems used to process and manage filings from regulated entities operate on outdated, customized platforms, leading to inefficiencies in tracking and managing the significant number of filings the agency must handle, said Pfeifer, who joined FERC four years ago. “We have done some updates, but there are a significant number of requests for refresh or modernization that have not been addressed. Additionally, data had to be entered into multiple systems, compounding workload challenges,” she said.

The search for a better solution

FERC’s IT team recognized the solution required more than a technology refresh. So they decided to launch an “application layer modernization program to address pent-up demand from our customers, address the stovepipe nature of multiple applications, and do it more quickly and flexibly through an agile delivery process. And we definitely wanted a cloud-based solution,” she said. “We also were looking at — instead of custom development, which is what we had — going to more of a low-code, no-code solution that gives us more pre-built capability.”

After evaluating a series of vendor demonstrations and completing the acquisition process, FERC’s IT team selected Microsoft’s Power Platform, a set of low-code tools that help create and automate solutions, to modernize the applications.  After conducting an application rationalization review, FERC defined a phased approach to modernize its applications. The first phase, which is complete, developed a Virtual Agenda system that supports the Commission voting process on energy matters.  FERC is now in the second phase, migrating its workload management and hydro project systems.  All the modernized systems operate on Microsoft Azure Government Community Cloud (GCC) environments, according to Pfeifer.

Wholesale improvements 

The first phase of modernization efforts, which went live in August, has already led to improvements for FERC employees, according to Pfeifer.

“The biggest improvement areas were greater integration of the workflows within the new system,” she said. Right away, there was less rekeying of data and fewer manual errors. Another significant improvement was “the automated generation of fields or documents that had previously been done manually,” she explained.

The new layer of automated workflow tracking provides more comprehensive visibility into the status of FERC dockets and reviews, which eventually flow up for final decisions by FERC’s five-member board of commissioners. The new system has replaced and consolidated a separate set of Microsoft SharePoint sites used by the chairman and the commissioners’ staff to track projects in circulation before coming up for Commission decisions.

Externally, as part of future phases, regulated entities will find it easier to submit filings and requests, said Pfeifer. She acknowledged there’s more work to be done to improve the overall user experience for FERC’s customers. However, the cloud-based applications are already improving the agency’s ability to maintain its systems and analyze data associated with Commission proceedings — and they put FERC in a stronger position to leverage AI, said Pfeifer.

Lessons learned

One of the key lessons that helped accelerate FERC’s modernization efforts, according to Pfeifer, was using the acquisition process differently.

“We used some more advanced acquisition techniques — we requested a demo, for instance, as well as did a ‘Tech Challenge’ — which allowed us to see not just a paper document in response to a proposal, but a demo of a solution. That allowed us to work with (different vendors’ teams) to see how they would work together.” The tech challenge also included a tech talent component on top of the demo, “where vendors had to change something (so we could) see how they would go about doing that, what experience they had and what the team was capable of configuring and delivering,” she said.

Another lesson she stressed was the importance of business process mapping and reengineering “so that we could help our customers (define) what they want the processes to do. How do they want the processes to improve? We wanted to model that technically, not model the old processes that they weren’t happy with.”

That would also help the IT team implement the modernization efforts in phases, which was essential to ensuring the transition went smoothly and minimized disruption to FERC’s mission.

Added benefits

While measuring the impact of modernizing and migrating to cloud services isn’t always straightforward, Pfeifer sees a number of operational benefits.

“Just to keep track of the status of things requires a lot of side spreadsheets and reports that aren’t part of the actual workflow (and will be incorporated into the FERC workload processing). Having a more streamlined workflow process also allows the user base to understand the due dates and ensure they’re meeting them, which once required substantial effort from the program offices within the existing applications,” she explained.

“The other area that I see a lot of benefit in is consistency in how things are defined and managed, and handled across the different offices within FERC,” she said — consistency that, in turn, leads to greater accuracy in decision-making.

Finally, Pfeifer sees these back-end improvements laying the foundation for modernizing the agency’s front-end experience for the regulated entities that rely on FERC, in line with the administration’s executive order on transforming the federal customer experience and service delivery.

“Modernization is hard to achieve because you have to replicate the capabilities of the existing systems — and improve on those capabilities at the same time,” concluded Pfeifer. “That said, sometimes the technical solution is the easier part of the solution.”

This report was produced by Scoop News Group for FedScoop as part of a series on technology innovation in government, underwritten by Microsoft Federal.

National Science Foundation rolls out NAIRR pilot with industry, agency support
https://fedscoop.com/nsf-launches-nairr-pilot/
Wed, 24 Jan 2024 16:00:00 +0000

The pilot brings together research resources from multiple federal and industry partners and will serve as a “proof of concept” for the full-scale project, according to NSF.

The National Science Foundation launched a pilot for the National Artificial Intelligence Research Resource on Wednesday, giving U.S.-based researchers and educators unique access to a variety of tools, data, and support to explore the technology.

The pilot for the resource, referred to as the NAIRR, is composed of contributions from 11 federal agencies and 25 private sector partners, including Microsoft, Amazon Web Services, NVIDIA, Intel, and IBM. Those contributions range from use of the Department of Energy’s Summit supercomputer to datasets from NASA and the National Oceanic and Atmospheric Administration to access for models from OpenAI, Anthropic, and Meta.

“A National AI Research Resource, simply put, has the potential to change the trajectory of our country’s approach to AI,” NSF Director Sethuraman Panchanathan told reporters on a call ahead of the launch. “It will lead the way for a healthy, trustworthy U.S. AI ecosystem.”

The idea for a NAIRR has been under discussion for some time as a way to provide researchers with the resources needed to carry out their work on AI, including advanced computing, data, software, and AI models. Supporters say a NAIRR is needed because the computational resources that AI demands aren’t often attainable for prospective academic researchers.

Katie Antypas, director of NSF’s Office of Advanced Cyberinfrastructure, underscored that need on the call with reporters, saying “the pilot is the first step to bridging this gap and will provide access to the research and education community across our country — all 50 states and territories.”

The launch comes ahead of a requirement in President Joe Biden’s Oct. 30 AI executive order for NSF to establish a pilot project for the resource within 90 days. According to an NSF release and accompanying call with reporters, the two-year pilot will serve as a “proof of concept” for the full-scale resource. 

Creating a pilot that would run parallel to a full buildout was among the options the NAIRR Task Force, which was co-chaired by NSF and the Office of Science and Technology Policy, presented in its implementation framework for the resource roughly a year ago. 

The pilot is divided into four focus areas: “NAIRR Open,” which will provide access to resources for AI research on the pilot’s portal; “NAIRR Secure,” an AI privacy- and security-focused component co-led by DOE and the National Institutes of Health; “NAIRR Software,” which will facilitate and explore the interoperable use of pilot resources; and “NAIRR Classroom,” which focuses on education, training, user support, and outreach.

Antypas said anticipated uses of the pilot might include a researcher seeking access to large models to investigate validation and verification or an educator from a community college, rural, or minority-serving institution who’s able to obtain AI resources for the students in their classroom.

When asked how resources are being vetted for the NAIRR, Antypas said there will be a process for datasets that become part of the resource. “We are going to be standing up an external ethics advisory committee to be providing independent advice on, you know, what are those standards? How do we develop those with a pilot?” Antypas said.

Quality of datasets came into focus recently after a Stanford report flagged the existence of child sexual abuse material on a popular AI research dataset known as LAION-5B. FedScoop previously reported that NSF doesn’t know if or how many researchers had used that dataset — it doesn’t track this aspect of principal investigators’ work — but highlighted the need for a NAIRR to provide researchers with trusted resources.

Among the support from industry, Microsoft is contributing $20 million in compute credits for its cloud computing platform Azure, in addition to access to its models, and NVIDIA is contributing $30 million in support, including $24 million in computing access on its DGX platform.

Some contributions are tied to specific uses. OpenAI, for example, will contribute “up to $1 million in credits for model access for research related to AI safety, evaluations, and societal impacts, and up to $250,000 in model access and/or ChatGPT accounts to support applied research and coursework at Historically Black Colleges and Universities and Minority Serving Institutions,” according to information provided by NSF. Anthropic, meanwhile, is providing 10 researchers working on climate change-related projects with API access to its Claude model.

The list of partners could grow as time goes on. Tess deBlanc-Knowles, special assistant to the director for AI in the Office of the Director at NSF, noted on the call with reporters that the pilot came together on “a really ambitious timeline” and said “it’s important to note that this is just the beginning.”

deBlanc-Knowles said NSF hopes to bring on more partners and add more resources after the launch “so that we can serve more researchers, educators, and more places, and start to really make progress towards that bigger vision of the NAIRR of democratizing AI.”

Microsoft’s Brad Smith said AI ‘homework’ from White House helped speed pace of action
https://fedscoop.com/microsoft-ai-white-house-davos/
Thu, 18 Jan 2024 15:11:24 +0000

The tech giant’s vice chair and president complimented White House efforts to see what companies were capable of in terms of AI safety and security during a panel discussion at the World Economic Forum’s annual meeting.

The White House’s engagement with companies on their artificial intelligence capabilities — including giving those partners a “homework” assignment — helped speed up the pace of action on the technology, Microsoft Vice Chair and President Brad Smith said at the World Economic Forum on Wednesday.

When the Biden administration brought four companies, including Microsoft, to the White House in May to discuss AI, it gave those firms “homework assignments” to show what they were prepared to do to address safe, secure, and transparent use of the technology, Smith said on a panel about AI regulation around the world.

Though the assignment was due by the end of the month, Smith recalled that Microsoft was “proud” to have submitted a first draft quickly. The following day, however, the feedback came in.

“We sent it in on Sunday, and on Monday morning I had a call with [White House Office of Science and Technology Policy Director Arati Prabhakar and U.S. Secretary of Commerce Gina Raimondo], and they said, ‘Congratulations, you got it in first. You know what your grade is? Incomplete,’” Smith said. Prabhakar was also on the Wednesday panel in Davos, Switzerland.

The officials, he said, told Microsoft to build upon what they submitted. “And it broke the cycle that often happens when policymakers are saying ‘do this’ and industry is saying ‘that’s not practical.’ And especially for new technology that was evolving so quickly, it actually made it possible to speed up the pace,” Smith said.

Engagement with companies has been a key aspect of the Biden administration’s efforts to develop a U.S. policy for AI use and regulation, including obtaining voluntary commitments from firms that they’ll manage the risks posed by the budding and rapidly growing technology. 

“I don’t think that all of these governments would have gotten as far as they did by December if you hadn’t engaged some of the companies in that way,” Smith said.

Smith’s comment came after Prabhakar addressed the administration’s work with companies on the Wednesday panel, saying that Microsoft and others are on the “leading edge” of the technology. But she also noted that the administration engaged with small companies, civil society, workers, labor unions, and academia.

“I actually think this is an important part of our philosophy of regulation and governance, is not to just do it top-down and sit in our offices and make up answers,” Prabhakar said. “The way effective governance happens is with all those parties at the table.”
