AWS Archives | FedScoop
https://fedscoop.com/tag/aws/

FedScoop delivers up-to-the-minute breaking government tech news and is the government IT community’s platform for education and collaboration through news, events, radio and TV. FedScoop engages top leaders from the White House, federal agencies, academia and the tech industry both online and in person to discuss ways technology can improve government, and to exchange best practices and identify how to achieve common goals.

Experts warn of ‘contradictions’ in Biden administration’s top AI policy documents
https://fedscoop.com/experts-warn-of-contradictions-in-biden-administrations-top-ai-policy-documents/ | Wed, 23 Aug 2023
AI policy specialists say a lack of guidance from the White House on how to square divergent rights-based and risk-based approaches to AI is proving a challenge for companies working to create new products and safeguards.

The Biden administration’s cornerstone artificial intelligence policy documents, released in the past year, are inherently contradictory and provide confusing guidance for tech companies working to develop innovative products and the necessary safeguards around them, leading AI experts have warned.

Speaking with FedScoop, five AI policy experts said adhering to both the White House’s Blueprint for an AI ‘Bill of Rights’ and the AI Risk Management Framework (RMF), published by the National Institute of Standards and Technology, presents an obstacle for companies working to develop responsible AI products.

However, the White House and civil rights groups have pushed back on claims that the two voluntary AI safety frameworks send conflicting messages and have highlighted that they are a productive “starting point” in the absence of congressional action on AI. 

The two policy documents form the foundation of the Biden administration’s approach to regulating artificial intelligence. But for many months, there has been an active debate among AI experts regarding how helpful — or in some cases hindering — the Biden administration’s dual approach to AI policymaking has been.

The White House’s Blueprint for an AI ‘Bill of Rights’ was published last October. It takes a rights-based approach to AI, focusing on broad fundamental human rights as a starting point for the regulation of the technology. That was followed by the risk-based AI RMF in January, which set out to determine the scale and scope of risks related to concrete use cases and recognized threats to instill trustworthiness into the technology.

Speaking with FedScoop, Daniel Castro, a technology policy scholar and vice president at the Information Technology and Innovation Foundation (ITIF), noted that there are “big, major philosophical differences in the approach taken by the two Biden AI policy documents,” which are creating “different [and] at times adverse” outcomes for the industry.

“A lot of companies that want to move forward with AI guidelines and frameworks want to be doing the right thing but they really need more clarity. They will not invest in AI safety if it’s confusing or going to be a wasted effort or if instead of the NIST AI framework they’re pushed towards the AI blueprint,” Castro said.

Castro’s thoughts were echoed by Adam Thierer of the libertarian nonprofit R Street Institute who said that despite a sincere attempt to emphasize democratic values within AI tools, there are “serious issues” with the Biden administration’s handling of AI policy driven by tensions between the two key AI frameworks.

“The Biden administration is trying to see how far it can get away with using their bully pulpit and jawboning tactics to get companies and agencies to follow their AI policies, particularly with the blueprint,” Thierer, senior fellow on the Technology and Innovation team at R Street, told FedScoop.

Two industry sources who spoke with FedScoop but wished to remain anonymous said they felt pushed toward the White House’s AI blueprint over the NIST AI framework in certain instances during meetings on AI policymaking with the White House’s Office of Science and Technology Policy (OSTP).

Rep. Frank Lucas, R-Okla., chair of the House Science, Space and Technology Committee, and House Oversight Chairman Rep. James Comer, R-Ky., have been highly critical of the White House blueprint as it compares to the NIST AI Risk Management Framework, expressing concern earlier this year that the blueprint sends “conflicting messages about U.S. federal AI policy.”

In a letter obtained exclusively by FedScoop, OSTP Director Arati Prabhakar responded to those concerns, arguing that “these documents are not contradictory” and highlighting how closely the White House and NIST are working together on future regulation of the technology.

At the same time, some industry AI experts say the two documents’ definitions of AI clash with one another.

Nicole Foster, who leads global AI and machine learning policy at Amazon Web Services, said chief among the concerns with the documents are diverging definitions of the technology itself. She told FedScoop earlier this year that “there are some inconsistencies between the two documents for sure. I think just at a basic level they don’t even define things like AI in the same way.”

Foster’s thoughts were echoed by Raj Iyer, global head of public sector at cloud software provider ServiceNow and former CIO of the U.S. Army, who believes the two frameworks are a good starting point to get industry engaged in AI policymaking but that they lack clarity.

“I feel like the two frameworks are complementary. But there’s clearly some ambiguity and vagueness in terms of definition,” said Iyer.

“So what does the White House mean by automated systems? Is it autonomous systems? Is it automated decision-making? What is it? I think it’s very clear that they did that to kind of steer away from wanting to have a direct conversation on AI,” Iyer added.

Hodan Omaar, an AI and quantum research scholar working with Castro at ITIF, said the two documents appear to members of the tech industry as if they are on different tracks. According to Omaar, the divergence creates a risk that organizations will simply defer to either the “Bill of Rights” or the NIST RMF and ignore the other.

“There are two things the White House should be doing. First, it should better elucidate the ways the Blueprint should be used in conjunction with the RMF. And second, it should better engage with stakeholders to gather input on how the Blueprint can be improved and better implemented by organizations,” Omaar told FedScoop.

In addition to compatibility concerns about the two documents, experts have raised concerns about the process the White House followed to gather industry feedback while creating them.

Speaking with FedScoop on condition of anonymity in order to speak freely, one industry association AI official said that listening sessions held by the Office of Science and Technology Policy were not productive.

“The Bill of Rights and the development of that, we have quite a bit of concern because businesses were not properly consulted throughout that process,” the association official said. 

The official added: “OSTP’s listening sessions were just not productive or helpful. We tried to actually provide input in ways in which businesses could help them through this process. Sadly, that’s just not what they wanted.”

The AI experts’ comments come as the Biden administration works to establish a regulatory framework that mitigates potential threats posed by the technology while supporting American AI innovation. Last month, the White House secured voluntary commitments from seven leading AI companies about how AI is used, and it is expected to issue a new executive order on AI safety in the coming weeks.

One of the contributors to the White House’s AI Blueprint sympathizes with concerns from industry leaders and AI experts regarding the confusion and complexity of the administration’s approach to AI policymaking. But it’s also an opportunity for companies seeking voluntary AI policymaking guidance to put more effort into asking themselves hard questions, he said.

“So I understand the concerns very much. And I feel the frustration. And I understand people just want clarity. But clarity will only come once you understand the implications, the broader values, discussion and the issues in the context of your own AI creations,” said Suresh Venkatasubramanian, a Brown University professor and former top official within the White House’s OSTP, where he helped co-author its Blueprint for an ‘AI Bill of Rights.’ 

“The goal is not to say: Do every single thing in these frameworks. It’s like, understand the issues, understand the values at play here. Understand the questions you need to be asking from the RMF and the Blueprint, and then make your own decisions,” said Venkatasubramanian.

On top of that, the White House Blueprint co-author wants those who criticize the documents’ perceived contradictions to be more specific in their complaints.

“Tell me a question in the NIST RMF that contradicts a broader goal in the White House blueprint — find one for me, or two or three. I’m not saying this because I think they don’t exist. I’m saying this because if you could come up with these examples, then we could think through what can we do about it?” he said.

Venkatasubramanian added that he feels the White House AI blueprint in particular has faced resistance from industry because “for the first time someone in a position of power came out and said: What about the people?” when it comes to tech innovation and regulations. 

Civil rights groups like the Electronic Privacy Information Center have also joined the greater discussion about AI regulations, pushing back on the notion that industry groups should play any significant role in the policymaking of a rights-based document created by the White House.

“I’m sorry that industry is upset that a policy document is not reflective of their incentives, which is just to make money and take people’s data and make whatever decisions they want to make more contracts. It’s a policy document, they don’t get to write it,” said Ben Winters, the senior counsel at EPIC, where he leads their work on AI and human rights.

Groups like EPIC and a number of others have called upon the Biden administration to take more aggressive steps to protect the public from the potential harms of AI.

“I actually don’t think that the Biden administration has taken a super aggressive role when trying to implement these two frameworks and policies that the administration has set forth. When it comes to using the frameworks for any use of AI within the government or federal contractors or recipients of federal funds, they’re not doing enough in terms of using their bully pulpit and applying pressure. I really don’t think they’re doing too much yet,” said Winters.

Meanwhile, the White House has maintained that the two AI documents were created for different purposes but designed to be used side-by-side as initial voluntary guidance, noting that both OSTP and NIST were involved in the creation of both frameworks.

OSTP spokesperson Subhan Cheema said: “President Biden has been clear that companies have a fundamental responsibility to ensure their products are safe before they are released to the public, and that innovation must not come at the expense of people’s rights and safety. That’s why the administration has moved with urgency to advance responsible innovation that manage[s] the risks posed by AI and seize[s] its promise — including by securing voluntary commitments from seven leading AI companies that will help move us toward AI development that is more safe, secure, and trustworthy.”

“These commitments are a critical step forward and build on the administration’s Blueprint for an AI Bill of Rights and AI Risk Management Framework. The administration is also currently developing an executive order that will ensure the federal government is doing everything in its power to support responsible innovation and protect people’s rights and safety, and will also pursue bipartisan legislation to help America lead the way in responsible innovation,” Cheema added.

NIST did not respond to requests for comment.

Federal IT executive Teresa Carlson takes leadership role with Flexport
https://fedscoop.com/renowned-federal-it-executive-teresa-carlson-takes-executive-leadership-role-with-flexport/ | Thu, 05 Jan 2023
Carlson will lead the supply chain management and logistics firm’s sales, marketing and communications operations, among other responsibilities.

Prominent federal technology executive Teresa Carlson has been appointed to the executive leadership team of multinational supply chain management and logistics firm Flexport as president and chief commercial officer.

At Flexport, Carlson will lead the company’s sales, marketing and communications operations.

On top of that, she will use her experience in the nonprofit space to head the company’s humanitarian aid and sustainability arm, Flexport.org, which works with organizations to deliver aid and better meet their sustainability goals. According to a release, she’ll also work on the company’s expansion into new global markets and verticals.

Carlson, who most recently served as corporate vice president and executive-in-residence under CEO Satya Nadella at Microsoft, will report to Flexport co-CEO Dave Clark.

“Teresa has an impressive track record of scaling businesses globally, and I have seen first-hand her dedication to delivering best-in-class technology solutions for customers around the world,” Clark said in a statement. “As Flexport looks to its next phase of growth, we believe Teresa’s leadership will help us forge new partnerships at a global scale and seize the incredible opportunity to digitally transform the supply chain for multiple industries.”

Carlson has been a mainstay in the federal IT landscape for more than two decades, first making a name for herself as vice president of federal sales and operations in an earlier, 10-year run with Microsoft. From there, she joined Amazon Web Services, building out the cloud service provider’s global public sector business during an 11-year tenure. She also took a role as chief growth officer for Splunk in 2021.

“Flexport has changed the way businesses view supply chain and logistics, and their technology-enabled platform has the power to make a huge impact for so many industries across the globe,” Carlson said. “I’m excited to join the talented Flexport team to grow the business globally and empower current and new customers with our full suite of innovative technology solutions.”

Carlson is a winner of multiple FedScoop 50 awards and an honoree on the publication’s 2018 Top Women in Tech list.

AWS wins $724M contract providing Navy access to commercial cloud environment
https://fedscoop.com/aws-wins-724m-contract-providing-navy-access-to-commercial-cloud-environment/ | Sat, 24 Dec 2022
The contract will allow the Navy access to AWS’ cloud environment.

Amazon Web Services landed a $724 million contract to give the Navy more cloud tools.

“The Blanket Purchase Agreement (BPA) … will provide the Department of the Navy (DON) access to Amazon Web Services’ commercial cloud environment, which can process and store data that meets both Department of Defense (DOD) and DON information assurance policies,” Charlie Spirtos, a Navy spokesperson, said. “This collaboration with AWS will ensure that the Navy’s networks are modernized, secure, and capable of providing our Sailors and Marines with the enterprise network architecture required for mission success.”

A Dec. 19 contract announcement stated that work on the contract will be performed over a maximum of five years, and that funds will be obligated as task orders are issued under a variety of funding types, including operation and maintenance, other procurement and working capital funds.

“We are proud to continue our support for the Department of the Navy and are committed to enabling their critical mission by delivering innovative, efficient, scalable, and secure cloud services,” Liz Martin, director of Defense Department business at AWS, said in a statement.

Recent policies issued by the Navy and DOD require mission owners to migrate from on-premises enterprise data products to commercial cloud environments, on the assumption that commercial providers will be better equipped to deliver them than the government.

A 2020 memo signed by the assistant secretary of the Navy for research, development and acquisition and CIO created a policy for the accelerated promotion and acquisition of cloud services.

The Navy’s information superiority vision mandates that the service modernize, innovate and defend, as well as migrate applications to the cloud.

Pentagon awards AWS, Google, Microsoft and Oracle spots on Joint Warfighting Cloud Capability solicitation
https://fedscoop.com/pentagon-awards-aws-google-microsoft-and-oracle-spots-on-joint-warfighting-cloud-capability-solicitation/ | Thu, 08 Dec 2022
JWCC is designed to replace the troubled Joint Enterprise Defense Infrastructure and serve as the Pentagon’s enterprise cloud capability.

The Department of Defense awarded its highly anticipated enterprise cloud contract to Google, Oracle, Amazon Web Services and Microsoft, according to a contract notice published by the Pentagon Wednesday.

The four companies, which were initially all invited to compete for the Joint Warfighting Cloud Capability in November 2021, won’t be obligated any funds at the time of the award. Rather, the funds will be obligated on individual task orders that each company will compete for. The contract has a ceiling of $9 billion.

JWCC is designed to replace the maligned Joint Enterprise Defense Infrastructure (JEDI) and serve as the department’s enterprise cloud capability. It is being managed by the Defense Information Systems Agency.

Top officials have said it will be essential for future operations and enabling the Pentagon’s new way of fighting, dubbed Joint All-Domain Command and Control (JADC2).

“JADC2 … is utterly reliant on having an enterprise cloud capability that operates in all three security classifications: top secret, secret, [unclassified], from the continental United States all the way out to the tactical edge,” John Sherman, DOD chief information officer, said last month.

Officials have previously explained there will be access to unclassified capabilities once awarded. About 60 days after award there will be access to classified services, and no later than 180 days after award, there will be access to top-secret and tactical edge services.

In the interim, the military services have moved out on their own cloud efforts, with some saying they cannot wait for JWCC to be awarded. These include the Navy’s Black Pearl, the Air Force’s Cloud One and the forthcoming Army Enterprise Application Migration and Modernization (EAMM) contract, which will be a roughly $1 billion multi-award, multi-vendor effort.

Despite that, JWCC will still have its place and won’t be a redundant effort on top of these other cloud initiatives, according to officials leading its development.

“JWCC is meeting specific capability gaps in the areas of having all classification levels — so unclassified, secret and top secret, as well as tactical edge capabilities that work in those denied latency or communication-deprived environments, again, at all classification levels,” Sharon Woods, director of DISA’s Hosting and Compute Center, told reporters in November. “So JWCC will provide those capabilities and more at scale. The services have matured in their cloud journey and delivered high-quality capabilities. And we see them as being complementary, not in competition.”

DISA’s director said the agency hopes that, once some of these cloud efforts have run their course, the services will turn to JWCC.

Inside cloud innovation stories in federal government
https://fedscoop.com/inside-cloud-innovation-stories-federal-government/ | Wed, 29 Jun 2022
Federal executives from U.S. agencies share their experience tapping into the power of cloud computing to enhance public services.

The proof of the power of the cloud is in the enhanced services federal agencies can provide to the public. That’s according to federal leaders from nearly a dozen civilian agencies who joined FedScoop to talk about their success stories, and the challenges they overcame to integrate a growing array of cloud services.

The interview series, Cloud-Driven Innovation in Federal Government, underwritten by AWS, provided a platform for leaders to share their experience of tapping into the power of cloud computing.

Zach Goldstein, chief information officer for the National Oceanic and Atmospheric Administration (NOAA), says that cloud has allowed his agency to “speed up the innovative process” and accelerate the “transformation [of] research to operations.”

One initiative they undertook for their internal users provides access to high-performance computing in the cloud rather than on-premises servers, minimizing the wait time for research teams to test their ideas.
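To make that pattern concrete, here is a minimal sketch, assuming AWS Batch accessed through boto3, of what submitting a research workload to an on-demand cloud queue can look like. The queue name, job definition and resource figures are hypothetical, not NOAA’s actual configuration.

```python
import boto3

# Minimal sketch, assuming AWS Batch via boto3. The queue, job definition and
# resource numbers are illustrative, not NOAA's actual configuration.
batch = boto3.client("batch")

job = batch.submit_job(
    jobName="research-model-run-42",
    jobQueue="research-hpc-queue",        # hypothetical on-demand queue
    jobDefinition="forecast-model:3",     # hypothetical containerized model
    containerOverrides={
        "command": ["python", "run_model.py", "--experiment", "idea-42"],
        "resourceRequirements": [
            {"type": "VCPU", "value": "96"},
            {"type": "MEMORY", "value": "196608"},  # MiB
        ],
    },
)
print("Submitted job:", job["jobId"])
```

Because the cloud service provisions capacity on demand, researchers can queue work whenever they need it rather than waiting for shared on-premises servers to free up.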

Additionally, the agency is improving public access to hurricane tracking data from the National Hurricane Center, Goldstein explains.

“What we did was we placed in front of both our weather.gov set of websites, and the noaa.gov set of websites — which includes National Hurricane Center, among others — we’ve put them behind cloud-based content delivery networks,” he shares. Those sites can get up to a billion hits in 24 hours during a hurricane event, and the new infrastructure has greatly contributed to NOAA’s ability to handle that traffic.
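For readers unfamiliar with the approach, the sketch below shows roughly what fronting an existing origin site with a cloud content delivery network involves. It assumes Amazon CloudFront driven through boto3 and uses a hypothetical origin domain; it is illustrative only, not NOAA’s actual configuration.

```python
import boto3

# Illustrative sketch: place a CDN distribution in front of an existing origin
# site so traffic spikes are absorbed by edge caches instead of the origin.
# The origin domain is hypothetical; this is not NOAA's real configuration.
cloudfront = boto3.client("cloudfront")

response = cloudfront.create_distribution(
    DistributionConfig={
        "CallerReference": "public-site-cdn-001",    # any unique string
        "Comment": "CDN in front of a high-traffic public website",
        "Enabled": True,
        "Origins": {
            "Quantity": 1,
            "Items": [{
                "Id": "origin-site",
                "DomainName": "origin.example.gov",   # hypothetical origin
                "CustomOriginConfig": {
                    "HTTPPort": 80,
                    "HTTPSPort": 443,
                    "OriginProtocolPolicy": "https-only",
                },
            }],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": "origin-site",
            "ViewerProtocolPolicy": "redirect-to-https",
            "ForwardedValues": {"QueryString": False, "Cookies": {"Forward": "none"}},
            "MinTTL": 0,
        },
    }
)
# DNS for the public hostname is then pointed at the distribution's domain.
print(response["Distribution"]["DomainName"])
```

The design choice is straightforward: cached responses are served from edge locations close to users, so a spike of up to a billion requests in a day largely never reaches the origin servers.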

Cloud migration is an important step to modernize aging infrastructure and improve operational productivity, shares Liz Martin, AWS’ director of Department of Defense business. Her team partnered with the U.S. Navy and SAP NS2 to migrate the Navy’s largest SAP enterprise resource planning system, which comprises 72,000 users spread across six U.S. naval commands.

“The milestone, which actually came in 10 months ahead of schedule,…will put the movement of documentation of some $70 billion worth of parts and goods into one accessible space so the information can be shared, analyzed and protected more uniformly. And the reports that were [previously] being run from the legacy system, and took five to six hours to complete, are now taking about 30 minutes,” she explains.

But these moves to the cloud have not been without their challenges. Ron Thompson, chief data officer and deputy digital transformation officer at NASA, shares that his agency needed to work out how to give users access to data without also providing access to internal information or systems.

“The cloud enabled us to actually store all of this archive information in one place — for both internal and external to NASA users — and it really helped us focus those containers in one place versus having two,” he explains.

For many of the leaders, the exercise of moving to the cloud held some valuable lessons. Dwayne Spriggs, service delivery director at the Department of Justice (DOJ), was surprised by the amount of technical debt his organization had accumulated.

“We realized, in working with some of our end-users and customers, that they [had] workarounds they put in place…because the system couldn’t keep up with their business requirements. They had manual processes on top of the automated processes they were using, and that wasn’t captured anywhere. It was just something that was ingrained into institutional knowledge.”

One thing most executives pointed out was that the skills gap to implement cloud modernization initiatives continues to be their primary challenge. That’s why AWS is committing resources to help fill those gaps, shares Dave Levy, VP, U.S. Federal Government, Nonprofit and Healthcare at AWS.

“By 2025, AWS will help 29 million people globally grow their technical skills with free cloud computing skills training. We are investing hundreds of millions of dollars to provide free cloud computing skills training to people from all walks of life, and all levels of knowledge in more than 200 countries and territories around the world,” he shares.

Levy further explains that AWS created a government executive education program for its U.S. federal government partners.

“It’s a four-day MBA style course for government leaders that shares insights from previous government transformations and digital transformation,” he says.

Other participants who shared their experiences in the video series include:

This video series was produced by Scoop News Group for FedScoop and sponsored by AWS.

Fresh Dragon Cloud deployments result in less heavy-lifting for soldiers
https://fedscoop.com/fresh-dragon-cloud-deployments-result-in-less-heavy-lifting-for-soldiers/ | Wed, 25 May 2022
“Minds were blown across the world” during a recent military cloud demonstration, DOD officials said.

With sights set on enabling data-centric warfare, “America’s contingency corps” is running workloads in the cloud that are unlike any the Army has run before.

Over the last few years, the XVIII Airborne Corps built Dragon Cloud, a unified data fabric that operates beyond the range of traditional kinetic warfare capabilities. Development didn’t come without challenges, military officials said Tuesday — but now, they’re working to ensure the cloud-based benefits they’ve achieved are repeatable across the Army and the Department of Defense writ large.

“When you’re a warfighter, you’re doing your daily job in the Army and you’re just grinding, you’re just getting after it. But you know there’s some part of your job that just absolutely sucks and there’s a better way to do it. So, that’s really kind of the starting point for the cloud efforts inside the XVIII Airborne Corps,” Dragon Cloud Program Manager Chief Warrant Officer 4 Brian Masters said at the AWS Summit in Washington.

When a natural disaster or manmade crisis occurs and the U.S. needs a rapid response force to step in, the XVIII Airborne Corps is prepared to deploy anywhere in the world. Headquartered in Fort Bragg, North Carolina, the corps has four major divisions and eight supporting brigades. 

The unit is made up of approximately 92,000 soldiers.

“That’s a large organization to try to make any kind of innovation and digital transformation change,” Masters noted.

Still, in 2020 the group kicked off a weighty effort to become the Army’s data-centric corps that fully applies digital capabilities to harness data as a strategic asset. 

To do so, officials partnered with Amazon Web Services and moved to launch what they deem to be the first enduring tactical cloud presence for the conventional Army — Dragon Cloud. Collaborating with AWS, Army Enterprise Cloud Management Agency, Army Analytics Group and others, the team developed a proof of concept (POC) cloud environment at Impact Level 5 (IL5), which hosts unclassified data and workloads compliant with government regulations.

Building on the success of that POC, the corps then went on to develop and refine an operational cloud environment that can process classified data and workloads at Impact Level 6 (IL6).

During the AWS conference panel, DOD officials repeatedly emphasized the importance of stakeholder buy-in when pursuing technology-centered innovation like this.

Chief Warrant Officer 3 Brian McDonell, senior technical advisor for the corps’ 101st Airborne Division, said one of the toughest elements of this effort involved pushing for a more risk-tolerant culture that allowed for deeper experimentation.

“A couple of Novembers ago, we had this environment all done, all built, and I was ready to test it. We had a month-long field exercise going on, and I said, ‘Okay, we’re going to go out and use all this cloud stuff.’ And I had a bunch of commanders say ‘I don’t want to use it,’ because they were scared. There was no trust in it, you know, this is the first time ever and nobody wants to fail. Even though we were just at our own home station, in our own backyard, and nobody was going to die or get fired if it failed, they were still terrified to use it,” McDonell said.

Despite that hesitation from some senior leaders early on, McDonell said “the nail in the coffin” to get that stakeholder support was to show them the benefits through technology demonstrations, after convincing them it was worth the risk.

“I had a three-star general in Fort Bragg on his Army-issued computer, his unclassified computer. I gave him a URL, and he hit my cloud resource that I deployed, configured and maintained from Fort Campbell, Kentucky — and minds were blown across the world. It was just the craziest thing. Then we started exploring other opportunities. We started trying to connect to organizations over in Europe, and all that buy-in just started to steamroll really fast,” he said.

From there, officials opted to further expand cloud capabilities to host sensitive and secret workloads associated with mission command systems, and enabled them to operate within a globally accessible cloud environment.

“So I had my targeting system at Fort Campbell connected to and sharing data replicated to a target system on-prem over in Germany. And it was real time, no latency. It was great,” McDonell said. “And we’re still doing it today. We’re adding services now, still, to build that environment out.”

As a “server guy in the Army,” McDonell added that the service needs to get to a place where soldiers can stop literally carrying the network on their backs. Cloud-based applications help reduce the burden of all that hardware heavy-lifting. 

“SWaP — size, weight and power — is a metric that we typically use to kind of assess the viability of an option. We’re always trying to decrease our size, our weight and improve our power to make us more agile, more lethal and a better fighting force. By going to the cloud in this hybrid environment — and I say hybrid because we’re always going to have some kind of a hardware piece at the tactical edge for that warfighter — but in hosting everything else in the cloud we are experiencing 60% improvement in that [SWaP],” he said.

As this work continues to unfold, those involved are being deliberate about ensuring that the environment can be available worldwide and ultimately make lasting impacts across the Army and DOD.

“We want to try and build the playbook,” Maj. Bill King, division information systems officer in the XVIII Airborne Corps’ 3rd Infantry Division, said. “We’re going to tell you how to get there, and the best way once you get there to do it the fastest.”

Dragon Cloud marks one of multiple siloed cloud efforts within the military that are evolving as the Pentagon prepares to make awards on its Joint Warfighting Cloud Capability (JWCC) contract and ultimately pave the way for interoperable, enterprise-wide cloud services.

NSA re-awards $10B WildandStormy cloud computing contract to AWS
https://fedscoop.com/nsa-re-awards-10b-wildandstormy-cloud-computing-contract-to-aws/ | Wed, 27 Apr 2022
The award follows an earlier bid protest by Microsoft that was sustained last year by GAO.

The National Security Agency has re-awarded its $10 billion WildandStormy cloud computing contract to Amazon Web Services.

The contract award comes after the Government Accountability Office in December last year found that the agency had improperly assessed technical proposals from Microsoft and recommended that it reassess the procurement.

In a statement, an NSA spokesperson said: “NSA recently awarded a contract to Amazon Web Services that delivers cloud computing services to support the Agency’s mission. This contract is a continuation of NSA’s Hybrid Compute Initiative to modernize and address the robust processing and analytical requirements of the Agency.”

They added: “The same cloud services were competed last year and the previously awarded contract was protested to the Government Accountability Office (GAO). The GAO sustained that protest in October 2021. Consistent with the decision in that case, the Agency has reevaluated the proposals and made a new best value decision.”

The NSA’s Hybrid Compute Initiative is understood to be a program run by the security agency to assess what sensitive national security data can be stored in commercial cloud infrastructure.

In its prior bid protest decision, GAO found it was unreasonable for NSA to count as a weakness for Microsoft the role of the Defense Information Systems Agency (DISA) as an approving authority gateway for the top secret and unclassified services that would be provided under WildandStormy.

According to the watchdog’s opinion, NSA objected to DISA’s role as an authorizing agency because it had concerns that services on Microsoft’s cloud environment may not be processed according to NSA priorities.

GAO at the time also found that NSA’s assessment of cloud network latency calculations unfairly benefitted Amazon Web Services. Despite sustaining this element of the protest, the watchdog denied Microsoft’s argument that NSA unreasonably evaluated offerors’ management proposals. It also dismissed Microsoft’s protest that the agency failed to evaluate price proposals on a common basis, saying that this was an untimely challenge.

An AWS spokesperson said: “We’re honored that after thorough review, the NSA selected AWS as the cloud provider for the Hybrid Compute Initiative, and we’re ready to help deliver this critical national security capability.”

AWS DOD exec on the military’s use of edge compute to enhance mission outcomes
https://fedscoop.com/aws-dod-exec-on-the-militarys-use-of-edge-compute-to-enhance-mission-outcomes/ | Mon, 07 Feb 2022
Cedric George on how military and defense agencies are using cutting-edge cloud technologies to increase speed to mission.

Cedric George is Director for DOD Strategic Business Development at Amazon Web Services (AWS). In his role, he develops go-to-market strategies to deliver mission value to defense agencies. As a retired Air Force major general, he understands the value of building long-term trusted relationships with senior military and civilian executives to achieve common goals centered around mission readiness.

FedScoop caught up with George recently to get his current take on how defense agencies are using cutting-edge cloud technologies to increase speed to mission.

FedScoop: Edge computing has been an important capability for the Department of Defense. Could you share a recent example of how it has worked in practice?

[Photo: Cedric George, Director, DOD Strategic Business Development, AWS]

George: Edge computing gives our warfighters access to innovative and secure solutions at the tactical edge – whether on land, in air or at sea. A good example of how it works took place during a technical demonstration known as “On-Ramp 4,” which was conducted to test edge computing capabilities for the Air Force’s Advanced Battle Management System (ABMS). AWS was proud to be invited to Germany last year to participate in the On-Ramp 4 exercise, organized by the Air Force Chief Architect Integration Office. The goal of the On-Ramp 4 forum was to serve as a roadmap, building off of prior on-ramps, to test the ability to integrate capabilities from vendors, U.S. military organizations and partner nations working together to deploy a tactical edge network — or nodes — solution that leverages a highly resilient network of connections and communications systems.

The results demonstrated capabilities such as DevSecOps, the deployment of artificial intelligence and machine learning applications and Kubernetes clusters at the edge, and the ability to move deployed code from unclassified to classified networks.

FedScoop: What are some of the broader lessons that the defense community might take away from these on-ramp exercises?

George: In this case, the Air Force was able to show how it is possible to lay the foundations of an “Internet of Military Things” (IoMT), which is called out in the ABMS request for proposal and statement of work. In essence, it enabled data sharing across the Department of Defense, using established open digital architecture and standards. It also demonstrated the ability to transmit data over both classified and unclassified networks.

As Preston Dunlap, chief architect for the Air and Space Forces, told reporters, “It pushes the [integration] ball pretty dramatically forward.”

In fact, the Air Force recently expanded its ABMS on-ramp initiatives, testing capabilities to detect and defeat attacks on U.S. operations in space in addition to countering attempted cruise missile strikes against the homeland.

It is critical that the DOD continue to test these capabilities to improve how they work with commercial entities, like AWS.

FedScoop: From your perspective at AWS, how did these on-ramp exercises and cloud capabilities accelerate “speed to value” — or, in the military’s case, “speed to mission?”

George: These on-ramp exercises are all about achieving the one mission outcome most critical to combatant commanders: accelerate speed to mission need.

Through these exercises, the DOD has been able to demonstrate the ability to set up a scenario and test cloud-capabilities in a virtual environment while engaging multiple vendors, U.S. military organizations, and our partners to stress test what will happen in different scenarios. From there, each organization can take the lessons learned to adapt their tools or iterate relatively quickly.

The exercises also give our military leaders the ability to move faster and more effectively by experimenting using cloud capabilities. At the end of the day, it’s about delivering mission relevant capabilities to the field at the speed of mission needs. At AWS, we believe our DOD customers deserve access to the same AWS cloud technologies that are driving innovation across the private sector. We owe that to our service members and their families.

FedScoop: Where else are you seeing the cloud helping the Defense Department innovate faster?

George: Another good example came from the U.S. Navy. As crew members aboard the USS Theodore Roosevelt started becoming sick at the beginning of the pandemic, the U.S. Navy deployed a COVID-19 health tracking application across select aircraft carriers and amphibious assault ships to collect data on sailors’ temperatures and symptoms. Using an edge service called AWS Snowball, they stood up the app within five weeks and ran it across eight ships, showing the agility and the speed of mission need we’re talking about achieving.

The readiness of the USS Theodore Roosevelt and its crew is vitally important to not only the U.S. Navy but for our nation. By ensuring the health of those sailors, the Navy was able to continue conducting their mission across the globe.
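As a rough illustration of the pattern George describes, collecting data on a local edge device so it remains available even with limited ship-to-shore connectivity, the sketch below writes a screening record to the S3-compatible endpoint an AWS Snowball Edge device exposes on a local network. The endpoint address, bucket name and record layout are hypothetical; this is not the Navy’s actual application.

```python
import boto3

# Rough illustration only, not the Navy's actual application: store a health
# screening record on the S3-compatible endpoint that an AWS Snowball Edge
# device exposes on the local (shipboard) network. The endpoint, bucket and
# record layout below are hypothetical.
s3 = boto3.client(
    "s3",
    endpoint_url="https://192.0.2.10:8443",  # hypothetical on-device endpoint
    verify=False,                            # device presents its own certificate
)

record = b'{"crew_id": "A123", "temp_f": 98.7, "symptoms": []}'
s3.put_object(
    Bucket="health-screening",               # bucket created on the device
    Key="reports/2020-04-01/A123.json",
    Body=record,
)
print("Record stored locally on the edge device.")
```

Because the data lands on hardware aboard the ship, the application keeps working without a reliable satellite link, and records can be synchronized to a cloud region when connectivity allows.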

FedScoop: Where are you seeing the biggest opportunities for the Defense Department to innovate faster? And what should Defense officials keep their eye on as cloud and edge capabilities continue to evolve?

George: We are seeing progress and opportunity in the area of emerging technologies. Military leaders are really looking to take advantage of programs around artificial intelligence (AI) and machine learning (ML). And you see this appetite for modern technology with the examples I mentioned earlier.

At AWS, we believe that cloud can enable customers of all shapes and sizes — including governments, defense agencies and other public sector organizations — to scale up and down quickly and seamlessly and take advantage of innovative technology. From computer vision systems for autonomous driving, to FDA-approved medical imaging, AI is certainly driving public sector innovation.

One of the things we’ve been hearing people say is that we are in the “golden age of machine learning.” What I would say is the algorithms have been in existence for a while. But the military and the government are seeing that ML is becoming more accessible because organizations have access to very large datasets as well as lots of computational capabilities that weren’t previously available.

We’re really proud of the ability to work alongside our fantastic government customers and help unlock this potential.

At AWS, we feel strongly that our defense, intelligence and national security communities deserve access to the best technology in the world.

One truth that is not lost on me is that, long after you and I go to bed tonight — and I promise you long before we wake up in the morning — our service members will be accomplishing our difficult missions across the globe.

AWS is committed to the missions that protect our national interests, and we will not rest until we bring the best available cloud-enabling technology to our service members who serve in harm’s way.

Learn how AWS can help your agency capitalize on today’s cloud or contact us at AWS Public Sector.

 Read more insights from AWS leaders on how agencies are using the power of the cloud to innovate.
