raj iyer Archives | FedScoop
https://fedscoop.com/tag/raj-iyer/

FedScoop delivers up-to-the-minute breaking government tech news and is the government IT community's platform for education and collaboration through news, events, radio and TV. FedScoop engages top leaders from the White House, federal agencies, academia and the tech industry both online and in person to discuss ways technology can improve government, and to exchange best practices and identify how to achieve common goals.

Experts warn of ‘contradictions’ in Biden administration’s top AI policy documents
https://fedscoop.com/experts-warn-of-contradictions-in-biden-administrations-top-ai-policy-documents/ | Wed, 23 Aug 2023
AI policy specialists say a lack of guidance from the White House on how to square divergent rights-based and risk-based approaches to AI is proving a challenge for companies working to create new products and safeguards.

The Biden administration’s cornerstone artificial intelligence policy documents, released in the past year, are inherently contradictory and provide confusing guidance for tech companies working to develop innovative products and the necessary safeguards around them, leading AI experts have warned.

Speaking with FedScoop, five AI policy experts said adhering to both the White House’s Blueprint for an AI ‘Bill of Rights’ and the AI Risk Management Framework (RMF), published by the National Institute of Standards and Technology, presents an obstacle for companies working to develop responsible AI products.

However, the White House and civil rights groups have pushed back on claims that the two voluntary AI safety frameworks send conflicting messages and have highlighted that they are a productive “starting point” in the absence of congressional action on AI. 

The two policy documents form the foundation of the Biden administration’s approach to regulating artificial intelligence. But for many months, there has been an active debate among AI experts regarding how helpful — or in some cases hindering — the Biden administration’s dual approach to AI policymaking has been.

The White House’s Blueprint for an AI ‘Bill of Rights’ was published last October. It takes a rights-based approach to AI, using broad fundamental human rights as the starting point for regulating the technology. That was followed in January by the risk-based AI RMF, which sets out to determine the scale and scope of risks tied to concrete use cases and recognized threats, with the aim of instilling trustworthiness in the technology.

Speaking with FedScoop, Daniel Castro, a technology policy scholar and vice president at the Information Technology and Innovation Foundation (ITIF), noted that there are “big, major philosophical differences in the approach taken by the two Biden AI policy documents,” which are creating “different [and] at times adverse” outcomes for the industry.

“A lot of companies that want to move forward with AI guidelines and frameworks want to be doing the right thing but they really need more clarity. They will not invest in AI safety if it’s confusing or going to be a wasted effort or if instead of the NIST AI framework they’re pushed towards the AI blueprint,” Castro said.

Castro’s thoughts were echoed by Adam Thierer of the libertarian nonprofit R Street Institute who said that despite a sincere attempt to emphasize democratic values within AI tools, there are “serious issues” with the Biden administration’s handling of AI policy driven by tensions between the two key AI frameworks.

“The Biden administration is trying to see how far it can get away with using their bully pulpit and jawboning tactics to get companies and agencies to follow their AI policies, particularly with the blueprint,” Thierer, senior fellow on the Technology and Innovation team at R Street, told FedScoop.

Two industry sources who spoke with FedScoop but wished to remain anonymous said they felt pushed toward the White House’s AI blueprint over the NIST AI framework in certain instances during meetings on AI policymaking with the White House’s Office of Science and Technology Policy (OSTP).

Rep. Frank Lucas, R-Okla., chair of the House Science, Space and Technology Committee, and House Oversight Chairman Rep. James Comer, R-Ky., have been highly critical of the White House blueprint as it compares to the NIST AI Risk Management Framework, expressing concern earlier this year that the blueprint sends “conflicting messages about U.S. federal AI policy.”

In a letter obtained exclusively by FedScoop, OSTP Director Arati Prabhakar responded to those concerns, arguing that “these documents are not contradictory” and highlighting how closely the White House and NIST are working together on future regulation of the technology.

At the same time, some industry AI experts say the two documents’ definitions of AI clash with one another.

Nicole Foster, who leads global AI and machine learning policy at Amazon Web Services, said chief among the concerns with the documents are diverging definitions of the technology itself. She told FedScoop earlier this year that “there are some inconsistencies between the two documents for sure. I think just at a basic level they don’t even define things like AI in the same way.”

Foster’s thoughts were echoed by Raj Iyer, global head of public sector at cloud software provider ServiceNow and former CIO of the U.S. Army, who believes the two frameworks are a good starting point to get industry engaged in AI policymaking but that they lack clarity.

“I feel like the two frameworks are complementary. But there’s clearly some ambiguity and vagueness in terms of definition,” said Iyer.

“So what does the White House mean by automated systems? Is it autonomous systems? Is it automated decision-making? What is it? I think it’s very clear that they did that to kind of steer away from wanting to have a direct conversation on AI,” Iyer added.

Hodan Omaar, an AI and quantum research scholar working with Castro at ITIF, said the two documents appear to members of the tech industry as if they are on different tracks. According to Omaar, the divergence creates a risk that organizations will simply defer to either the “Bill of Rights” or the NIST RMF and ignore the other.

“There are two things the White House should be doing. First, it should better elucidate the ways the Blueprint should be used in conjunction with the RMF. And second, it should better engage with stakeholders to gather input on how the Blueprint can be improved and better implemented by organizations,” Omaar told FedScoop.
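
One way an organization could use the documents “in conjunction,” as Omaar suggests, is to treat the RMF’s four core functions (Govern, Map, Measure, Manage) as a review loop and attach the Blueprint’s five principles to each step as a checklist. The pairing below is purely illustrative; neither NIST nor the White House publishes such a crosswalk, so it should be read as a hypothetical sketch rather than official guidance.

    # Illustrative only: a hypothetical crosswalk between the NIST AI RMF core functions
    # and the Blueprint for an AI Bill of Rights principles. The groupings are invented
    # for this sketch and are not an official mapping from either document.
    RMF_TO_BLUEPRINT = {
        "Govern":  ["Data Privacy", "Human Alternatives, Consideration, and Fallback"],
        "Map":     ["Safe and Effective Systems", "Algorithmic Discrimination Protections"],
        "Measure": ["Algorithmic Discrimination Protections", "Notice and Explanation"],
        "Manage":  ["Safe and Effective Systems", "Data Privacy"],
    }

    def review_checklist(rmf_function: str) -> list[str]:
        """Return the Blueprint principles a team might revisit for a given RMF function."""
        return RMF_TO_BLUEPRINT.get(rmf_function, [])

    for function, principles in RMF_TO_BLUEPRINT.items():
        print(f"{function}: consider {', '.join(principles)}")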

In addition to compatibility concerns about the two documents, experts have also raised concerns about the process followed by the White House to take industry feedback in creating the documents.

Speaking with FedScoop anonymously in order to speak freely, one industry association AI official said that listening sessions held by the Office of Science and Technology Policy were not productive.

“The Bill of Rights and the development of that, we have quite a bit of concern because businesses were not properly consulted throughout that process,” the association official said. 

The official added: “OSTP’s listening sessions were just not productive or helpful. We tried to actually provide input in ways in which businesses could help them through this process. Sadly, that’s just not what they wanted.”

The AI experts’ comments come as the Biden administration works to establish a regulatory framework that mitigates potential threats posed by the technology while supporting American AI innovation. Last month, the White House secured voluntary commitments from seven leading AI companies about how AI is used, and it is expected to issue a new executive order on AI safety in the coming weeks.

One of the contributors to the White House’s AI Blueprint sympathizes with concerns from industry leaders and AI experts regarding the confusion and complexity of the administration’s approach to AI policymaking. But the situation is also an opportunity, he said, for companies seeking voluntary AI policymaking guidance to put more effort into asking themselves hard questions.

“So I understand the concerns very much. And I feel the frustration. And I understand people just want clarity. But clarity will only come once you understand the implications, the broader values, discussion and the issues in the context of your own AI creations,” said Suresh Venkatasubramanian, a Brown University professor and former top official within the White House’s OSTP, where he helped co-author its Blueprint for an ‘AI Bill of Rights.’ 

“The goal is not to say: Do every single thing in these frameworks. It’s like, understand the issues, understand the values at play here. Understand the questions you need to be asking from the RMF and the Blueprint, and then make your own decisions,” said Venkatasubramanian.

On top of that, the White House Blueprint co-author wants those who criticize the documents’ perceived contradictions to be more specific in their complaints.

“Tell me a question in the NIST RMF that contradicts a broader goal in the White House blueprint — find one for me, or two or three. I’m not saying this because I think they don’t exist. I’m saying this because if you could come up with these examples, then we could think through what can we do about it?” he said.

Venkatasubramanian added that he feels the White House AI blueprint in particular has faced resistance from industry because “for the first time someone in a position of power came out and said: What about the people?” when it comes to tech innovation and regulations. 

Civil rights groups like the Electronic Privacy Information Center have also joined the greater discussion about AI regulations, pushing back on the notion that industry groups should play any significant role in the policymaking of a rights-based document created by the White House.

“I’m sorry that industry is upset that a policy document is not reflective of their incentives, which is just to make money and take people’s data and make whatever decisions they want to make more contracts. It’s a policy document, they don’t get to write it,” said Ben Winters, the senior counsel at EPIC, where he leads their work on AI and human rights.

Groups like EPIC and a number of others have called upon the Biden administration to take more aggressive steps to protect the public from the potential harms of AI.

“I actually don’t think that the Biden administration has taken a super aggressive role when trying to implement these two frameworks and policies that the administration has set forth. When it comes to using the frameworks for any use of AI within the government or federal contractors or recipients of federal funds, they’re not doing enough in terms of using their bully pulpit and applying pressure. I really don’t think they’re doing too much yet,” said Winters.

Meanwhile, the White House has maintained that the two AI documents were created for different purposes but designed to be used side-by-side as initial voluntary guidance, noting that both OSTP and NIST were involved in the creation of both frameworks.

OSTP spokesperson Subhan Cheema said: “President Biden has been clear that companies have a fundamental responsibility to ensure their products are safe before they are released to the public, and that innovation must not come at the expense of people’s rights and safety. That’s why the administration has moved with urgency to advance responsible innovation that manage the risks posed by AI and seize its promise — including by securing voluntary commitments from seven leading AI companies that will help move us toward AI development that is more safe, secure, and trustworthy.”

“These commitments are a critical step forward and build on the administration’s Blueprint for an AI Bill of Rights and AI Risk Management Framework. The administration is also currently developing an executive order that will ensure the federal government is doing everything in its power to support responsible innovation and protect people’s rights and safety, and will also pursue bipartisan legislation to help America lead the way in responsible innovation,” Cheema added.

NIST did not respond to requests for comment.

When automation out-delivers IT modernization
https://fedscoop.com/when-automation-out-delivers-it-modernization/ | Wed, 15 Feb 2023
Government leaders report automation has delivered large-scale service improvements faster and at lower cost than big-ticket IT modernization projects.

Government leaders from a growing roster of federal and state agencies are realizing significant benefits from enterprise automation to drive business transformation without having to endure the lumbering pace and high cost of IT modernization projects, according to a new report.

“Automation enables [the U.S.] Army to create new capabilities in legacy systems without investing resources into changing the underlying system,” said Raj G. Iyer, former CIO of the U.S. Department of the Army. Iyer was one of several government officials cited in the report, produced by Scoop News Group for FedScoop and sponsored by UiPath, who detailed how automation is making a significant difference in their organizations.

Iyer, who stepped down from his position at the end of last month, explained how the Assistant Secretary of the Army (Financial Management and Comptroller) office recently completed a pilot program where robotic process automation (RPA) expedited the handling of unmatched financial transactions. The ASA (FM&C) office handles more than one million such transactions per year, according to Iyer. “RPA is expected to save millions of dollars in manual labor each year,” he said.
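
The report does not spell out how the bots work. As a rough, hypothetical illustration, the core of such an RPA workflow is often a rules-based matcher that pairs ledger entries with payment records and sets aside the leftovers for human analysts; the Python sketch below uses invented field names and a made-up matching rule, not the ASA (FM&C) office's actual logic.

    # Hypothetical sketch of the rules-based matching an RPA bot might automate.
    # Field names and the matching rule are invented for illustration.
    from collections import defaultdict

    def match_transactions(ledger_entries, payment_records):
        """Pair ledger entries with payment records on (document number, amount)."""
        index = defaultdict(list)
        for record in payment_records:
            index[(record["doc_num"], record["amount"])].append(record)

        matched, unmatched = [], []
        for entry in ledger_entries:
            candidates = index[(entry["doc_num"], entry["amount"])]
            if candidates:
                matched.append((entry, candidates.pop()))
            else:
                unmatched.append(entry)  # left for a human analyst to research
        return matched, unmatched

    ledger = [{"doc_num": "W91-0001", "amount": 1250.00}, {"doc_num": "W91-0002", "amount": 310.50}]
    payments = [{"doc_num": "W91-0001", "amount": 1250.00}]
    matched, unmatched = match_transactions(ledger, payments)
    print(len(matched), "matched;", len(unmatched), "unmatched")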

Read the full report.

“I think that automation — specifically using bots — is really starting to take off and provide value to businesses,” added Krista Kinnard, chief of emerging technology in the CIO’s office at the U.S. Department of Labor.

Kinnard and others explained that automation isn’t just speeding up workflows but boosting productivity and improving agency services faster, at lower costs and with less risk than big-ticket IT modernization projects.

“Automation is moving from the edges, all the way inside into the enterprise. That’s a big change,” observed Sunil Madhugiri, chief technology officer at U.S. Customs and Border Protection, where approximately 250 automation “bots” are in production or under development, according to the report. Madhugiri highlighted one instance where automation helped CBP work with international airlines to notify some 239,000 travelers and prevent them from boarding U.S.-bound flights under pandemic-era COVID-19 travel restrictions.

The report highlights how automation can effectively “operationalize” mission and business processes at federal agencies and deliver cost savings and service improvements that often prove elusive in IT modernization overhauls.

“Modernization has become synonymous with big, ‘rip-and-replace’ efforts, involving new systems, long-term physical transformations that are costly in technology, change management, workforce, opportunity cost, and time to value,” noted Mike Daniels, senior vice president, public sector at UiPath. “[Government agencies] have made huge investments to forklift systems to the cloud. But what’s gotten lost in that process is the need to examine whether those efforts drive a result quicker, faster or better.”

Todd Schroeder, a former chief of digital services at the U.S. Department of Agriculture who now serves at UiPath as public sector vice president, adds that automation platforms not only bring the power of scale to the work agency employees need to get done but can also address process pain points quickly. He cites in the report how the New York Department of Labor, despite a 10-fold increase in temporary staff, couldn’t keep up with demand for unemployment claims during the pandemic — and how deploying UiPath tools not only cut through the backlog but later helped save New York an estimated $12 billion in potential fraud.

Read the full report on how automation is helping government agencies improve mission services.

This article was produced by Scoop News Group for FedScoop and sponsored by UiPath.

Government leaders tout big wins for their missions with AI, ML and cloud tools
https://fedscoop.com/government-leaders-tout-big-wins-for-mission-ai-ml-cloud/ | Wed, 14 Dec 2022
Executives from the U.S. Army, U.S. Postal Service and the State of New York highlight IT modernization initiatives at Google Government Summit.

Public sector organizations are making big strides supporting their missions by applying artificial intelligence, machine learning, analytics, security and collaboration tools to their initiatives.

That’s according to government executives from the U.S. Army, U.S. Postal Service and the State of New York who joined Google leaders on stage for the opening keynote at the Google Government Summit in Washington, D.C. on November 15.

From both a warfighter perspective and a user experience perspective, the U.S. Army “needs data for decision-making at the point of need” with “the right tools to get the job done” across a diverse set of working conditions, explained Dr. Raj Iyer, CIO for the U.S. Department of the Army.

During the event, Dr. Iyer shared that Google Workspace will be provisioned for 250,000 soldiers in the U.S. Army. The first 160,000 users migrated to Google Workspace in just two weeks, with plans for the remaining personnel to be up and running by mid-2023. Google Workspace was designed to be deployed quickly to soldiers across a variety of locations, jobs and skill levels.

Thomas Kurian, CEO for Google Cloud, also took the stage and expressed Google’s “deep commitment” to providing products and solutions that are mature, compliant and meet government’s mission goals.

“In the last four years, we’ve really heightened our work for the government…in the breadth of our products that are focused as solutions, and significantly ramped up our compliance certifications to serve agencies more fully. And we culminated that by launching Google Public Sector, the only division that Google has in the whole company dedicated to a single industry,” Kurian explained.

Though cloud was once viewed mainly as a source of economical, elastic compute, what makes Google Cloud competitive against other providers is its ability to offer solutions for different needs as the nature of cloud computing evolves, said Kurian.

“Organizations want to get smarter to make decisions, combining both structured and unstructured data. And they want to be able to do analysis no matter where the data sits — whether it’s in our cloud or other clouds. We are the only cloud that lets you link data and analyze it across multiple clouds, structured and unstructured, without moving a single piece of data.”

Cybersecurity was also a key concern raised during the keynote, namely the need to simplify security analysis tools so cyber experts can detect threats faster.

“Protecting governments isn’t just something for extraordinary times. The business of government requires constant vigilance,” said Karen Dahut, CEO for Google Public Sector, the company’s independent division focused solely on the needs of federal, state and local government and the education sector.

She cited the success of the New York City Cyber Command, which works across city government to detect and prevent cyber threats. The organization is accomplishing this “by building a highly secure and scalable data pipeline on Google Cloud so their cyber security experts can detect threats faster.”
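
The article does not describe the pipeline's internals. As a generic illustration of the pattern, a security data pipeline typically filters a stream of log events down to the small subset worth an analyst's attention; the minimal Python sketch below uses invented indicators and is not NYC Cyber Command's architecture.

    # Generic, minimal sketch of streaming log triage; indicators and fields are invented
    # and this is not a description of NYC Cyber Command's actual pipeline.
    KNOWN_BAD_IPS = {"203.0.113.7", "198.51.100.22"}  # RFC 5737 documentation addresses

    def triage(events):
        """Yield only the events an analyst should look at first, tagged with a severity."""
        for event in events:
            if event.get("src_ip") in KNOWN_BAD_IPS or event.get("failed_logins", 0) > 10:
                yield {**event, "severity": "high"}

    sample = [
        {"src_ip": "203.0.113.7", "user": "svc-web"},
        {"src_ip": "192.0.2.10", "user": "jdoe", "failed_logins": 3},
    ]
    for alert in triage(sample):
        print(alert)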

Google has also strengthened its ability to help customers access data on known threats with its recent acquisition of Mandiant. Kevin Mandia, CEO and director for Mandiant, now a part of Google Cloud, took the stage to explain how the company has been uniquely positioned to “own that moment of incident response” and threat attribution. This has given the company an immense collection of data on cyber incidents and intrusion techniques.

“When Mandiant and Google combined,” he explained, “we took the security DNA of Mandiant…and joining — what I believe is the best AI on the planet, best machine learning on the planet, best big data on the planet — and we’re bringing what we know [about cybersecurity] to scale.”

The keynote featured several seasoned technology leaders who each shared how cloud, artificial intelligence and machine learning tools are helping their agencies achieve mission outcomes and keep pace with cybersecurity needs, including:

  • Pritha Mehra, CIO and Executive VP, United States Postal Service
  • Rajiv Rao, CTO and Deputy CIO, New York State
  • Teddra Burgess, Managing Director, Federal Civilian, Google Public Sector
  • Leigh Palmer, VP, Delivery and Operations, Google Public Sector

Watch the keynote in its entirety on the Government Summit On-Demand page. This article was produced by Scoop News Group for FedScoop and underwritten by Google Cloud.

TMF invests more than $20.8M in 3 agencies’ cyber and CX projects
https://fedscoop.com/tmf-cyber-cx-projects-october/ | Fri, 07 Oct 2022
OPM, HUD and the Army are the latest recipients of funding that will modernize IT systems and move them toward zero-trust security architectures.

An Office of Personnel Management effort to simplify its website was among three agencies’ projects receiving more than $20.8 million in the latest round of Technology Modernization Fund investments announced Thursday.

The agency plans to improve the user experience of OPM.gov for 22 million annual visitors with the $6 million it received, while the Department of Housing and Urban Development was awarded $14.8 million for IT modernization and the Army an undisclosed amount to develop a Security Operations Center-as-a-Service (SOCaaS) framework.

More than 150 proposals from 70 agencies for $2.8 billion in funding have come in since the American Rescue Plan Act infused $1 billion into the TMF. The TMF Board in June said $100 million of those funds would be dedicated to improving customer experience across public-facing digital services and systems and addressing immediate service gaps starting in fiscal 2023.

“With these new investments, the TMF now has a portfolio of 32 investments totaling over half-a-billion dollars,” said TMF Executive Director Raylene Yung in the announcement. “But beyond funding, we’re working closely with our partners to provide them with expertise and support while also empowering their own teams to execute these projects in a way that ensures maximum benefits for the public.”

OPM will update and secure its website’s content management system in the cloud to more clearly communicate the services and benefits it offers and career information, thereby improving federal hiring, said Director Kiran Ahuja.

HUD intends to implement a cloud platform integrating legacy Federal Housing Administration Connection systems with Login.gov for identity, credential and access management allowing more than 95,000 users to self-register.
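
Login.gov supports standard federation protocols such as OpenID Connect, so an integration like HUD's typically begins by redirecting users to a Login.gov authorization URL. The sketch below shows only that first leg of the flow; the client ID, redirect URI and endpoint are placeholders, and the real parameter requirements live in Login.gov's developer documentation.

    # Sketch of the first leg of an OpenID Connect authorization-code flow, the standard
    # protocol Login.gov supports. Values below are placeholders; consult Login.gov's
    # developer documentation for the actual endpoint and required parameters.
    import secrets
    from urllib.parse import urlencode

    AUTHORIZE_ENDPOINT = "https://idp.example.gov/openid_connect/authorize"  # placeholder

    def build_login_url(client_id: str, redirect_uri: str) -> str:
        params = {
            "client_id": client_id,
            "response_type": "code",
            "redirect_uri": redirect_uri,
            "scope": "openid email",
            "state": secrets.token_urlsafe(16),  # CSRF protection, verified on the callback
            "nonce": secrets.token_urlsafe(16),  # replay protection for the ID token
        }
        return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}"

    print(build_login_url("hud-example-client", "https://example.hud.gov/auth/callback"))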

The Army’s SOCaaS project will bolster network monitoring for cybersecurity threats at 26 Organic Industrial Base (OIB) sites — depots, arsenals and ammunition plants — worth $8.5 billion.

“Funding from the TMF will help the Army to address this critical and urgent need as much as two years earlier than waiting for agency funding to become available through the regular budgeting process,” said Chief Information Officer Raj Iyer in a statement. “For the Army and its partners, this project will modernize and improve our cybersecurity posture and enhance the command and control of critical OIB.”

For resiliency, the Army may look to rely more on commercial systems than SIPRNet, NIPRNet
https://fedscoop.com/for-resiliency-the-army-may-look-to-rely-more-on-commercial-systems-than-siprnet-niprnet/ | Wed, 15 Jun 2022
In order to be more resilient in the face of adversary disruptions, the Army may rely more on commercial solutions as opposed to its current network configurations like SIPRNet and NIPRNet, its CIO said.

The Army’s top IT official on Wednesday questioned the utility of the service’s current classified and unclassified network configurations and instead pointed to the possibility of relying on commercial systems that could be more resilient in future conflicts against sophisticated adversaries.

Adversaries will contest U.S. forces unlike ever before, straining the network and making it harder for data to be passed back and forth and accessed at the right time, said Army CIO Raj Iyer. As a result, forces must be more adaptable and take advantage of various means for communication and transport, such as commercial solutions.

“Our strategy again here is to get to greater resiliency, with commercial transport, using dark fiber, a heck of a lot more encryption when it comes to secret … The need for us to have … physical separation of data and networks for SIPR, or SIPR to ride on NIPR, those days are gone,” Iyer said during a presentation hosted by GovConWire.


Iyer was referencing the SIPRNet — or Secure Internet Protocol Router Network, which is the Pentagon’s network to handle secret classified information — and NIPRNet — the Non-classified Internet Protocol Router Network, which handles unclassified information.

“What we have been able to show if you have the right encryption in place that’s quantum-resistant and we were able to use solutions like commercial solutions for classified, and we have shown that today … and validated that. It really questions what do we need a SIPRNet for? Why do we need a whole separate network, that we can actually do pretty damn well with encryption,” he said. “Then absolutely the same question on NIPRNet. If we move all of our data and applications to the cloud and if I can get to a virtual desktop in the cloud and I can use any open available internet to be able to access all of that through any device, then what do we really need the NIPRNet for?”
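
The Army has not published the specifics of the encryption it validated. As a generic illustration of the underlying idea, the sketch below shows authenticated encryption with AES-256-GCM using the Python cryptography package; protecting classified traffic over commercial or open transport would involve approved algorithms (quantum-resistant ones, in Iyer's framing), rigorous key management and far more than a single library call.

    # Illustrative only: standard authenticated encryption (AES-256-GCM) with the
    # 'cryptography' package. This is NOT the Army's validated solution, and real
    # classified traffic requires approved cryptography and key management.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)
    aesgcm = AESGCM(key)

    nonce = os.urandom(12)            # must be unique per message for a given key
    message = b"situation report 0600Z"
    associated_data = b"unit:1-2IN"   # authenticated but sent in the clear

    ciphertext = aesgcm.encrypt(nonce, message, associated_data)
    assert aesgcm.decrypt(nonce, ciphertext, associated_data) == message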

These questions arise as the Army is developing its unified network plan — part of its larger digital transformation strategy — which aims to synchronize and connect the service’s enterprise and tactical network together.

Currently, silos exist between the two, creating barriers for troops who want to pass data across echelons or even theaters. This especially creates problems when troops move from one theater to another, as seen most recently in Afghanistan.

“I saw forces come into the theater that were not able to join the network right away. It was really, really cumbersome for everything that we needed to do while I was there,” Brig. Gen. Jeth Rey, director for the Army Network-Cross Functional Team, said in October.

For Iyer, the Army needs to question the status quo to evolve and succeed in future battlefield environments.

“We’re thinking out of the box. I’m not saying we have all the solutions, but we’re really going back to the direction I have from my boss, this is how we’re going to transform,” Iyer said about the Army’s modernization approach and potential for using more commercial solutions in an attempt to be more resilient from adversary disruptions.

He added that if the Army doesn’t question the status quo, it will be limited by aging technologies and architectures from the past.

One such example from the Ukraine-Russia conflict Iyer and others have pointed to is SpaceX’s Starlink satellite constellation that provides internet coverage.

After Russian attempts to jam the system in Ukraine, Starlink reported the following day that it had added new lines of code that rendered the jamming ineffective.

“We saw how Starlink is actually tremendously helping establish a communications network in an environment that we thought would be degraded on day one,” Iyer said in April.

Army forces must be able to communicate and pass data in denied and degraded environments in the future.

“As we get into more of a distributed command and control structure, what we really don’t want is a massive command post that has all of this IT in one place, where we become bullseye for our enemies,” he said Wednesday. “Moving to the distributed C2 means that we’re going to have to leave data in multiple places with greater resiliency, we’re going to have to rely on all kinds of transport, not just MILSATCOM, but commercial SATCOM, as well and this is where the example I gave you with Starlink and how we’re using that today in Europe is a great example. All of this coupled with … compute at the edge is going to be absolutely critical in terms of supporting tactical operations.”

Project to converge Army enterprise business systems will be ‘marquee effort’ for CIO
https://fedscoop.com/project-to-converge-army-enterprise-business-systems-will-be-marquee-effort-for-cio/ | Fri, 10 Jun 2022
The Army plans to award contracts to industry for IT that could help the department converge its many enterprise business systems.

The Army plans to award multiple contracts to industry for IT prototypes that could help the department modernize and integrate its many enterprise business systems.

Overall, the service plans to invest about $1.4 billion in its business systems in fiscal 2023.

“Our marquee effort in ‘23, though, is going to be our implementation or initial prototyping for our new enterprise business systems convergence. And this is our modernization of our business systems across the Army to essentially modernize 20-year-old legacy ERP [enterprise resource planning] systems,” Army CIO Raj Iyer told reporters Thursday.

“In order to support contested logistics in the future … we need the best technology and the best business systems to be able to provide us, you know, both financial management and logistic support to the field,” he said.

The portfolio the Army is looking at upgrading currently includes five ERPs and about 150 other non-ERP legacy systems.

“We’re trying to converge them into a single architecture, into a single system if we can, to the best extent possible so that we have one integrated capability” that can pass data across the spectrum of operations for analytics and other tasks, he explained.

The Army will use other transaction agreements (OTAs) for the acquisition. OTAs are intended to cut through bureaucratic red tape and facilitate faster prototyping of new technologies and follow-on production contracts.

The plan is for the program executive office for enterprise information systems to hold an industry day and release a request for white papers this summer.

The Army expects to award multiple OTAs for prototyping efforts in early fiscal 2023.

“These would run anywhere from 12 to 18 months. And then at the end of that effort … we will get to a production contract by down selecting, you know, one of those prototypes to be our production solution,” Iyer said.

Some of the things the Army will be looking at include how modular and “future proof” a proposed architecture would be, as well as its ability to support data exchange through application programming interfaces and microservices.

“This again will help us make sure that interoperability with other systems. And to be able to use that data seamlessly across other systems and analytics is a huge driver. We’ll be looking at the system being cloud-native from the get-go and making sure that we can fully benefit from a true modern architecture. And then … we’ll be looking at how flexible the solution will be in terms of its ability to implement Army-unique processes wherever we have them without the need to customize commercial off-the-shelf products,” he said.
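
As a small illustration of the API-and-microservices pattern Iyer describes, the sketch below exposes a slice of invented logistics data over HTTP with Flask so other systems and analytics tools could pull it without touching the underlying ERP; it is a generic example, not part of the Army's program.

    # Generic sketch of exposing business data through an API; the endpoint, fields and
    # data are invented and unrelated to any actual Army system. Requires Flask 2+.
    from flask import Flask, jsonify

    app = Flask(__name__)

    INVENTORY = [
        {"nsn": "1005-01-123-4567", "nomenclature": "rifle, 5.56mm", "on_hand": 240},
        {"nsn": "2320-01-234-5678", "nomenclature": "truck, utility", "on_hand": 12},
    ]

    @app.get("/api/v1/inventory")
    def list_inventory():
        """Let other systems (analytics, logistics) pull inventory data over HTTP."""
        return jsonify(INVENTORY)

    if __name__ == "__main__":
        app.run(port=8080)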

The Army seeks agility, and it will be pushing industry to use DevSecOps for new capability increments.

“We are looking for functionality to be available or released to users … on, you know, rapid sprints” every two-to-six months, Iyer said. “This is all about getting functionality in the hands of the user rapidly through agile development.”

However, the overall modernization effort will be “massive” and could take up to 10 years to implement, he said.

“We’re not going with the big bang approach,” Iyer said. “We will let the functional priorities define … what those increments will be, and then we will look at the risk profile to look at how quickly we can get those turned on. And that will determine the level of funding and the timeline for implementation.”

Army prioritizing data as it revamps for large-scale ops at the division level
https://fedscoop.com/army-prioritizing-data-as-it-revamps-for-large-scale-ops-at-the-division-level/ | Wed, 18 May 2022
Army leaders are looking at how to reduce complexity and ensure the right data gets to the right personnel at the time of need.

As the Army is homing in on large-scale operations, shifting its main unit of action from brigade to division, and aligning its priorities with joint efforts to connect sensors and shooters, leaders believe data will be the key driver.

The service’s next capability build as it incrementally modernizes its network will be focused on data and connectivity needs at the division level and other supporting elements.

The Army has adopted a multiyear strategy involving the incremental development and delivery of new capabilities to its integrated tactical network, involving a combination of program-of-record systems and commercial off-the-shelf tools. Those “capability sets” now provide technologies to units every two years, each building upon the previous delivery.

In between these deliveries, the Army has hosted several technical exchange meetings to gather members of industry, the Army acquisition community, Army Futures Command and the operational community to outline priorities and capabilities to modernize the service’s tactical network.

The most recent, in Philadelphia, was the first to focus on the division as the unit of action and the capabilities needed to support it.

“That future battlefield means that we’re going to have to truly take advantage of data at a scale and at a speed” that the service isn’t used to, Raj Iyer, the Army’s chief information officer, said at the meeting May 9. “It also means that our fight, God forbid, in a large-scale combat operation, is going to be at the division or the corps level, not at the brigade combat team level … That’s a big change for the Army.”

The service says it’s moving from a network-centric approach to a data-centric approach, which prioritizes the flow of information globally.  

For Army leaders, the first step is allowing data to flow more freely.

“The first thing we have to do is learn how we can get complexity off the transport and allow data to move freely through it,” Brig. Gen. Jeth Rey, director of the Army’s network cross-functional team, said at the conference.

But this problem also requires the Army to determine what data and how much of it resides at what echelon, which the Army is experimenting with.

In order to really achieve the data-centric model, as well as the Pentagon’s Joint All-Domain Command and Control (JADC2) concept for better connecting sensors and shooters, a true data fabric will be essential.

“In order to achieve data centricity, we have to have those cloud and systems around the world,” Rey said. “I believe in order to achieve JADC2 and sensor to shooter, a data fabric is going to be required.”

A data fabric is not a single solution, but rather, a federated environment that allows information-sharing among various forces and echelons.
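
One minimal way to picture that federated idea is a thin query layer that fans a single request out to independently owned sources and returns a merged, source-tagged view. The Python sketch below is a toy illustration with invented sources and fields, not the Army's data fabric implementation.

    # Toy illustration of the federated idea behind a data fabric: one query fans out to
    # several independent sources and returns a merged view. Not an actual Army system.
    from typing import Callable, Dict, List

    class DataFabric:
        def __init__(self):
            self._sources: Dict[str, Callable[[str], List[dict]]] = {}

        def register(self, name: str, query_fn: Callable[[str], List[dict]]) -> None:
            """Each echelon or sensor system plugs in its own query function."""
            self._sources[name] = query_fn

        def query(self, term: str) -> List[dict]:
            results = []
            for name, query_fn in self._sources.items():
                for row in query_fn(term):
                    results.append({"source": name, **row})
            return results

    fabric = DataFabric()
    fabric.register("radar_feed", lambda term: [{"track_id": "A-12", "matched": term}])
    fabric.register("logistics_db", lambda term: [])
    print(fabric.query("hostile-track"))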

This vision was solidified for Rey at a Project Convergence experiment last year. Project Convergence is an annual Army experiment that tests technologies and concepts associated with the Pentagon’s JADC2 effort.

“Coming out of PC ’21 is where we really saw the requirement for a data fabric,” Rey told reporters on the sidelines of the conference. “You’ve heard us talk sensor to shooter, sensor to shooter, but the only thing that really allowed us to pull those sensors in and make sense of the sensor information was by having a data fabric. That’s why we really focus this to this [technical exchange meeting] on data, because that’s where we needed to talk.”

Rey called a data fabric the “underpinning” of what’s needed to visualize what’s happening on the battlefield with sensors.

This will be critical to achieving not only JADC2 goals, but also realizing what some officials call the “kill chain of the future.”

Next year’s Project Convergence will bring coalition partners into the fold for the first time. Military officials always stress that the U.S. does not fight alone, and thus it needs to be able to pass data back and forth to coalition partners so the optimal asset can take action.

“It’s not only just data, we’re talking about sensor data off of our mission systems to ensure we can share it not only with our joint partners, but also share it with our coalition partners,” Rey said. “Can they tie into our Patriot missiles and then share the data and then the best system shoot? That’s what we’re going to try to find out about the integration piece of data.”

Officials said they want the best assets in the best areas to be able to take action.

That means leveraging “the right sensor through the right command-and-control node, getting to the right shooter, to be able to facilitate that kill chain and you’re getting information to decision-makers,” Maj. Gen. Robert Collins, program executive officer for command, control, communications-tactical, told reporters.

Army CISO Easley departs for role at the Pentagon
https://fedscoop.com/army-ciso-easley-departs-for-role-in-pentagon/ | Tue, 22 Feb 2022
Maj. Gen. Matt Easley has taken a role as the deputy principal information operations advisor to the secretary of defense.

Maj. Gen. Matt Easley has stepped down from his role as director of cybersecurity and chief information security officer for the Army.

Easley has taken a role as the deputy principal information operations adviser to the secretary of Defense within the office of the undersecretary for policy.

The fiscal 2020 National Defense Authorization Act created the role of principal information operations adviser, which since then has been filled by the undersecretary of Defense for policy, currently Colin Kahl. As such, Easley will report through Kahl to Secretary of Defense Lloyd Austin on matters of information operations.

Easley served as the Army’s director of cybersecurity since August 2020, shortly after the service’s Office of the CIO was restructured. Before that, he was director of the Army’s Artificial Intelligence Task Force for roughly two years.

In a LinkedIn post, Army CIO Raj Iyer wrote of his departed colleague: “A bittersweet moment for us in the Army Chief Information Officer family as we bid farewell to Major General Matt Easley, our first Chief Cybersecurity Officer in the new OCIO. Matt has set us up on an awesome path to zero trust for both IT and Operational Technology. The Army’s FISMA scorecard under his leadership is the best across the entire DoD. Thank you Matt for everything you have done for us and best wishes on a very critical role in the DoD moving forward.”

Easley was recognized last fall as one of the 2021 CyberScoop 50 in the government category.

The Army did not comment on Easley’s departure.

After calls to ‘fix our computers,’ military CIOs pledge to get it right
https://fedscoop.com/after-calls-to-fix-our-computers-military-cios-pledge-to-get-it-right/ | Fri, 04 Feb 2022
Top CIOs say they "have taken the dialogue around the need to 'fix our computers' at DoD to heart."

CIOs across the Department of Defense have heeded the widely circulated complaints of defense personnel, who took to social media platforms in recent weeks calling on senior leaders to “fix our computers.”

“The DoD and Military Department CIOs have taken the dialogue around the need to ‘fix our computers’ at DoD to heart,” reads a brief note posted on LinkedIn Friday, signed jointly by DOD CIO John Sherman and Kelly Fletcher, his acting principal deputy CIO, as well as Air Force CIO Lauren Knausenberger, Army CIO Raj Iyer and Navy CIO Aaron Weis.

“We know there is a lot of work to do to make your user experience better, increase our #cybersecurity, and enable modern office productivity and analytical capabilities. We definitely haven’t been standing still on this point, however, and ensuring we deploy increasingly improving capabilities for you— the folks getting the work done every day in the Department— is our priority,” the note adds.

It comes a week after Michael Kanaan, director of operations for the U.S. Air Force and MIT Artificial Intelligence Accelerator, posted to LinkedIn a spirited open letter pleading with DOD leadership to “fix our computers.”

“Yesterday, I spent an hour waiting just to log-on. Fix our computers,” Kanaan wrote. “I Googled how much the computer under my desk costs in the real-world. It was $108 dollars. Would you ever buy a $100 dollar computer? Fix our computers.”

Kanaan makes the point in his letter that some of the most menial tasks, like sending an email, take more than an hour rather than minutes. And because of that, work hours that could go to innovation inside “the richest and most well-funded military in the world” are instead spent waiting on basic IT functions to work.

“Want innovation?” he wrote. “You lost literally HUNDREDS OF THOUSANDS of employee hours last year because computers don’t work. Fix our computers.”

Kanaan’s post caught the attention of many, including some senior DOD IT officials who responded with their thoughts directly in comments on the post. As of publication, it had spurred 369 comments and 2,285 “reactions.”

Since that initial post, other DOD tech officials have piled into the discussion, including Jason Weiss, DOD’s chief software officer, who wrote, citing his personal opinion: “Just one day I want to be productive at work and give my best for the warfighter without sitting here watching Outlook and Teams fight to see who can crash more often and render me useless in my humble pursuit of productivity.”

Likewise, Artem Sherbinin, a navigation officer in the Navy, wrote a similar post calling for senior leaders to not only fix the computer hardware but to “fix our software too,” adding that service members on the frontlines must be part of that solution.

The defense CIOs said in their post that work is underway to better enable secure telework, provide personnel with “new, higher-performing laptops,” and decrease network latency.

To this, Kanaan wrote on Twitter: “Responding directly to the needs of US servicemembers & employees is genuine progress in the right direction. I know we are all glad to see this conversation getting the attention it deserves, and appreciate it deeply.”

Army working to deploy first OCONUS cloud system in the Indo-Pacific
https://fedscoop.com/army-working-to-deploy-first-oconus-cloud-system-in-the-indo-pacific/ | Mon, 24 Jan 2022
The Army is building a hybrid cloud that uses both on-premise data centers and commercial cloud services to operate outside the continental U.S., or OCONUS, said CIO Raj Iyer.

The Army is in the beginning stages of building out its first tactical cloud system for its forces in the Pacific this year.

The system will be the Army’s first hybrid cloud that uses both on-premise data centers and commercial cloud services to operate outside the continental U.S., or OCONUS, the Army’s top IT official Raj Iyer said. The new hybrid capability should increase the Army’s ability to store and process data in the command, a resource that is in high demand across the military as it works to modernize.

“It allows us to integrate cloud into all aspects of experimentation,” Iyer said recently during AFCEA NOVA’s Army IT day.

Army spokesman Bruce Anderson provided more detail on the status of the system, saying Army Pacific is “analyzing information exchange, system, and service requirements to determine the optimal locations for cloud-hosted capabilities.”

The Army plans to run a “series of exercises, experimentations, and basic application analysis” through fiscal 2023 on its journey for getting cloud to the edge in the Pacific, he said.

OCONUS cloud became a new top priority for the Department of Defense after the department published its first strategy for getting cloud tech to the "tactical edge" in May. The document, signed by the DOD CIO, calls for DOD to negotiate with foreign partners to get more cloud tech into bases overseas. The strategy highlights the importance of what many senior leaders refer to as the "tactical edge" — getting data processed closer to where it is generated and where decisions need to be made, in order to increase the speed of operations.

“We are well on our way to actually implementing the first OCONUS cloud in the Indo-Pacific,” Iyer said.

Despite the Army’s efforts to shutter data centers, it is planning to use them in the hybrid cloud architecture in the Pacific, Iyer said. The Army is “integrating our commercial compute and store in the cloud with our on-premise resources that we have in our data centers,” he explained.
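
The Army has not said how workloads will be split between the two environments. As a purely hypothetical sketch of what a hybrid placement rule can look like, the function below routes a workload to on-premise or commercial infrastructure based on invented criteria such as classification, latency sensitivity and link status.

    # Purely hypothetical placement rule for a hybrid-cloud broker; the criteria and
    # thresholds are invented and do not reflect the Army's actual architecture.
    def place_workload(classification: str, latency_sensitive: bool, theater_link_up: bool) -> str:
        """Decide whether a workload runs on-premise in theater or in a commercial cloud region."""
        if classification != "unclassified":
            return "on_premise"        # keep higher-classification processing local
        if latency_sensitive or not theater_link_up:
            return "on_premise"        # favor edge compute when the long-haul link is degraded
        return "commercial_cloud"      # burst everything else to the commercial region

    print(place_workload("unclassified", latency_sensitive=False, theater_link_up=True))
    print(place_workload("secret", latency_sensitive=False, theater_link_up=True))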

So far no contracts or task orders have been signed with cloud service providers that work with the DOD for this effort, Anderson told FedScoop. The goal is to use cloud to extend the services already available with on-premise data centers in theater and eventually be able to provide a corps-level common operating picture.

“We are currently developing the operational requirements for the cloud … We are also working with commercial service providers to understand how they may be able to meet Army requirements,” Anderson said.
