{"id":96269,"date":"2026-04-17T15:45:26","date_gmt":"2026-04-17T12:45:26","guid":{"rendered":"https:\/\/u1f987.com\/en\/?p=96269"},"modified":"2026-04-17T15:50:25","modified_gmt":"2026-04-17T12:50:25","slug":"google-and-pentagon-discuss-integration-of-gemini","status":"publish","type":"post","link":"https:\/\/u1f987.com\/en\/google-and-pentagon-discuss-integration-of-gemini\/","title":{"rendered":"Google and Pentagon Discuss Integration of Gemini"},"content":{"rendered":"<p>Google is in discussions with the U.S. Department of Defense about integrating Gemini into Pentagon systems across all levels of information access, from open to strictly classified. This was reported by <a href=\"https:\/\/www.theinformation.com\/articles\/google-pentagon-discuss-classified-ai-deal-company-rebuilds-military-ties\">The Information<\/a>, citing sources.<\/p>\n<p>The agreement would allow the Department of Defense to use Google&#8217;s technology &#8220;for all lawful purposes.&#8221;<\/p>\n<p>During the negotiations, Google proposed additional provisions to prevent the use of Gemini for mass surveillance of citizens or the creation of autonomous weapons.<\/p>\n<p>A Pentagon representative declined to comment directly on the dialogue with Google. However, he emphasized that the department plans to continue implementing advanced AI technologies through close collaboration with the private sector at all levels of secrecy.<\/p>\n<h2 class=\"wp-block-heading\">Anthropic Controversy<\/h2>\n<p>A recent conflict over precisely these terms arose between the Pentagon and AI startup Anthropic. It escalated into a legal dispute and <a href=\"https:\/\/u1f987.com\/en\/news\/us-government-to-cease-use-of-anthropic-technologies\">a ban<\/a> by U.S. President Donald Trump on the use of the company&#8217;s technologies in all federal agencies.<\/p>\n<p>The groundwork for this was laid in July 2025. At that time, the U.S. 
Department of Defense <a href=\"https:\/\/u1f987.com\/en\/news\/pentagon-taps-anthropic-google-openai-and-xai-for-up-to-200m-in-ai-security-work\">signed<\/a> contracts worth up to $200 million with Anthropic, Google, OpenAI, and xAI for the development of AI solutions in the security sector. The department&#8217;s Chief Digital and AI Office planned to use their solutions to create agent systems. Among all contractors, only Anthropic&#8217;s tools were integrated into classified environments due to their high quality.<\/p>\n<p>However, by January 2026, the WSJ <a href=\"https:\/\/www.wsj.com\/tech\/ai\/anthropic-ai-defense-department-contract-947d5f33?mod=article_inline\">reported<\/a> on the risk of the agreement being terminated. Disagreements arose due to the startup&#8217;s strict ethical policy: Anthropic&#8217;s rules prohibit using the Claude model for mass surveillance and in autonomous lethal operations.<\/p>\n<p>Officials&#8217; dissatisfaction grew amid the <a href=\"https:\/\/u1f987.com\/en\/news\/pentagon-to-integrate-grok-chatbot-into-its-network\">integration<\/a> of the Grok chatbot into the Pentagon&#8217;s network. Defense Secretary Pete Hegseth, commenting on the partnership with xAI, emphasized that the department &#8220;will not use models that do not allow for warfare.&#8221;<\/p>\n<p>The situation <a href=\"https:\/\/u1f987.com\/en\/news\/wsj-u-s-military-used-claude-in-raid-to-capture-maduro\">escalated<\/a> in February 2026, when the U.S. 
Army used Claude in an operation to capture Venezuelan President Nicol\u00e1s Maduro.<\/p>\n<p>Anthropic CEO Dario Amodei <a href=\"https:\/\/www.anthropic.com\/news\/statement-department-of-war\">stated<\/a> that Anthropic would rather forgo cooperation with the Pentagon than agree to use its technologies in a way that could &#8220;undermine rather than protect democratic values.&#8221; He confirmed that the issue lies in the potential use of tools like Claude for two purposes: &#8220;domestic mass surveillance&#8221; and &#8220;fully autonomous weapons.&#8221;<\/p>\n<p>Anthropic&#8217;s principled stance boosted the popularity of the startup&#8217;s products, as users appreciated the company&#8217;s willingness to defend its values.<\/p>\n<p><script async src=\"https:\/\/telegram.org\/js\/telegram-widget.js?23\" data-telegram-post=\"forklogAI\/6815\" data-width=\"100%\"><\/script><\/p>\n<p>Meanwhile, OpenAI faced criticism for its agreement with the Department of Defense. Subsequently, Sam Altman called the decision hasty.<\/p>\n<p><script async src=\"https:\/\/telegram.org\/js\/telegram-widget.js?23\" data-telegram-post=\"forklogAI\/6816\" data-width=\"100%\"><\/script><\/p>\n<h2 class=\"wp-block-heading\">Back to the Roots<\/h2>\n<p>The current negotiations between Google and the Pentagon are reminiscent of events in 2018. At that time, the corporation withdrew from Project Maven\u2014a program in which artificial intelligence was used to analyze drone footage.<\/p>\n<p>The company abandoned the project due to <a href=\"https:\/\/www.bbc.com\/news\/business-43656378\">strong employee dissatisfaction<\/a>. More than 3,000 people signed an open letter to management demanding the cessation of AI development for combat drones. 
This incident sparked a broad discussion in the tech industry about the permissibility of using algorithms for military purposes.<\/p>\n<p>Since then, Google has steadily rebuilt its relationship with the government. In 2022, the company <a href=\"https:\/\/publicsector.google\/\">created<\/a> a special division for working with the public sector.<\/p>\n<p>In 2024, the company signed a cooperation agreement in the field of AI with the Department of Defense. It was limited to the use of technology in non-classified areas.<\/p>\n<p>In early 2025, Google removed from its internal AI principles a clause that explicitly prohibited the use of technology in &#8220;weapons and surveillance systems.&#8221;<\/p>\n<p>The potential signing of a contract allowing work in classified environments signifies a move beyond ordinary &#8220;cooperation.&#8221; Such a step indicates readiness for full-scale integration of the company&#8217;s technologies into key defense operations.<\/p>\n<h2 class=\"wp-block-heading\">Employee Opposition<\/h2>\n<p>Google&#8217;s involvement in defense contracts continues to provoke internal dissatisfaction.<\/p>\n<p>In 2018, the company <a href=\"https:\/\/futureoflife.org\/open-letter\/lethal-autonomous-weapons-pledge\/?cn-reloaded=1\">pledged<\/a> not to participate in the development of lethal autonomous weapons. However, in April 2024, it <a href=\"https:\/\/time.com\/6966102\/google-contract-israel-defense-ministry-gaza-war\/\">became known<\/a> that cloud computing services and AI access were provided to the Israeli Ministry of Defense.<\/p>\n<p>In light of this, about 200 employees of DeepMind, Google&#8217;s AI-focused division, issued an official protest. They called for the termination of agreements with defense agencies. 
The authors of the initiative fear that the lab&#8217;s cutting-edge research is being handed to militaries engaged in active combat.<\/p>\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>&#8220;Any involvement in military and weapons production negatively impacts our position as a leader in ethical and responsible AI, and contradicts our mission and stated AI principles,&#8221; reads the document, which gathered signatures from about 5% of employees.<\/p>\n<\/blockquote>\n<h2 class=\"wp-block-heading\">Pentagon Insists<\/h2>\n<p>In February, the media <a href=\"https:\/\/www.reuters.com\/business\/pentagon-pushing-ai-companies-expand-classified-networks-sources-say-2026-02-12\/\">learned<\/a> that the Pentagon is urging leading AI companies to make their tools available in classified environments without the standard restrictions usually applied to users.<\/p>\n<p>Such networks are used for a wide range of confidential tasks, such as planning operations or targeting weapons. The military wants to leverage AI&#8217;s ability to synthesize information to aid decision-making.<\/p>\n<p>However, neural networks can make mistakes and fabricate information. 
Using such tools in combat conditions could have fatal consequences.<\/p>\n<p>AI companies build safeguards into their products to reduce these risks, but the Pentagon is frustrated by the resulting limitations.<\/p>\n<p>The Department of Defense <a href=\"https:\/\/u1f987.com\/en\/news\/us-and-germany-enhance-ai-integration-in-military-sector\">intends<\/a> to accelerate decision-making with the help of artificial intelligence and make it a central element of its strategy to integrate sensors and strike systems in combat operations.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Google is discussing with the Department of Defense the integration of Gemini into Pentagon systems across all levels of information access, from open to strictly classified.<\/p>\n","protected":false},"author":1,"featured_media":96270,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"select":"1","news_style_id":"1","cryptorium_level":"","_short_excerpt_text":"Google and Pentagon discuss integrating Gemini into classified systems.","creation_source":"","_metatest_mainpost_news_update":false,"footnotes":""},"categories":[3],"tags":[438,738,1824],"class_list":["post-96269","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news-and-analysis","tag-artificial-intelligence","tag-google","tag-pentagon"],"aioseo_notices":[],"amp_enabled":true,"views":"13","promo_type":"1","layout_type":"1","short_excerpt":"Google and Pentagon discuss integrating Gemini into classified 
systems.","is_update":"","_links":{"self":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/96269","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/comments?post=96269"}],"version-history":[{"count":1,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/96269\/revisions"}],"predecessor-version":[{"id":96271,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/posts\/96269\/revisions\/96271"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/media\/96270"}],"wp:attachment":[{"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/media?parent=96269"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/categories?post=96269"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/u1f987.com\/en\/wp-json\/wp\/v2\/tags?post=96269"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}