BM
@breakingmetrics
Apr 1, 2026 · 8:09 PM
ai

Oracle cut 30,000 people yesterday to fund AI data centers. Their net income last quarter was up 95%. This wasn't a struggling company cutting to survive. It was a profitable company eliminating the workforce that AI is replacing. The economy is splitting in two. The real question is where's the money actually going?


Here's the split. The Oracle employee who got a termination email at 6am yesterday has a college degree, a desk job, and a skillset that a language model can replicate for a fraction of their salary. The laborer pouring the concrete foundation for the data center that replaced them has none of those problems. You can't offshore a concrete pour. You can't automate an ironworker framing the structure. The trades are about to have the best decade they've seen in a generation.


But it won't last forever. Data center construction is a build cycle, not a permanent employment base. You build it, you commission it, and the crew moves on. What stays behind is a facilities and maintenance workforce that's smaller, more specialized, and (sometimes) better paid than what came before. The construction boom is real, but so is the cliff at the end of it.


The reality for the next five years: white-collar tech employment contracts sharply while trades employment surges, then plateaus at a higher baseline than before. The real question is where these laid-off workers go.

This is what Breaking Metrics tracks: where capital is actually flowing and who it leaves behind. If that's worth following, check it out here: breakingmetrics.substack.com


BM
@breakingmetrics
Mar 24, 2026 · 8:49 PM
ai ★ Featured

AI isn't the internet. It's not going to be for everyone. What we're watching right now is an industry figuring out what it actually is. The consumer market is nice to have. Department of War discretionary money is even better.


OpenAI raised $120 billion and killed a consumer product in the same week. The company matured and realized where the real money is. Palantir and Anduril already proved it. There is enormous demand for autonomous systems, battlefield AI, and drone intelligence at the Department of War level. Public agencies have deep pockets, long contracts, and no price sensitivity.


Every major AI company is going to be chasing some version of that contract over the next two years. The consumer subscription stays, same reason HD Supply or Grainger still sells to the occasional homeowner. But that's not the business. The business is the GC account tied to a $500 million public contract. Anyone valuing these companies on subscriber counts is reading the wrong line item. I discuss industry, materials, energy, and more: https://www.breakingmetrics.com


BM
@breakingmetrics
Mar 3, 2026 · 12:14 AM
ai

Sam Altman just admitted he was "sloppy" and "opportunistic" securing a $200M Pentagon AI contract. The language is being amended. What the contract already permits is not. Those permissions are being stress tested in live combat over Iran right now.


I've built bridges and managed nine-figure infrastructure contracts for government agencies across the NYC metro area. The first thing every experienced contractor learns is that silence in a contract is not neutrality. Agencies don't leave ambiguity in specs by accident. They leave it on purpose.

OpenAI signed a national security contract with language Altman now admits was careless. Government agencies have entire legal teams dedicated to contract language. A CEO focused on closing a $110 billion funding round the same week was not reading the fine print.

The U.S. military is now running AI targeting systems processing strike decisions faster than human commanders can review them. Anthropic refused to remove human oversight conditions from their contract. OpenAI removed them to close the deal.

The conditions Anthropic refused to remove are now central to how AI is being used in live combat. The military used Anthropic for strike planning in Iran even after the contract dispute. The question of who sets the boundaries on autonomous targeting, the vendor or the Pentagon, remains unanswered.

Facing backlash, OpenAI’s Sam Altman says he made a ‘sloppy’ mistake in Pentagon deal
The OpenAI CEO admits he came off as “opportunistic” in how he announced the company’s new government deal and says he will amend the language to emphasize restrictions
glideslope.ai

BM
@breakingmetrics
Mar 2, 2026 · 12:16 AM
ai

The Pentagon just handed a $200M AI contract to a company TechCrunch says is unequipped to handle national security responsibilities. I've spent 16 years watching government agencies pick vendors. This is what happens when urgency overrides due diligence.


Transportation agencies figured this out decades ago. Public-private partnerships use a slow, structured interview process to select design-build teams precisely because the cost of picking the wrong vendor is catastrophic. The DoD just skipped that entire playbook to move fast on AI.

When a government agency is under pressure to move fast, vetting goes out the window. The contract goes to whoever says yes first, compliance conditions get stripped, and ethical guardrails get negotiated away. The agency gets what it needs today and inherits the risk tomorrow.

Anthropic walked away because they understood the weight of it. OpenAI walked in because the money was right. The real question nobody is asking: can OpenAI actually build a model capable of handling national security operations at the level this requires? It appears the military has just bet hundreds of millions on the answer being yes.

No one has a good plan for how AI companies should work with the government
As OpenAI transitions from a wildly successful consumer startup into a piece of national security infrastructure, the company seems unequipped to manage its new responsibilities.
glideslope.ai
