šŸ‡ØšŸ‡³ China’s DeepSeek Defies U.S. Bans

PLUS: OpenAI’s Hitachi deal & bankers training their AI replacements

In partnership with

Reading time: 5 minutes

Hey Opentoolers,

You asked for more stories, we listened. This edition’s packed so you can stay sharp without doomscrolling 15 tabs. We’re keeping it fast, punchy, and useful — the way AI news should be. Think of it as your edge in a world where the pace won’t slow down, but we’ll make sure you don’t fall behind.

šŸ—žļøIn this edition

  • China’s DeepSeek Defies U.S. Bans

  • Sponsored: Pressmaster – AI for cognitive amplification

  • Hitachi Powers OpenAI’s Data Centers

  • Bankers Training Their Own AI Replacements

  • Workflow Wednesday #41 ā€˜AI & Human Creativity’

  • In other AI news –

    • IBM taps Groq for high-speed AI inference

    • AI in automotive forecast to hit $21B by 2030

    • Universities wrestle with AI’s role in learning

    • 4 must-try AI tools

China’s DeepSeek Defies U.S. Bans

Source: Asia Times

What's happening:

DeepSeek, a Chinese startup, released R1, an AI model that rivals OpenAI's ChatGPT and cost under $6M to train. The founder stockpiled over 10,000 now-banned Nvidia chips before export controls hit, then built efficiency tricks to compensate for weaker hardware.

Nvidia lost nearly $600B in market cap when the news broke. Reports claim DeepSeek has 50,000 GPUs total, including banned H100s potentially acquired through shell companies.

Huawei's also showcasing new domestic chips. Beijing's now urging Chinese tech giants to stop buying Nvidia's compliance chips entirely.

Why this is important:

U.S. export controls were supposed to slow China's AI progress. Instead, they're forcing Chinese companies to innovate around constraints—building efficient models that do more with less.

DeepSeek built on Meta's research and uses Nvidia chips, so it's not fully independent. But the combination of algorithmic efficiency and stockpiled hardware proved the "small yard, high fence" strategy has gaps.

Our personal take on it at OpenTools:

Export controls assumed compute was the bottleneck. DeepSeek proved software efficiency matters more than people thought.

Trump called it a "wake-up call." He's right. China's not just catching up—they're doing it cheaper and open-sourcing the results, which builds soft power globally.

The U.S. bet everything on hardware advantage. China's optimizing around it and winning developer mindshare in the process.

The Future of AI Communication Starts Here

Every day great ideas fade because turning complex thinking into clear communication takes time most people do not have.

Pressmaster.ai changes that. It captures how you think, amplifies it into powerful communication, and shares it in your authentic voice.

This is not about writing faster. It is about thinking louder.

Join us on October 15 at 12 PM ET or 6 PM CET for The World’s First AI Built for Cognitive Amplification.

Discover how one short conversation becomes thirty days of strategic, market-ready content.

Be part of the launch of a new category where AI becomes part of your cognitive architecture.

Do you like the new OpenTools Newsletter Format?


Hitachi Powers OpenAI’s Data Centers

Source: Seeking Alpha

What's happening:

Hitachi and OpenAI signed a deal to build next-gen AI data centers globally.

The focus: zero-emission facilities, securing long-lead equipment, and modular designs that cut construction time. Hitachi's investing $1B+ in U.S. transformers and high-voltage gear.

OpenAI gets power grids, cooling tech, and operational expertise. Hitachi integrates OpenAI's LLMs into their systems.

Why this is important:

OpenAI's worth $500B but isn't profitable. The bottleneck isn't just compute—it's infrastructure.

Data centers need power, cooling, and speed. Hitachi solves all three while addressing energy concerns before regulators force the issue.

This is the unglamorous buildout that makes AI scale possible.

Our personal take on it at OpenTools:

Everyone obsesses over models. No one talks about whether the grid can handle it.

The AI race is about securing physical infrastructure at scale. Modular data centers and zero-emission targets give OpenAI the foundation they need while staying ahead of regulatory pushback.

Boring infrastructure makes exciting AI possible.

Bankers Training Their Own AI Replacements

Source: Fintech Weekly

What's happening:

OpenAI hired 100+ ex-bankers from JPMorgan, Goldman Sachs, and Morgan Stanley for "Project Mercury." They're paid $150/hour to train AI on financial modeling: IPOs, restructurings, LBOs.

The work junior analysts grind 80-hour weeks on in Excel.

The application process: an AI chatbot interview, finance knowledge tests, and modeling tests. Contractors submit one model per week and get early access to OpenAI's finance tools.

Why this is important:

Junior banking's getting automated. The "pls fix" culture—endless Excel tweaks—is exactly what LLMs handle.

OpenAI needs commercial wins to justify their $500B valuation while still being unprofitable. Finance is the obvious target.

Real bankers training real tools means this isn't theoretical anymore.

Our personal take on it at OpenTools:

The irony: ex-bankers earning $150/hour to train the AI that replaces the junior analysts they once were.

If AI handles grunt work, entry-level roles either disappear or evolve. But banking's apprenticeship model assumes you earn your stripes in Excel hell before moving up.

OpenAI's solving for efficiency. Banks need to solve for succession. Analyst classes will be smaller in five years.

This Week in Workflow Wednesday #41: AI & Human Creativity

Workflow #1: From Creative Thoughts to Instagram Post with Canva AI (free).
Step 1: Head to Canva AI → Text-to-Image. Type the idea you’ve been sitting on.
Step 2: Generate 4–5 variations. Go back into Canva AI and insert the prompt… we explore this and 2 more workflows inside this week’s edition of Workflow Wednesday.

  • IBM taps Groq for high-speed AI inference // IBM is integrating Groq’s inference hardware into watsonx, claiming 5Ɨ faster runtime for real-time AI tasks like fraud detection and customer support. A play to win on inference speed, not just model size.

  • AI in automotive forecast to hit $21B by 2030 // The automotive AI market is projected to grow from $6.2B in 2025 to $21B by 2030, driven by predictive maintenance, smarter in-car systems, and regulatory pushes for safety and emissions.

  • Universities wrestle with AI’s role in learning // Nature reports colleges are rolling out AI agents for course guidance and tutoring, sparking debate: does this supercharge education or weaken critical thinking skills?

šŸ‘©šŸ¼ā€šŸš’Discover mind-blowing AI tools

  1. Careered - An online platform that helps users create cover letters from resumes and job listings

  2. Wav2Lip for Automatic1111 - A tool that generates lip-sync videos by combining a video and a speech file

  3. Motionagent - An AI assistant that helps users convert their ideas into motion pictures

  4. AI Poem Generator - AI-powered tool that generates unique rhyming poems on any subject

P.S.

That’s a wrap for today. We’re testing this new style — more stories, sharper takes, less fluff. If you’re into it (or think we missed the mark), hit reply and let us know. Your feedback keeps this newsletter worth opening.

How did you like this version?


Interested in featuring your services with us? Email us at [email protected]