đŸ€©Microsoft’s New AI Needs No GPUs

PLUS: OpenAI Debuts o3, o4-mini

In partnership with Writer

Reading time: 5 minutes

Today we will discuss:

  ‱ Microsoft's BitNet b1.58 2B4T, a 1-bit AI model that runs without GPUs

  ‱ OpenAI's launch of o3, o4-mini, and Codex CLI

You’ve heard the hype. Now it’s time for results

After two years of siloed experiments, proofs of concept that fail to scale, and disappointing ROI, most enterprises are stuck. AI isn't transforming their organizations — it’s adding complexity, friction, and frustration.

But Writer customers are seeing a positive impact across their companies. Our end-to-end approach is delivering adoption and ROI at scale. Now, we’re applying that same platform and technology to bring agentic AI to the enterprise.

This isn’t just another hype train that doesn’t deliver. The AI you were promised is finally here — and it’s going to change the way enterprises operate.

See real agentic workflows in action, hear success stories from our beta testers, and learn how to align your IT and business teams.


Key Points 

  ‱ BitNet b1.58 2B4T compresses each model weight into one of just three values, making it faster and more memory-efficient.

  • The model outperforms Meta’s Llama, Google’s Gemma, and Alibaba’s Qwen models in math and commonsense tests.

đŸ‘šâ€đŸ’»News - Microsoft researchers have developed BitNet b1.58 2B4T, the largest 1-bit AI model created so far. It is now available under an MIT license and can run on CPUs, including Apple's M2 chip. Unlike most AI models that need powerful GPUs, this one is designed to work on more everyday hardware.

đŸ‘ŸWhat are BitNets & why they matter - BitNets are a new kind of compressed AI model. They work by quantizing the model’s internal values, called weights, down to just three options: -1, 0, and 1. Traditional models typically store each weight as a 16- or 32-bit floating-point number, so this ternary scheme makes BitNets much faster and far more memory-efficient.
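To make that concrete, here is a minimal Python sketch of ternary ("absmean"-style) weight quantization, the general technique behind 1-bit models. The function name and details are illustrative assumptions, not code from Microsoft’s model or its bitnet.cpp framework.

```python
import numpy as np

def absmean_ternary_quantize(weights: np.ndarray, eps: float = 1e-8):
    """Quantize a float weight tensor to {-1, 0, 1} plus one scale factor.

    Illustrative sketch of the 'absmean' idea used by 1-bit/1.58-bit models:
    scale by the mean absolute weight, round, then clip to [-1, 1].
    """
    scale = np.abs(weights).mean() + eps                      # one scalar per tensor
    ternary = np.clip(np.round(weights / scale), -1, 1).astype(np.int8)
    return ternary, scale   # packed ternary values need ~1.6 bits each vs. 16-32 bits per float

# Usage: reconstruct approximate weights with ternary * scale.
w = np.random.randn(4, 4).astype(np.float32)
q, s = absmean_ternary_quantize(w)
print(q)        # entries are only -1, 0, or 1
print(q * s)    # coarse reconstruction of the original weights
```

Because every weight becomes -1, 0, or 1, most of the multiplications in a matrix multiply collapse into additions and subtractions, which is where the speed and memory savings come from.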

đŸ€–How it performs - BitNet b1.58 2B4T has 2 billion parameters and was trained on a massive dataset of 4 trillion tokens, about the same as 33 million books. In tests, it performed better than other well-known models like Meta’s Llama 3.2 1B, Google’s Gemma 3 1B, and Alibaba’s Qwen 2.5 1.5B on tasks like math problems and commonsense reasoning. It also runs faster and uses much less memory compared to models of a similar size.

To hit those numbers, BitNet relies on Microsoft’s custom inference framework, bitnet.cpp, which currently runs only on certain CPUs and does not support GPUs, a major gap given that GPUs dominate today’s AI infrastructure.

🌟The bottom line - Bitnets could be a game-changer for running AI on resource-constrained devices, but broader compatibility challenges still loom large.

We’ve just launched the 15th edition of Workflow Wednesday for AI-minded professionals like you—actionable AI workflows delivered straight to your inbox.

This week’s topic: AI Problem Solving


Key Points 

  • OpenAI launched o3 and o4-mini, offering advanced reasoning, image processing, Python execution, and web browsing features.

  • Along with the models, OpenAI introduced Codex CLI, a tool for developers to integrate AI with local coding tasks.

🚀News - OpenAI has launched two new AI reasoning models, o3 and o4-mini. 

o3 is being touted as the company’s most advanced model yet, outperforming earlier versions across math, coding, reasoning, science, and visual understanding tests. On the other hand, o4-mini offers a strong balance of price, speed, and performance, making it an appealing choice for developers.

Both models can now perform tasks using ChatGPT tools like web browsing, Python code execution, and image generation. They also bring a major update: the ability to analyze images. Users can upload images, such as whiteboard sketches or diagrams, and the models will process them during their reasoning phase, even handling blurry or low-quality images.
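For developers, here is a minimal sketch of what sending an image to one of the new models could look like with the OpenAI Python SDK. The model ID, prompt, and image URL are illustrative assumptions; check OpenAI’s documentation for the exact parameters available to your account.

```python
# Minimal sketch: asking o4-mini to reason over an uploaded diagram.
# Assumes the openai Python SDK and an OPENAI_API_KEY environment variable;
# model availability and exact IDs can vary by account and may change.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="o4-mini",  # assumed model ID; o3 works the same way if enabled for you
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Explain the architecture drawn in this whiteboard photo."},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/whiteboard.png"}},  # placeholder URL
            ],
        }
    ],
)

print(response.choices[0].message.content)
```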

✹What’s more? Alongside these models, OpenAI introduced Codex CLI, a new coding agent that runs locally in your terminal. Codex CLI connects OpenAI’s models to local coding tasks, allowing them to write and edit code and even move files. While it can’t fully build apps just yet, it’s a step toward OpenAI’s goal of creating an “agentic software engineer” that can handle complete app development projects. Read more here.

đŸ€“What’s next? In the coming weeks, OpenAI will also release o3-pro, an upgraded version of o3 for ChatGPT Pro users that will use more computing resources for its responses.

đŸ™†đŸ»â€â™€ïžWhat else is happening?

đŸ‘©đŸŒâ€đŸš’Discover mind-blowing AI tools

  1. Learn How to Use AI - Starting January 8, 2025, we’re launching Workflow Wednesday, a series where we teach you how to use AI effectively. Lock in early bird pricing now and secure your spot. Check it out here

  2. OpenTools AI Tools Expert - Find the perfect AI tool to supercharge your workflow. This GPT is connected to our database, so you can ask in-depth questions about any AI tool directly in ChatGPT (free)

  3. Presentations.AI - An AI-powered tool that enables users to create professional and engaging presentations quickly and effortlessly

  4. ChefGPT - An AI-powered recipe recommendation tool that suggests recipes based on the ingredients and tools you have

  5. 60sec.site - A no-code website builder that allows users to create a landing page in just 60 seconds

  6. WatchThis.Dev - A tool that provides users with curated show or movie recommendations

  7. BaruaAI - An AI-powered platform that generates personalized, high-converting sales emails

  8. Dupe Any Product - A tool that helps users find affordable dupes of home decor items

  9. PDF Candy - Allows users to convert files to and from PDF format ($3.60/month)

  10. Turbologo - An online logo maker tool that helps people to create professional and high-quality logos in minutes

How likely is it that you would recommend the OpenTools' newsletter to a friend or colleague?


Interested in featuring your services with us? Email us at [email protected]