Hey friend 👋 — Annie & Téa here!
Welcome back to The Prompt Circuit — your weekly guide to what’s actually happening in AI, why it matters, and how you can use it today.

This week felt like a mix of “wow, that’s inspiring” and “hmm, that’s a reality check.” Bill Gates is putting big money into healthcare AI, SoftBank is quietly building the backbone of the entire AI world, and Wall Street reminded everyone that hype doesn’t pay the bills.

Here’s what caught our eye this week…

📰 This Week in AI

1. Bill Gates bets $1M on AI

Source: Gates Foundation / MIT Tech Review
Bill Gates launched the Alzheimer’s Insights AI Prize — a $1M competition for breakthroughs in dementia research. Winning models will be shared freely worldwide.

💡 Why it matters: AI isn’t just about chatbots or productivity tools — it’s being aimed at one of the hardest healthcare challenges.

👉 Your move: Keep an eye on healthcare AI — it’s one of the most promising spaces for real-world impact.

2. SoftBank’s billion-dollar AI play

Source: Financial Times
Masayoshi Son is betting big on infrastructure — backing a $500B data centre project (Stargate) and investing in chip companies.

💡 Why it matters: Fancy AI models are nothing without power grids, GPUs, and data centres. Whoever controls the “roads and highways” of AI controls the future.

👉 Your move: Don’t just watch the apps — follow the companies building the backbone (chips, cloud, energy). That’s where the long-term power lies.

3. AI stocks take a tumble

Source: Wall Street Journal
Nvidia’s stock dropped 5%. Palantir slid 16%. Investors are suddenly worried about “AI overvaluation.”

💡 Why it matters: Hype cycles rise and fall. Not every company with “AI” in its name is guaranteed gold.

👉 Your move: Stay sceptical — separate the companies doing real AI work from those riding the buzzword wave.

🔍 Demystifying AI – The Infrastructure You Don’t See

All the flashy models (ChatGPT, Gemini, Claude) rely on what you don’t see: data centres, GPUs, and power grids. Without that? Nada.

Think of it like lightbulbs and electricity — the bulb is cool, but the grid makes it possible. Whoever builds and controls that grid will decide how far (and how fast) AI goes.

🛠 Tool of the Week – Perplexity AI

Like ChatGPT, but every answer comes with sources.
Great for research when you want to trust what you’re reading.
👉 Try it: perplexity.ai

💬 Prompt of the Week

Drop this into ChatGPT, Claude, or Copilot when you’re planning your sprint:

“You are an agile project assistant. Create a 2-week sprint plan for a software development team of 5. Include backlog items, sprint goals, and a progress tracker in table format.”

👉 Works like magic when you need structure fast.
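
Prefer to script it rather than paste it in by hand? Here’s a minimal sketch using the OpenAI Python client — it assumes the `openai` package is installed and an API key is set in your environment, and the model name is just an example:

```python
# A minimal sketch, not a full tool: assumes the `openai` package is installed
# and OPENAI_API_KEY is set in your environment. The model name is illustrative.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

prompt = (
    "You are an agile project assistant. Create a 2-week sprint plan for a "
    "software development team of 5. Include backlog items, sprint goals, "
    "and a progress tracker in table format."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # swap in whichever chat model you have access to
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```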

⚡ Quick Hits

  • 📰 AI plagiarism crisis — Wired & Business Insider pulled articles after finding they were likely AI-written. (Guardian)

  • 🇨🇳 China restricts Nvidia AI chip sales amid U.S. tensions. (FT)

  • 💼 AI job fears — 71% of Americans worry about replacement, 77% about misinformation. (Reuters/Ipsos)

  • 💾 Intel gets $2B SoftBank boost — more proof infrastructure is the new battleground. (Reuters)

  • 📊 MIT study: 95% of corporate AI projects fail because of poor integration, not bad models. (MIT/Reuters)

🧩 Puzzle Corner

Last week’s answer: Data (structured like a table, unstructured like a tweet).

This week’s puzzle:
“I power your AI but you can’t see me. I’m stored in chips, not clouds. What am I?”

(Answer in next week’s issue 👀)

🔐 Mini-Guide: Protecting Your AI Prompts & Data

Last week, we promised you a quick guide on how to keep your prompts and data safe when using AI tools — here you go 👇

AI feels like magic, but remember: your inputs are often logged or even used to train future models. That means sensitive info (like client data, business plans, or personal details) could end up in places you didn’t intend.

Here’s how to stay protected:

1. Treat prompts like emails
👉 If you wouldn’t put it in an email to a stranger, don’t paste it into ChatGPT (or any AI).

2. Use enterprise or pro accounts
👉 Paid tiers (like ChatGPT Enterprise, Claude for Teams, Microsoft Copilot) usually come with stronger privacy protections.

3. Run sensitive work locally
👉 Tools like Ollama let you run AI on your own machine — no cloud logging. (See the Ollama sketch after this list.)

4. Mask or anonymize data
👉 Swap out names, IDs, or addresses with placeholders before sharing. (See the masking sketch after this list.)

5. Read the fine print
👉 Some tools train on your data, others don’t. Always check their policy.
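
Want to try point 3? Here’s a minimal sketch that talks to a local Ollama server over its built-in API — it assumes Ollama is installed and running and that you’ve already pulled a model; the model name and prompt are just examples:

```python
# A minimal sketch of point 3: send a prompt to a local Ollama server so it
# never leaves your machine. Assumes Ollama is installed and running, and that
# you've already pulled a model (e.g. `ollama pull llama3`).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",   # Ollama's default local endpoint
    json={
        "model": "llama3",                    # any model you've pulled locally
        "prompt": "Summarise this meeting note: <paste your text here>",
        "stream": False,                      # one JSON reply instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```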
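And for point 4, a tiny Python sketch of the idea — the patterns are illustrative placeholders, not a complete scrubber, so adapt them to whatever your own data actually contains:

```python
# A minimal sketch of point 4: replace obvious identifiers with placeholders
# before pasting text into any AI tool. These patterns are illustrative, not
# exhaustive — adapt them to whatever your own data actually contains.
import re

def mask_sensitive(text: str) -> str:
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", text)   # email addresses
    text = re.sub(r"\+?\d[\d\s-]{6,}\d", "[PHONE]", text)        # phone-like numbers
    text = re.sub(r"(?i)(name:\s*)[^,\n]+", r"\1[NAME]", text)   # "Name: ..." fields
    return text

print(mask_sensitive("Name: Jane Doe, email jane.doe@acme.com, phone +44 7700 900123"))
# -> Name: [NAME], email [EMAIL], phone [PHONE]
```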

💡 Your move: Strip out sensitive details, and when it really matters, use privacy-first tools.

Till next time, stay curious

The Prompt Circuit
