Operations | Monitoring | ITSM | DevOps | Cloud

Top Tips: When leaders leave, here's how to keep your IT systems stable

Top Tips is a weekly column where we look at what’s shaping the tech world and share practical ways teams can stay prepared for what’s next. This week, we’re focusing on a situation many teams underestimate—what happens to your IT systems when a key leader steps away, and how you can build stability that doesn’t rely on any one person. Some problems don’t show up when things are running smoothly. They show up when someone leaves.

Run Local LLMs on Mac to Cut Claude Costs

Part of the motivation for this post is how cloud API economics are shifting: Anthropic is moving large enterprise customers toward per-token, usage-based billing (unbundled from flat seat fees), which makes “always call the API” a moving cost line for teams at scale. A hybrid or local layer is one way to keep spend bounded while you still use premium models where they matter.
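As a rough, vendor-neutral illustration of that hybrid layer: a router can estimate a request's size and keep routine prompts on a local model, reserving the metered cloud API for harder cases while enforcing a hard monthly cap. The 4-characters-per-token heuristic and the token threshold below are assumptions for the sketch, not tuned values:

```python
# Minimal sketch of a hybrid local/cloud LLM router.
# Assumptions: ~4 characters per token (crude heuristic) and a
# hypothetical complexity threshold; real routing would be tuned
# against actual model pricing and task difficulty.

def estimate_tokens(prompt: str) -> int:
    """Crude token estimate: roughly 4 characters per token."""
    return max(1, len(prompt) // 4)

def route(prompt: str, monthly_spent: float, monthly_budget: float,
          token_threshold: int = 2000) -> str:
    """Return 'local' or 'cloud' for a given prompt.

    Short prompts stay on-device; long ones go to the premium cloud
    model unless the monthly budget is already exhausted.
    """
    if monthly_spent >= monthly_budget:
        return "local"  # hard cap: never exceed the budget
    if estimate_tokens(prompt) < token_threshold:
        return "local"  # routine work stays on-device
    return "cloud"      # reserve the metered API for hard cases
```

The hard-cap branch is what makes the cost line bounded: once the budget is spent, everything degrades to local rather than silently accruing per-token charges.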

Collective IQ DEX: complete visibility into the employee digital experience

Collective IQ DEX (CIQ DEX) provides a unified view of the employee experience across devices, locations, applications, collaboration tools, and even user sentiment. The platform pairs intelligent IT asset analysis with employee sentiment data, and offers drill-down to the level of each individual user and endpoint.

Human First, AI Second: Cycle's Approach to AI Coding in 2026

It is easier than ever to launch a product from scratch. Today, AI can make your team of two feel like a team of ten almost overnight. Enterprises across the tech industry are completely restructuring engineering teams to double down on AI coding, often rewarding engineers based on the sheer volume of code they push. The AI revolution is incredible. So you would be crazy not to hop on the vibe coding train, right? Well, it depends on what exactly you are building.

Voices You Can't Trust: Securing K-12 Communications Against AI Deepfake Threats

It starts with a voice you recognize. A call from the superintendent asking for an urgent update. A voicemail from a principal requesting sensitive student information. A message that sounds authentic, because it is, at least on the surface. The tone, cadence, and even the subtle inflections are exactly right. But the request isn’t. AI-powered deepfakes are rapidly reshaping the threat landscape for K–12 schools, turning trusted communication channels into potential points of vulnerability.

Cloudsmith raises $72M Series C to secure the AI software supply chain

Cloudsmith raised $72 million in Series C funding, led by TCV and Insight Partners, to build the operating system for the modern software supply chain. AI agents are writing code faster than teams can secure it. That shifts the risk calculus because more software, built faster, means more attack surface. Artifact management is the control point between every software producer and consumer, and it's where Cloudsmith sits.

Rootly's Dan Sadler: why AI coding tools are driving more incidents + why reliability is the product

Cortex co-founder and CTO Ganesh Datta sits down with Dan Sadler, VP of Engineering at Rootly. Dan explains how Rootly treats reliability as a product feature rather than just a technical metric, and why culture might be the most impactful element of building reliable systems.

Beyond Uptime: Building a Self-Healing OpenClaw Observability Stack

The allure of OpenClaw is undeniable. You deploy a highly autonomous, self-hosted AI agent, give it access to your repositories and inboxes, and watch it reason through complex workflows while you sleep. It is the dream of the ultimate 10x developer tool realized. But as any veteran DevOps engineer will tell you: running an LLM-backed Node.js agent in production is vastly different from testing it on your local machine.
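The "self-healing" part of such a stack typically reduces to a watchdog loop: probe the agent's health endpoint on an interval, and restart the process after repeated consecutive failures. Here is a minimal, generic sketch; the probe and restart hooks are injected as callables, since the real OpenClaw endpoints and process manager are not specified here:

```python
import time
from typing import Callable, Optional

def watchdog(probe: Callable[[], bool],
             restart: Callable[[], None],
             max_failures: int = 3,
             interval_s: float = 5.0,
             max_cycles: Optional[int] = None) -> int:
    """Probe a service; restart it after `max_failures` consecutive failures.

    Returns the number of restarts performed. `max_cycles` bounds the
    loop (useful for testing); a real deployment would run indefinitely.
    """
    failures = 0
    restarts = 0
    cycles = 0
    while max_cycles is None or cycles < max_cycles:
        cycles += 1
        if probe():
            failures = 0            # healthy: reset the failure streak
        else:
            failures += 1
            if failures >= max_failures:
                restart()           # e.g. shell out to your process manager
                restarts += 1
                failures = 0
        time.sleep(interval_s)
    return restarts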

Why Mandating AI Tools Backfires on Engineering Teams

Responsible AI adoption for engineering teams starts with culture, not compliance. In this GitKon talk, Rizel Scarlett (Tech Lead of Open Source DevRel at Block) shares how Block helped thousands of engineers actually want to use AI tools, including Goose, Cursor, Claude Code, and more, without mandates, vibe coding disasters, or security gaps.