Abstract geometric illustration representing AI-driven job interviews

Amazon's AI Will Interview You Now: Connect Talent and the 'Humorphism' Gamble

Your next job interviewer might not have a pulse. On April 28, Amazon unveiled Connect Talent — an agentic AI system that finds, screens, interviews, and evaluates job candidates around the clock. No human involvement until the final hiring decision. It’s not a chatbot on a careers page. It’s a full-stack autonomous hiring agent, and Amazon is selling it to every enterprise that hires at scale. They even coined a philosophy to soften the blow: humorphism. ...

April 29, 2026 · 4 min · DBBS Tech
Abstract geometric illustration of breaking chains between cloud platforms

OpenAI Breaks Free from Microsoft and Lands on AWS Overnight

On Sunday, OpenAI and Microsoft announced the end of their exclusive partnership. By Monday, OpenAI’s models were live on Amazon Web Services. The speed tells you everything about how long this was in the making. This isn’t a cloud provider swap. It’s OpenAI declaring independence — while juggling missed revenue targets, an $852 billion valuation, a looming IPO, and a trial against Elon Musk, all at once. What actually changed: on April 27, OpenAI and Microsoft jointly announced a restructured deal. ...

April 29, 2026 · 5 min · DBBS Tech
Amazon's dual AI investment strategy with Anthropic and OpenAI

Amazon Just Bet $25 Billion on Anthropic — While Already Backing OpenAI

There’s a saying in venture capital: if you can’t pick the winner, fund the race. Amazon just applied that maxim with a $75 billion budget. Amazon announced it will invest up to $25 billion in Anthropic — the company behind Claude — as part of a sprawling deal committing Anthropic to spending over $100 billion on AWS over the next decade. This comes just two months after Amazon dropped $50 billion on OpenAI’s record-breaking funding round. ...

April 21, 2026 · 5 min · DBBS Tech
Abstract visualization of disaggregated AI inference architecture

AWS and Cerebras Are Ripping AI Inference Apart — On Purpose

The biggest bottleneck in AI isn’t training anymore. It’s inference — the moment a model actually does something useful. And AWS just partnered with Cerebras Systems to attack that bottleneck with an approach nobody has tried at this scale. The deal: Cerebras’ massive wafer-scale CS-3 chips will sit inside AWS data centers, accessible through Amazon Bedrock. The promise: 5x faster inference. The method: tearing the inference pipeline in half. Traditional AI inference runs both stages on the same GPU. You send a prompt, the chip processes it (prefill), then generates a response token by token (decode). One chip, both jobs. ...
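The prefill/decode split the teaser describes can be sketched in a few lines. This is a toy illustration of the idea only — the function names and the KV-cache stand-in are hypothetical, not the AWS/Cerebras or Bedrock API. The point is that the two stages have a clean hand-off (the attention cache), which is what makes it possible to run them on different hardware.

```python
# Toy sketch of disaggregated inference: prefill and decode as separate
# stages joined only by a handed-off cache. All names are illustrative.

def prefill(prompt_tokens):
    """Process the entire prompt in one pass and build the KV cache.
    This stage is compute-bound, so in a disaggregated setup it can run
    on hardware optimized for large parallel matrix work."""
    # Stand-in for per-token attention state (keys/values).
    return [("kv", t) for t in prompt_tokens]

def decode(kv_cache, max_new_tokens):
    """Generate output one token at a time, reusing the handed-off cache.
    This stage is memory-bandwidth-bound: every step reads the whole
    cache, then appends one new entry to it."""
    output = []
    for step in range(max_new_tokens):
        token = f"tok{step}"           # stand-in for sampling a real token
        kv_cache.append(("kv", token)) # cache grows by one entry per token
        output.append(token)
    return output

cache = prefill(["The", "biggest", "bottleneck"])
print(decode(cache, 3))  # in a disaggregated design, prefill and decode
                         # could run on entirely different machines
```

Because the only thing decode needs from prefill is the cache, the two loops never have to share a chip — which is the architectural bet the article attributes to the AWS/Cerebras partnership.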

March 22, 2026 · 4 min · DBBS Tech