TLDR Trends: November 2025
In this episode, we unpack three major shifts shaping tech and marketing right now — all powered by first-party engagement data from TLDR’s millions of readers. Key trends: the tech job market, AGI timelines, and AI’s impact on software engineering.
Key trends
- The tech job market: Big Tech is generating more revenue with fewer employees. As AI accelerates efficiency, marketers need to rethink how they position automation tools — not as job replacements, but as task enablers.
- AGI timelines: Andrej Karpathy calls this the decade, not year, of AI agents. Silicon Valley is recalibrating expectations around AGI timelines, signalling that marketers should move beyond “AI agent” hype cycles to more grounded messaging.
- AI’s impact on software engineering: Developer tools are leading the AI adoption curve. From GitHub Copilot’s massive user base to “vibe coding” trends, engineers are learning to manage parallel AI agents and reimagine the software stack itself.
- Chris and Dan also share which ad creatives are performing best in TLDR right now — hint: the winners pitch specific use cases, not vague “AI does everything” promises.
Transcript
Dan Ni (00:00)
The number one mistake we see marketers make here is that they pitch AI tooling as a direct replacement for someone’s job.
Chris Chan (00:06)
Welcome to the first episode of TLDR Trends. In this monthly series, we answer two questions: What are TLDR readers actually engaging with, and why should marketers care? My name is Chris, I lead marketing at TLDR, and I’m with TLDR’s founder, Dan. Dan, welcome to TLDR Trends.
Dan Ni (00:25)
Excited to be here.
Chris Chan (00:27)
Let’s get right into it. We have three big topics today. Let’s start with the tech job market. Why has this been getting so much attention lately?
Dan Ni (00:34)
Yeah, so one of the most popular articles from the past month in TLDR was called The Great Decoupling of Labor and Capital. What this article talks about is how big tech companies today are so much more efficient than big tech companies of the past.
HP was the first big tech company to get to $100 billion in annual revenue, and it took them 172,000 employees to get there. IBM was next, and it took them 400,000 employees. Now look at this chart for Alphabet: Alphabet got to $100 billion in annual revenue with just 76,000 employees. The next $100 billion took 64,000 additional employees, the next $100 billion took 42,000 additional employees, and the most recent $100 billion took only 11,000 incremental employees.
You can see charts like this for all the other big tech companies. Today’s big tech companies are just so much more efficient than those in the past.
Relatedly, Amazon announced the biggest corporate layoffs in company history, laying off 30,000 corporate workers. Microsoft laid off 15,000 corporate workers earlier this year, and Meta and Google did layoffs too. There are a few reasons for these layoffs. One is that all these companies over-hired during COVID. As you can see from this chart, Amazon doubled its headcount in just two years during COVID. In 2019, they had about 750,000 employees; the number peaked in 2021 at 1.6 million and has decreased slightly since then.
Their CEO, Andy Jassy, said they need to remove layers of management to move faster and operate like the world’s biggest startup.
The other elephant in the room is obviously AI. CEOs don’t love saying AI is the root cause, but there are a lot of signs that AI is already impacting employment. Google and Microsoft have both said that AI is now writing over 30% of the code at their companies. Shopify’s CEO wrote a famous leaked memo this past April basically saying you have to justify why AI can’t do a job before you’re allowed to hire a human.
When you talk to people in the Bay Area, pretty much everyone will tell you their teams are being told to use AI and run much leaner.
Chris Chan (02:55)
So what’s the biggest takeaway here for marketers?
Dan Ni (02:58)
The number one mistake we see marketers make is pitching AI tooling as a direct replacement for someone’s job.
You see this everywhere: “This is your AI SDR,” “This is your AI marketer,” etc. The problem is, if you pitch your tool as replacing someone’s job, that person will fight you—and since the tooling isn’t fully there yet, they’ll probably win.
Where we’ve seen marketers succeed is in pitching AI tools as automating tasks, not jobs. This not only more accurately reflects what the tools can do, but it also frames them as a benefit, not a threat. Everyone has boring, time-consuming parts of their job. If an AI agent can do your data entry, organize your emails, or write a draft, who would say no to that? That’s the framing marketers should use.
Chris Chan (03:57)
Let’s move on to the next topic: AGI timelines. Dan, why don’t we start with what AGI is?
Dan Ni (04:05)
AGI stands for Artificial General Intelligence. Today’s AI is much more general than the AI of the ’90s, when systems were built just to play chess. Now AI can write emails, do calculations, etc. Only recently have models famously been able to count the number of Rs in the word “strawberry.” We still have a ways to go before we reach true AGI, where an AI system can do everything at a human level or above.
This is a huge topic in Silicon Valley because there’s a lot of debate around how long it will take to reach AGI. One of the most popular stories in TLDR this past month was a tweet thread by Andrej Karpathy.
Karpathy, one of the co-founders of OpenAI and former head of AI at Tesla, is a universally respected AI researcher. He basically said he thinks AGI is about 10 years out. Instead of this being “the year of agents,” he said this is actually “the decade of agents,” meaning the rollout of AI will go much slower than people think.
This struck a chord because, as crazy as it sounds, 10 years is actually bearish compared to what many in frontier labs like Anthropic believe. Anthropic’s CEO publicly predicted AGI by early 2027.
If you look at this chart—one a lot of people reference—it tracks how long an LLM can work autonomously on a software engineering task while being correct 50% of the time. When GPT-4 came out, it could do tasks that were 2–5 minutes long with 50% success. Fast-forward to GPT-5, Claude Sonnet 4.5, Grok 4—this new generation can work for hours autonomously. The curve has been rising rapidly.
What’s interesting about Silicon Valley culture is that people tend to extrapolate: “If the line is going up, of course it will keep going up.” The debate on AGI timelines really centers on whether AI itself will accelerate AI research. If researchers use AI tools to build AI, will the curve slope upward even faster, or continue linearly?
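The extrapolation Dan describes can be sketched as a toy calculation. This is a hedged, illustrative example — the starting value and doubling period below are assumptions for illustration, not figures from the episode. If the autonomous-task-length curve keeps doubling at a fixed rate (a straight line on a log scale), the numbers compound quickly:

```python
def extrapolate_task_minutes(current_minutes: float,
                             doubling_months: float,
                             horizon_months: float) -> float:
    """Naive log-linear extrapolation: assume autonomous task length
    keeps doubling every `doubling_months` months."""
    return current_minutes * 2 ** (horizon_months / doubling_months)

# Illustrative numbers only: a model that can work ~2 hours autonomously
# today, doubling every 7 months, extrapolates to roughly 70 hours
# three years out.
minutes = extrapolate_task_minutes(current_minutes=120,
                                   doubling_months=7,
                                   horizon_months=36)
print(f"{minutes / 60:.0f} hours")
```

Whether the real curve keeps following that line is exactly the debate: if AI accelerates AI research, it bends upward faster; if not, it may flatten.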
Chris Chan (07:29)
Why should marketers care about the AGI timeline?Dan Ni (07:32)
If it’s truly the decade of agents, not the year of agents, then we may have passed peak hype for AI agents. People today have a more sober view of what’s actually possible. In the next few months, we may see “agent fatigue,” where taglines like “your AI sales agent” or “your AI design agent” won’t perform as well as they did six months ago.
The underlying idea won’t go away, but in 3–5 years we might be using a different buzzword for an AI that works with minimal supervision.
Chris Chan (08:22)
Let’s move on to our last topic. What can you tell me about AI in software engineering?
Dan Ni (08:28)
One place where AI tools are really working is software engineering. Coding is a task where AI can easily check its own answers by running the code. So we can train AI to be great at coding much faster than at tasks like writing poetry, where it can’t easily evaluate itself.
Because these tools work so well, developers have adopted them en masse. Over 20 million people have tried GitHub Copilot alone. Many of the fastest-growing AI companies—Cursor, Claude Code—are developer tools. Developers are writing about how their workflows and day-to-day are changing because of AI tools.
One popular article was about running parallel AI agents. In the GPT-3 era, engineers used AI as an advanced autocomplete. In today’s era, models can work autonomously on tasks for up to half an hour. Engineers are becoming more like engineering managers: instead of writing code linearly, they give requirements to multiple AI agents working in parallel, and then review their work—almost like code reviews for AI.
Another interesting article was Designing APIs for Vibe Coding. Vibe Coding is where you let AI run as autonomously as possible with minimal supervision. Many coding tools have an “accept all changes” setting, and you trust the AI to do its thing. But AI hallucinations are a real problem—it will write code for API endpoints that don’t exist.
The article argues that if you want APIs optimized for vibe coding, you should design them the way AI tools expect them to work. Conform as much as possible to similar APIs, since LLMs are trained on them. As a check, you should even ask multiple LLMs—ChatGPT, Gemini, Claude—how they’d design your API before you build it.
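The “conform to what LLMs expect” advice can be made mechanical. Here is a minimal sketch under an assumed heuristic — the regex and the route names are hypothetical, not from the article: flag routes that deviate from the plural-noun, `/resource/{id}` REST shape LLMs see most often in their training data.

```python
import re

# Common REST shape in public APIs (and thus in LLM training data):
# a plural resource noun, optionally followed by one path parameter.
CONVENTIONAL = re.compile(r"^/[a-z_]+s(/\{[a-z_]+\})?$")

def flag_unconventional(routes):
    """Return routes an LLM is more likely to misremember or hallucinate."""
    return [r for r in routes if not CONVENTIONAL.match(r)]

routes = [
    "/users",           # conventional collection
    "/users/{id}",      # conventional single item
    "/getUserList",     # RPC-style naming an LLM may not guess
    "/usr/fetch_one",   # abbreviated and idiosyncratic
]
print(flag_unconventional(routes))  # -> ['/getUserList', '/usr/fetch_one']
```

A real check would be richer than one regex, but the principle matches the article’s advice: the closer your API looks to the APIs already in the training data, the less the model hallucinates endpoints.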
Chris Chan (11:37)
What tips do you have for marketers who are marketing a dev tool?
Dan Ni (11:41)
In the developer world, every part of the stack is being rethought because developers are changing their architectures and workflows. AI is also making more of the decisions as developers delegate to agents. Tools that are friendly and familiar to LLMs are seeing huge adoption.
Look at this chart of Supabase’s journey to 3 million developers. You can see the inflection point where ChatGPT started recommending Supabase as a default option. Tools selected “by default” by LLMs are going vertical because they’re automatically added to millions of projects.
Chris Chan (12:31)
Before we wrap up, let me ask you a couple of questions advertisers care about. What trends are you seeing with the most successful ads?
Dan Ni (12:40)
Looking at the most successful ads in TLDR this past month, the marketers pitching AI tools with very explicit use cases are getting the best results. A common user experience is a landing page that says the tool can do anything. But the advertisers seeing the most success are prescriptive about the primary use cases and guide users to where the value is.
Chris Chan (13:11)
Outside of AI, what other topics are TLDR readers engaging with?
Dan Ni (13:15)
Great question. We’re seeing a lot of engagement with real-world crypto use cases—prediction markets and stablecoins. Prediction markets like Polymarket and Kalshi are going vertical, driven by election betting. The New York Stock Exchange’s parent company recently signed on to invest up to $2 billion into Polymarket, making its founder the youngest self-made billionaire in the world.
Stablecoins are being adopted by financial institutions—JP Morgan, PayPal, Bank of America, Stripe—because of how quickly and cheaply they move money. Tether, the biggest stablecoin by market cap, is rumored to be raising capital at a half-trillion-dollar valuation. We’ll see more intersection between crypto and traditional finance soon.
Chris Chan (14:14)
Thanks for joining us, Dan, and thanks to everyone watching.
If you’re interested in advertising with us, reach out to partnerships@tldr.tech. Or if you want to ask Dan a question, send it to trends@tldr.tech. Thanks again.