Five Palestinians were killed, including two children, and several wounded when Israeli aircraft struck tents for displaced people west of Khan Younis, according to medics at the Kuwait Field Hospital.
The strikes hit Gaza’s coastal al-Mawasi…

Immigration groups and lawmakers are sharply criticizing Donald Trump’s latest move to halt immigration applications from 19 countries already under US travel restrictions, a decision that comes amid reports that naturalization ceremonies for…

Clint Bentley’s Train Dreams is a quiet film — a meditative frontier portrait defined as much by its silences as by its sweeping Northwest landscapes and its attention to the everyday rhythms of a working man’s life. For star Joel…

December ushers in a period of reflection in the investment world, as investors take stock of the previous year and begin to position themselves for the year to come. That is truer than ever right now, as we seem to be in a liminal period; animal spirits have quieted, but AI companies continue to put up strong results.
My prediction for 2026 is that it will be a tale of two AIs. On the one hand, it will be a year of delays, first in data center buildouts, many of which will fall behind schedule, and second, in the AGI timeline. At the same time, AI adoption will continue its relentless rise. In 2025, startups coined the idea of a “$0 to $100M” club of rapidly scaling AI companies; in 2026, we’ll begin to talk about the “$0 to $1B” club.
Entering 2026, here are the facts as I see them:
Tale 1: The Year of Delays
Two countervailing forces will collide in 2026: soaring Big Tech demand will run headfirst into a supply chain that hasn’t scaled fast enough to match it.
First, companies like TSMC and ASML have monopolistic positions and cannot be forced to ramp capacity. Ben Thompson has called this the “TSMC Brake,” pointing out in October that while TSMC had ramped revenues by 50% since 2022, it had ramped CapEx by only 10%. He explained further: “There weren’t too many answers from TSMC about this, which is understandable, given that they won’t announce next year’s CapEx numbers until next quarter. What Wei did say is that TSMC was making a point to not just talk to its customers but its customers’ customers.” My prediction, especially coming off the successful Gemini 3 launch and the hype around TPUs, is that the TSMC constraint could become material in 2026.
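To put that gap in concrete terms, here is a minimal sketch that treats the roughly 50% revenue ramp and roughly 10% CapEx ramp cited above as illustrative round numbers (not TSMC’s reported financials) and computes how CapEx intensity, meaning CapEx as a share of revenue, has shrunk:

```python
# Illustrative arithmetic only: round figures from the commentary above,
# not TSMC's reported financials.
revenue_growth = 0.50   # revenue up ~50% since 2022
capex_growth = 0.10     # CapEx up only ~10% over the same period

revenue_index = 1.0 + revenue_growth   # 1.50 (2022 = 1.0)
capex_index = 1.0 + capex_growth       # 1.10 (2022 = 1.0)

# CapEx intensity = CapEx / revenue; compare it to the 2022 baseline.
intensity_vs_2022 = capex_index / revenue_index
print(f"CapEx intensity is ~{intensity_vs_2022:.0%} of its 2022 level")
# -> roughly 73%: capacity spending has fallen by about a quarter relative
#    to demand, which is the "TSMC Brake" expressed as a number.
```

Even if absolute CapEx keeps inching upward, spending that shrinks relative to revenue is how a supply constraint quietly builds.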
Second, industrial players, which tend to be overlooked because of their fragmentation and lack of market power, may end up creating bottlenecks as data centers move into the final stages of construction. Generators and cooling units are among the most important industrial inputs to data centers, but there are dozens of such inputs, and if any one of them is delayed, timelines have to be pushed out. Labor must be factored in as well: shortages of skilled workers could become a key bottleneck for completing these immense construction projects. Many AI companies share a supply base, and these industrial suppliers face their own CapEx decisions (how many new factories to build). We’ll find out in 2026 whether they’ve added enough output capacity of their own.
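The reason a fragmented supplier base matters is that a project’s readiness is gated by its single slowest input rather than the average one. Below is a minimal critical-path sketch; the component names and delay figures are invented purely for illustration:

```python
# Hypothetical delays (in weeks) for a handful of a project's many industrial inputs.
# Names and numbers are invented for illustration only.
input_delays_weeks = {
    "generators": 0,
    "cooling_units": 8,
    "switchgear": 2,
    "transformers": 12,
    "skilled_labor": 4,
}

# The facility can't come online until every input is in place, so the
# overall slip equals the worst single delay, not the average delay.
overall_slip = max(input_delays_weeks.values())
bottleneck = max(input_delays_weeks, key=input_delays_weeks.get)
print(f"Project slips {overall_slip} weeks, gated by {bottleneck}")
```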
The average AI data center takes roughly two years to build. So if 2024 was the year of new project announcements, and 2025 was the year when construction investment started to hit GDP, then 2026 will either be the year when much of this new capacity comes online (leading to further declines in the cost of compute) or the year when many of these construction projects begin to face delays. We have already seen a few such delays publicly reported in Q4 2025. If hyperscalers begin to warehouse their new AI chips rather than installing them directly into data centers, that will be a telltale sign that the era of delays has begun.
The other way in which 2026 will be the “Year of Delays” has to do with the AGI timeline. For a long time, Silicon Valley luminaries were forecasting the imminent emergence of AGI, with “AGI in 2027” thrown around frequently in conversation. Since June of this year, there has been a progressive walk-back of this timeline. Dwarkesh Patel’s recent podcast interviews with Richard Sutton, Andrej Karpathy, and Ilya Sutskever mark a demarcation line; the new consensus is that the AGI window opens in the 2030s at the earliest. In the coming year, I expect this “update” to filter out beyond Silicon Valley. There are implications across many areas; the most notable risk is that the hyperscaler CapEx being committed today ends up outdated.
Tale 2: The Relentless Drive Toward AI Adoption
The area where I do not expect to see any delays is in AI adoption itself. The fading of hype will have little impact on fundamentals. If anything, the best startups are growing faster than ever from $0 to $100M in revenue. In 2026, we’re going to see the emergence of a $0 to $1B club. The trend of the last three years—and likely for many more—is that startups are laying the foundation for the future economy, one building block at a time. There are many excellent entrepreneurs exploring new niches, and a lot of latent value has yet to be unlocked.
The best AI startups are moving with extreme efficiency—many are earning north of $1M in revenue per employee. That implies market pull rather than a push sale: customers are coming to these products without the weight of a large go-to-market organization. Today’s entrepreneurs are building “self-improving” companies—they are themselves using AI agents for functions like legal, recruiting, and sales—creating an ecosystem flywheel effect. AI app companies are also riding a compute cost curve that should drive incremental margin improvement, especially as new data centers come online between now and 2030. Finally, with enterprises facing adoption fatigue on DIY implementations, startups are gaining even more momentum.
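As a rough illustration of the cost-curve point, the sketch below models a hypothetical AI app whose price per million tokens stays flat while its compute cost per million tokens falls each year as new capacity comes online; every figure here (the price, the starting cost, the 30% annual decline) is an assumption made up for illustration, not data about any real company:

```python
# Hypothetical example: margin expansion from a declining compute cost curve.
# All figures are illustrative assumptions, not real company data.
price_per_m_tokens = 10.00   # what the app charges per million tokens, held flat
cost_per_m_tokens = 6.00     # compute cost per million tokens in 2026
annual_cost_decline = 0.30   # assumed yearly drop in compute costs

for year in range(2026, 2031):
    gross_margin = (price_per_m_tokens - cost_per_m_tokens) / price_per_m_tokens
    print(f"{year}: cost ${cost_per_m_tokens:.2f}/M tokens -> gross margin {gross_margin:.0%}")
    cost_per_m_tokens *= (1 - annual_cost_decline)
# Even with flat pricing, margins widen every year the cost curve keeps falling.
```

The direction of the effect, widening margins without any pricing action, is the point here; the specific numbers are not.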
For some, AI adoption is happening too slowly. Those expecting a rapid AI takeoff would prefer to see a deus ex machina moment carry us straight to the finish line. I think that dream is likely to disappoint. Instead, the next leg of the AI story will require hard work, creative brilliance, and endurance to reach a new threshold where AI radically transforms the economy. We need only look at the green shoots—founder motivation, aggressiveness, hunger to win, customer obsession—to see that this future is coming.

Israel said it launched an airstrike on a Hamas militant in southern Gaza late Wednesday in retaliation for an attack earlier in the day that wounded five Israeli soldiers.
The strike was the latest test for a fragile ceasefire that has mostly…

Don’t miss the final full moon of 2025, when the “Cold Moon” takes to the autumn sky at sunset on Dec. 5. Here’s how to watch the lunar spectacle unfold online from the comfort of your home thanks to some handy livestreams.
This month’s full moon…
JEDDAH, Saudi Arabia, Dec. 3 (Xinhua) — The “China Film Night in Jeddah” was held on Wednesday evening in Jeddah, Saudi Arabia, featuring a screening of the Chinese film The Lychee Road.
Co-hosted by the China Film Administration and the…

Google shares 10 of its favorite Chrome extensions from 2025.

When iOS 26 first dropped, there was one pretty huge group chat bug.
Some users complained that their messages to existing group chats were sent as individual chats instead of chats…

Nvidia on Wednesday said that its latest AI server, which packs 72 of its leading chips into a single computer with speedy links between them, improved the performance of Moonshot’s Kimi K2 Thinking model by 10 times compared to the previous generation of Nvidia servers, a similar performance gain to what Nvidia has seen with DeepSeek’s models.
Nvidia said the gains primarily came from the sheer number of chips it can pack into servers and the fast links between them, an area where Nvidia still has advantages over its rivals.
Reporting by Stephen Nellis in San Francisco, editing by Deepa Babington