To address future demand for semiconductors amid the severe chip shortages of 2020 – 2022, all leading chipmakers announced plans to build new fabs and even disclosed their estimated costs. But spiraling inflation, caused first by pandemic-driven supply chain disruptions and then by the Russian war against Ukraine, has increased the cost of new fabs for Intel and Samsung by billions of dollars, according to reports.

When Intel announced plans to establish a new manufacturing site near Magdeburg, Germany, last year, it said that its first production fab and supporting facilities would require an investment of $18.7 billion (€17 billion), and it negotiated $7.2 billion of state aid. But because of high inflation, rising material costs, and high energy prices, the company now believes the initial investment will be around $31.675 billion (€30 billion). According to a Bloomberg report last week, it would need $4.223 billion – $5.279 billion (€4 billion – €5 billion) more in state support.

Intel confirmed that it was re-negotiating the support package with the German authorities because of increased fab costs, but it did not confirm the exact sums it sought.

"Disruptions in the global economy have resulted in increased costs, from construction materials to energy," a statement by Intel reads. "We appreciate the constructive dialogue with the federal government to address the cost gap with building in other locations and make this project globally competitive."

When completed later this decade, Intel's fab in Germany will be one of the most advanced semiconductor facilities in the world. Given the timeframe for starting production, it will likely use sub-1.8nm (post-Intel 18A) fabrication processes to make chips for Intel and for customers of its Intel Foundry Services division.

Intel is not the only company to suffer from higher-than-expected fab costs. As it turns out, Samsung estimates that its initial investments in its upcoming fab near Taylor, Texas, will total over $25 billion, up more than $8 billion from initial forecasts, according to a Reuters report that cites three people with knowledge of the matter. 

While wafer fab equipment accounts for the lion's share of fab costs, and these tools are gradually getting more expensive, construction was the main reason the Taylor, Texas, fab got more expensive. Meanwhile, Samsung wants to build the fab sooner rather than later, since it expects costs to rise further.

"The higher construction cost is about 80% of the cost increase," one of Reuters's sources is reported to have said. "The materials have gotten more expensive," the source added.

Samsung expects to complete construction of its fab in Taylor, Texas, in late 2023 or early 2024. After it moves in the production tools, it will start making chips at the facility in 2024 – 2025, presumably using its 3nm- and 4nm-class process technologies.

Comments

  • mode_13h - Thursday, March 16, 2023 - link

    Oh dear. This is going to push up wafer costs even further. People still expecting GPUs to return to pre-pandemic price scales won't be happy about this.

    With more expensive wafers exerting further downward pressure on die sizes, we could see core counts start to level off and per-core generational performance improvements dropping back down into the single-digit %'s.

    The future of computing feels hot & loud, as we increasingly resort to frequency as a means of extracting every last drop of performance from ever smaller and more expensive chips.
  • ballsystemlord - Thursday, March 16, 2023 - link

    Haven't we been having "hotter and louder chips as we increasingly resort to frequency as a means of extracting every last drop of performance from ever smaller and more expensive chips" since CPUs first existed? The 8080 ran at 2 MHz.
  • ingwe - Thursday, March 16, 2023 - link

    To some extent I agree. But we also saw "good enough" performance allow for different form factors. I'm thinking of smartphones, tablets, but also hybrids (thinking of the surface that I daily drive). So while high performance desktop power has increased, it feels like power consumption has decreased (or at least efficiency has increased--thinking of race to idle).
  • mode_13h - Friday, March 17, 2023 - link

    The "race to idle" hypothesis is a fallacy. Chips & Cheese looked at the Joules consumed during completion of two workload classes, across multiple different types of cores and clock speeds.

    https://chipsandcheese.com/2022/01/28/alder-lakes-...

    The faster cores nearly always consumed more energy, except when the E-cores were at the edge of their operating window. And, except for a couple of slight bumps, all cores used more energy to complete the workload faster. The other exception was at the very bottom of the frequency range, where it seems you need to run fast enough to overcome what are probably some fixed overheads.

    I think the "race to idle" notion probably dates back to an era when CPUs weren't nearly as good at power-scaling.
  • Zoolook - Friday, March 17, 2023 - link

    It only shows it's hit-and-miss on Alder Lake, which was their first beta of the new architecture; it's been shown to be true on other architectures.
  • mode_13h - Friday, March 17, 2023 - link

    No, look more closely. It's a bit confusing because the article contains a mix of different metrics - both performance and energy. The "race to idle" question is addressed by the energy graphs, so make sure you're looking at the plots whose Y-axis is labeled in Joules.

    Now that we're looking at the right plots, you can see that not only are the Gracemont and Golden Cove cores benchmarked, but also Skylake and two different incarnations of Zen 2. In every case, higher clockspeeds consistently result in more energy consumed to complete the workload, with one notable exception.

    The notable exception is the chiplet-based Zen 2 CPU, which has a very slightly negative slope before it turns sharply upward. Because the monolithic Zen 2 APU doesn't exhibit the same behavior, this is more likely an artifact of the chiplet implementation than anything about the Zen 2 microarchitecture.

    Let's put the shoe on the other foot: how about you show us a modern CPU where "race to idle" *does* apply. I doubt you can.
  • mode_13h - Friday, March 17, 2023 - link

    > Haven't we been having "hotter and louder chips

    For about a decade, Intel's CPUs had pretty monotonic generational decreases in TDPs. Once Zen launched, Intel reversed course and each new generation has been at least as hot as the one prior.

    If dies become smaller and more expensive, that only increases the incentive to juice them harder. Even "efficiency" cores will tend to be optimized more for area-efficiency than power-efficiency.
  • Threska - Thursday, March 16, 2023 - link

    Only those on the left side of the demand curve, aka the latest and greatest. For everyone else, there's enough in the pipeline, new and used, to keep people happy.
  • mode_13h - Friday, March 17, 2023 - link

    It seems to me that *everything* will get more expensive. The cutting edge is the most affected, but the mid-range, the trailing edge, and the used market won't escape either.
  • AnnonymousCoward - Monday, March 27, 2023 - link

    Let's all ignore the multiple trillions of dollars the Democrats pumped into the economy including handing out thousands of dollars to most Americans in early 2021! The effects this would have were obvious at the time (especially if you listen to Ben Shapiro's podcast!). And here we are 2 years later, and articles just point to Russia and supply chains.
