
The recent wave of American tariffs on allies and adversaries alike has fractured the global economy to a degree we haven’t seen in decades. While the tariffs have so far targeted only goods, for multinational companies this round of trade brinkmanship reflects a broader, sustained shift toward greater complexity and volatility in their global operations.
This splintering has also prompted the emergence of new geographic boundaries in the digital economy. The result is that just as technology has assumed a central role, touching nearly every aspect of a company’s operations, geopolitical tensions are limiting which technologies international firms can use and how they can use them, threatening to wipe out the gains from increasing technological integration.
This is a dramatic U-turn from the longstanding prevailing corporate belief of a continued expansion of the transnational digital economy. That belief drove multinationals to centralize their tech stacks as they banked on their continued ability, for example, to dispatch top tech staff across the globe as they saw fit, and to transmit data freely across borders to benefit from their scale.
This new market environment, however, has upended that assumption and elevated IT resilience from what was once an operational concern to a strategic imperative. Simply keeping up with technological change to avoid ending up as “roadkill on the information highway” is not enough anymore. Today, companies must compete and innovate while also tiptoeing around the widening geopolitical cracks of the global economy.
Navigating this new, fractured global economy requires a different approach for multinational players, one that starts with striking a strategic balance between operational resilience and flexibility on one hand and operational efficiency on the other. Building resilience and proactively mitigating risk through “just-in-case” operations can be costly, but centralized tech stacks and operations are now an important driver of enterprise risk.
To strike that balance, multinationals should assess their exposure to geopolitical disruption by focusing on two core variables: the regional footprint of their tech operations and the layers of the tech stack that they are most reliant upon. To illustrate this balance, we analyzed the impact of geopolitical fragmentation on the GenAI tech stack of a European automotive supplier operating globally, including in China, the U.S., and the European Union.
Dissecting your tech stack
The first step executives must take to prepare for a geopolitically and economically divided world is to conduct a company tech stack audit to assess the exposure to disruption at each layer: hardware, data, and software (models) used.
Volatility will impact companies and their tech stack differently, and starting with an overarching assessment enables a company to develop a flexible strategy tailored to its own specific operational, geographic, and industry needs. New data localization regulation prohibiting the transfer of certain types of data outside of a country, for instance, will have a relatively small potential impact on a hypothetical construction company, even one that works in many countries, compared to its impact on a global consumer goods business, like Shein or Zara, that relies on scaling its data and analytics across markets to be successful. Identifying one’s true exposure requires looking independently at each of the layers of the tech stack before considering how they interact.
Hardware and cloud platforms
Any audit should begin by examining hardware access, the foundation of the tech stack.
Microchips, as we’ve witnessed in recent U.S.-China trade relations, are particularly exposed to geopolitical tension because of their centrality to success in the digital economy. Restrictions affecting access to computing power can significantly impact companies that rely on GenAI models as part of their operations, especially when those models require localized data processing.
For our automotive supplier, which does not rely heavily on GenAI tools, restrictions on where it can access a specific type of chip should not be too disruptive. The impact is far greater for companies, like banks or online retailers, that depend on the compute those chips provide to run models at scale in response to customer queries.
When data localization laws force companies to run models inside specific countries, the availability of domestic cloud and hardware infrastructure may become a binding constraint. The same is true for companies that need to keep their models’ processing time to a minimum. To maintain low latency, they need to limit the distance between where model inference requests are made and where the inference itself is run. Even if not legally bound to run models in Africa, for example, a multinational that aims to limit latency for customer-facing, GenAI-powered interactions would want to be able to access computing power nearby. But local options may be limited, due to caps on African imports of cutting-edge chips imposed by the U.S. AI Diffusion Framework.
Most companies are heavily reliant on just a few suppliers for computing power. What happens when that power is walled off by geography? The best mitigation strategy is to proactively build a portfolio of cloud platforms and on-premises computing capacity, as well as to stockpile additional chips or keep excess computing capacity in reserve. In cases where that’s not feasible, firms should prepare to rapidly tailor their operations and products to comply with evolving regional constraints.
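A portfolio approach to compute can be illustrated with a minimal sketch. The provider names, regions, and availability flags below are hypothetical stand-ins; in practice the availability check would query real cloud and on-premises inventory.

```python
from dataclasses import dataclass

@dataclass
class ComputeOption:
    provider: str      # hypothetical name for a cloud platform or on-prem cluster
    region: str        # market the capacity can legally and practically serve
    available: bool    # whether capacity is currently reachable

def pick_compute(options: list[ComputeOption], region: str) -> ComputeOption:
    """Return the first usable in-region option, else any usable fallback."""
    in_region = [o for o in options if o.region == region and o.available]
    if in_region:
        return in_region[0]
    fallback = [o for o in options if o.available]
    if not fallback:
        raise RuntimeError("no compute capacity available in the portfolio")
    return fallback[0]

# Illustrative portfolio: one regional option is walled off, so the
# router falls back to the on-premises capacity kept in reserve.
portfolio = [
    ComputeOption("cloud-a", "EU", available=True),
    ComputeOption("cloud-b", "CN", available=False),
    ComputeOption("on-prem", "CN", available=True),
]
choice_cn = pick_compute(portfolio, "CN")
```

The point of the sketch is the fallback logic: a single-supplier setup has no second branch to take when access is restricted.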
Data and data platforms
Multinational boards and executives must also consider their data flows. To stay competitive, multinationals need to learn to effectively use their data at scale to fine-tune models and their applications. The scaling of data most likely involves cross-border flows, a process that is complicated by the fact that some 80% of global data is subject to transnational mobility restrictions.
But not all data is treated equally, leaving certain sectors more vulnerable to data geofencing than others. Manufacturing data, for example, is often less regulated than personal data. For our automotive supplier, manufacturing data stored locally near its production sites is likely more important to optimizing its operations and production than customer data. As a result, data flow restrictions may not be particularly problematic.
The financial sector, on the other hand, is one of the most regulated industries due to the highly confidential nature of the (often personal) data it collects. In 2021, American Express and Mastercard were barred from onboarding new customers in India due to non-compliance with the country’s payment data storage rules. The companies were eventually allowed to resume operations after fulfilling compliance requirements, including localizing some operations in India and hiring Indian nationals. These obligations, however, had a cost: eroded synergies. Retailers and social media companies, like Amazon and Meta, that rely heavily on the scaling and use of personal data will need to be particularly alert to the potential for increased regulatory pressure.
Multinationals of all stripes need to evaluate just how important the free flow and centralization of data is to their competitive advantage. To start this process, companies will first need to properly label and track their own data so they can respond quickly to regulatory changes—a move that might seem obvious, but is easier said than done.
More importantly, firms should seriously explore ways to transform their data so as to remain compliant yet able to extract the value of cross-border aggregation and analytics. As we’ve argued before, this is often possible because while regulation typically targets raw data, companies can get much of the benefit of data centralization through synthetic data, model features, and embeddings—all forms of data transformation that protect confidentiality but can still hold the requisite insights.
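One simple form of such transformation is exporting derived model features instead of raw records. The sketch below is illustrative only: the records, field names, and feature choices are hypothetical, and hashing an identifier is pseudonymization rather than full anonymization, so real compliance would need legal review per jurisdiction.

```python
import hashlib

# Hypothetical raw records a local subsidiary may not export:
# each row carries personal identifiers alongside behavioral metrics.
local_records = [
    {"customer_id": "C-1001", "email": "a@example.com", "monthly_spend": 120.0, "visits": 9},
    {"customer_id": "C-1002", "email": "b@example.com", "monthly_spend": 75.5,  "visits": 4},
    {"customer_id": "C-1003", "email": "c@example.com", "monthly_spend": 210.0, "visits": 14},
]

def to_model_features(record: dict) -> dict:
    """Strip direct identifiers, keeping only derived model features.

    The key is a one-way hash of the customer ID, so exported rows can
    still be joined across markets without exposing the identity itself.
    """
    key = hashlib.sha256(record["customer_id"].encode()).hexdigest()[:16]
    return {
        "key": key,
        "spend_per_visit": round(record["monthly_spend"] / record["visits"], 2),
        "visit_band": "high" if record["visits"] >= 10 else "low",
    }

# Only the transformed features cross the border for central analytics.
exportable = [to_model_features(r) for r in local_records]
```

The exported rows retain the signal needed for cross-market analytics (spend intensity, visit frequency) while the raw identifiers stay in-country.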
Foundation models, fine-tuned models, and applications
Finally, multinationals must be aware of how geopolitical restrictions can directly affect access to certain foundation models and apps that their employees and customers use on a day-to-day basis. Two recent examples: OpenAI’s departure from China and the ban on DeepSeek in Italy.
This is not a big concern for most businesses, thanks to the commoditization of foundation models and the narrowing performance gap between closed and open-source models. Most companies can effectively mitigate their exposure with strategic modularization and the use of model-agnostic platforms and applications. It is important to note, however, that this mitigation strategy may negatively impact performance in certain areas, like coding, where specific models have differentiated capabilities.
The companies that embed LLMs into their products face greater risk of disruption. In those cases, being proactive is key. For example, it has been widely reported that Apple has tailored its AI partnership strategy based on the geography being served. In China, Apple procures the AI model embedded in its iPhones from Alibaba. Elsewhere, the company uses a combination of its proprietary Apple Intelligence and OpenAI’s ChatGPT.
International firms should also examine to what extent the apps they use in their operations are governed by different regulatory regimes. The EU’s AI Act, for example, classifies as high-risk the use of AI systems to evaluate personal creditworthiness or to price individual life and health insurance. In the U.S., however, there is no such restriction. A financial services company would likely need different strategies for its GenAI model use in the U.S. and the EU.
The best way for companies to avoid disruptions is to use and build model-agnostic applications and prioritize a portfolio approach (including open-source models) to tailor their operations and products to regional contexts. For companies using models for specific use cases, this also means that they should identify alternatives through the creation of their own performance benchmarks.
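In application code, model-agnosticism often comes down to a thin routing layer. The sketch below uses hypothetical stub functions in place of real vendor SDK calls (a proprietary API in one market, a regional or open-source deployment in another); only the routing pattern is the point.

```python
from dataclasses import dataclass
from typing import Callable, Dict

# A model is just "prompt in, text out" to the rest of the application.
ModelFn = Callable[[str], str]

# Hypothetical stand-ins for real endpoints; in practice each would
# wrap a vendor SDK or a self-hosted open-source model.
def global_model(prompt: str) -> str:
    return f"[global-model] {prompt}"

def regional_model(prompt: str) -> str:
    return f"[regional-model] {prompt}"

@dataclass
class ModelRouter:
    """Routes requests to whichever model is permitted in a given market."""
    registry: Dict[str, ModelFn]
    default: ModelFn

    def complete(self, region: str, prompt: str) -> str:
        # Fall back to the default provider where no regional override exists.
        return self.registry.get(region, self.default)(prompt)

router = ModelRouter(registry={"CN": regional_model}, default=global_model)
answer_cn = router.complete("CN", "Summarize the warranty terms.")
answer_eu = router.complete("EU", "Summarize the warranty terms.")
```

Because the application only ever calls `router.complete`, swapping a restricted model for a permitted one is a registry change, not a rewrite; the firm’s own benchmarks then decide which alternative goes into the registry.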
Right-sizing the response
Multinationals must recognize the serious threat currently posed by the fracturing of the digital space, particularly to those companies that have gained a competitive edge through a centralized tech stack. The erosion of their competitive advantage due to geopolitical fragmentation will open the door for more agile, regional firms to regain market share.
In the most extreme cases, this new landscape could force multinationals to effectively become global holdings, piloting separate, geographically dispersed companies. For many others, the extent of this shift will not be as existential. A grounded, pragmatic approach is required to assess the value derived from technology, and to right-size preparedness. The assessment must be done for each individual layer of the tech stack, and across all geographies. Only then can firms formulate the right response.
François Candelon is a partner at private equity firm Seven2 and the former global director of the BCG Henderson Institute.
Etienne Cavin is a consultant at Boston Consulting Group and an ambassador at the BCG Henderson Institute.
Leonid Zhukov is the director of the BCG X AI Science Institute and is based in BCG’s Dubai office.
David Zuluaga Martínez is a partner at Boston Consulting Group and an ambassador at the BCG Henderson Institute.
Some of the companies mentioned in this column are past or present clients of the authors’ employers.
This story was originally featured on Fortune.com