Are you ready for shape-shifting apps?

With a 60% surge in App Store submissions as developers embrace vibe coding and AI-assisted development tools, Apple’s App Store team has identified an emerging security challenge: what happens when an app you download later evolves into something fundamentally different — without Apple having a chance to review those changes. 

Vibe coding: the new attack surface

Let’s say you install a simple chess app, only to discover later that it has updated itself into something different, or that it’s downloaded external code that modifies or adds to what it does after installation. Some experts already expect as much as 30% of new security exposures to be generated by hastily made vibe-coded apps. That might turn into an even bigger risk as Apple is forced to support app sideloading in some markets.

Theoretical threat?

The deeper risk is that legitimate‑seeming apps could introduce unverified, remotely delivered code after installation. This is a known malware pattern; one historic example is XcodeGhost, a compromised version of Apple’s Xcode development environment that infected apps built with it. More recently, CovertLabs identified 198 iOS AI apps leaking user chat history and private data. Even today, news of the DarkSword iOS exploit shows that hackers see Apple’s platforms as high-value targets, which means any potential security flaw will be explored and, if possible, exploited.

Apple protects

Apple’s latest response to this threat appears in an updated set of App Store guidelines first noted by The Information. Reportedly, Apple is pushing back on “vibe coding” platforms such as Replit and Vibecode, arguing that they violate long‑standing rules prohibiting apps from running code that can alter how other apps behave. 

The aim isn’t to inhibit vibe coding per se, MacRumors tells us; Apple particularly objects to tools that run newly created apps inside an embedded web view.

Instead, Apple wants these previews opened in an external browser, which prevents app‑within‑app execution that could circumvent review. I imagine opening previews in an external browser would also place that app behavior inside Safari’s more protective sandbox, which restricts what such apps can do to your system. Some functions might fail, but permissions would remain secure.

Dynamic code and threat delivery

It’s tempting to think the concern is that these apps could bypass Apple’s commission structure. But Apple itself says the motivation is security — preventing apps from fundamentally changing their behavior without review.

“Vibe-coding” tools allow users to write, generate, or modify code dynamically in‑app; in doing so, they create a scenario where apps built using them can evolve into something different after installation. While this is useful for education and experimentation, it also creates a series of potential security vulnerabilities. For instance, if a malicious actor compromised a vibe‑coding platform, they could push harmful updates to apps developed within it.

Apple’s developer guidelines already try to address this issue:

“Apps should be self‑contained in their bundles and may not read or write data outside the designated container area, nor may they download, install, or execute code which introduces or changes features or functionality of the app, including other apps. Educational apps designed to teach or allow students to test executable code may, in limited circumstances, download code, provided that such code is not used for other purposes and is fully viewable and editable by the user.”

To my mind, the intent is to guard against Trojan‑horse‑style attacks enabled by unreviewed code execution.

Apple’s App Store cannot become a backdoor

In the end, even though Apple now supports genAI-boosted workflows in Xcode, it does not want the App Store to become a conduit for apps that can fetch or execute unreviewed code after approval. After all, if every app did this, the value of app curation would itself be reduced. While other app marketplaces may choose to allow such flexibility (good luck with that), Apple has no intention of permitting the App Store to serve that role.

In November, Apple strengthened its App Review guidelines with a new rule to prevent app impersonation. “You cannot use another developer’s icon, brand, or product name in your app’s icon or name without approval,” that rule said. 

Apple’s concern is that dynamic code‑generation tools make it easier for developers (or attackers) to build, deploy, and ship copycat apps, or apps that can be updated using unknown tools, frameworks, or remote code after installation. Generative AI (genAI) further accelerates this risk by making it trivial to produce complex code quickly. It is certainly contributing to the roughly 2.28 million apps now available on the App Store, up by about 160,000 from the year before.

Fear, uncertainty, doubt — and opportunity

The threat the App Store team wants to protect us from is a natural extension of the proliferating AI-driven challenges we are already experiencing in our lives. Just as we now scrutinize AI‑generated images of world leaders in coffee shops for tell‑tale signs of extra fingers, we may soon need to question whether the apps on our devices are truly what they claim to be. Plus, of course, as criminals identify common code signatures in vibe‑generated apps, they may discover attractive new attack surfaces no one has come across yet.

That’s not hyperbole; we know we live in interesting times. When it comes to apps doing things on digital devices that contain so much of our personal data, it’s far better to play safe and deploy tactically than to move fast and break what few things we have left.

You can follow me on social media! Join me on BlueSky, LinkedIn, and Mastodon.
