
AI Hallucinated Imports: When npm Packages Don't Actually Exist

AI code generators invent package names that don't exist on npm. Learn how hallucinated imports break builds and how to detect them.


Quick Answer

AI code generators sometimes invent npm package names that do not exist on the npm registry. When you run npm install, the install fails or - more dangerously - a real package with that name exists but does something completely different. Scanning your package.json and import statements against the live npm registry before deployment is the only reliable way to catch this.

What Are Hallucinated npm Imports?

Large language models are trained on billions of lines of code, but that training data has a knowledge cutoff. Packages get renamed, deprecated, or unpublished after the cutoff, and new utility patterns emerge that the model never saw. The model fills the gap by constructing plausible-sounding package names that happen not to exist. This is called a hallucinated import.

The problem shows up in two forms. First, the package simply does not exist on npm, so npm install throws a 404 and your CI pipeline fails. Second - and far more dangerous - the package name exists but is owned by someone else, and that someone may have published a malicious payload. This second scenario is the basis for dependency confusion attacks, a class of supply chain exploit that has affected hundreds of companies.

According to GitHub's 2024 State of the Octoverse report, supply chain attacks targeting the npm ecosystem increased 74% year-over-year, and dependency confusion and typosquatting are among the top three vectors. When AI code introduces a fictitious package name, any attacker who registers that name on npm gains an execution vector into your codebase.

Vibe coding tools - Bolt, Lovable, Cursor, and v0 - are especially prone to this because they generate complete files without running npm install themselves. The output looks syntactically correct, the import line looks reasonable, and nothing fails until you actually try to build.

Why AI Models Invent Package Names

The root cause is how language models generate code. They predict the next most probable token. If a codebase they trained on used a certain import pattern, and a similarly-named real package existed at that time, the model will generate that import even if the package was later renamed, split, or deleted.

Common hallucination patterns include:

- Compound names that merge real concepts into a package that does not exist, such as next-image-optimizer-utils
- Generic suffixes like -utils, -helpers, or -next bolted onto a real package or framework name
- Nonexistent packages inside a real scope, such as @tanstack/query-utils instead of @tanstack/react-query
- Old names of packages that have since been renamed, split, or unpublished

Stanford HAI's 2024 evaluation of code generation models found that hallucinated package references appear in roughly 5.2% of generated JavaScript files, with the rate rising to 9.7% for less common utility categories like date formatting, currency, and validation.

The Risk: From Build Failure to Supply Chain Attack

Most developers discover hallucinated imports only when a build fails. This is the good outcome. The serious risk is the gap between when AI generates the code and when someone notices.

If you commit a project to a public GitHub repository with a hallucinated package name in package.json, an attacker who monitors npm for new package registrations or scans public repos for unregistered names can register that name with a postinstall script that exfiltrates environment variables. The next time any contributor runs npm install, the malicious package executes.
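To make the attack concrete, here is a hypothetical package.json an attacker might publish under a hallucinated name. The name, version, and script file are illustrative, not a real package on npm:

```json
{
  "name": "next-image-optimizer-utils",
  "version": "9.9.9",
  "description": "Placeholder posing as an image optimization helper",
  "scripts": {
    "postinstall": "node collect-env.js"
  }
}
```

npm runs the postinstall script automatically at the end of every npm install (unless scripts are disabled with --ignore-scripts), so the attacker's collect-env.js executes with the same permissions as the developer or CI job that triggered the install.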

This is not hypothetical. In 2021, security researcher Alex Birsan demonstrated the dependency confusion attack against major tech companies including Apple, Microsoft, and PayPal. Hundreds of their internal package names had been exposed in public code. The research earned him over $130,000 in bug bounties. AI code generators create the same exposure vector every time they invent a package name.

What Hallucinated Code Looks Like

Here is a realistic example. You ask Cursor to add image optimization to a Next.js project:

// ❌ BAD - Hallucinated package name
import { optimizeImage, resizeToWebp } from 'next-image-optimizer-utils';
import { createBlurHash } from 'blurhash-generator-next';

export async function processUploadedImage(file: File) {
  const optimized = await optimizeImage(file, { quality: 80 });
  const blur = await createBlurHash(optimized);
  return { optimized, blur };
}

Neither next-image-optimizer-utils nor blurhash-generator-next exist on npm. The real packages are sharp for image processing and blurhash for blur hash generation. The AI invented compound names that sound reasonable but do not map to real packages.

// ✅ GOOD - Verified real packages
import sharp from 'sharp';                  // npm: sharp (real)
import { encode } from 'blurhash';          // npm: blurhash (real)
import { getPlaiceholder } from 'plaiceholder'; // npm: plaiceholder (real)

export async function processUploadedImage(buffer: Buffer) {
  const optimized = await sharp(buffer)
    .webp({ quality: 80 })
    .toBuffer();

  const { base64 } = await getPlaiceholder(buffer);
  return { optimized, blurDataURL: base64 };
}

The fix is to verify every package name against the npm registry before using it. Run npm info <package-name> in your terminal, or check npmjs.com directly.
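One subtlety when verifying names by hand: the string in the import statement is not always the package name. Deep imports like sharp/lib/utils and scoped deep imports like @tanstack/react-query/devtools must be trimmed to the package portion before you query the registry. A small helper (illustrative, not from any library) shows the rule:

```javascript
// Extract the npm package name from an import specifier.
// Relative and absolute paths are not packages, so return null for those.
function packageNameFromSpecifier(specifier) {
  if (specifier.startsWith('.') || specifier.startsWith('/')) return null;
  const parts = specifier.split('/');
  // Scoped packages keep the first two segments: '@scope/name/deep' -> '@scope/name'
  if (specifier.startsWith('@')) {
    return parts.length >= 2 ? `${parts[0]}/${parts[1]}` : null;
  }
  // Bare packages keep only the first segment: 'sharp/lib/utils' -> 'sharp'
  return parts[0];
}
```

Feed the result to npm info <name> or look it up at https://registry.npmjs.org/<name>; a 404 response means the name is unregistered.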

How to Detect Hallucinated Imports Before They Reach Production

Manual verification works for small projects, but does not scale. Here is a systematic approach:

  1. Run npm install immediately after AI generates code. If a package does not exist, you will see an error like npm error 404 Not Found - GET https://registry.npmjs.org/<package>. Do not commit until this passes.
  2. Compare imports to package.json: Every import ... from 'x' where x is not a relative path should have a corresponding entry in dependencies or devDependencies. Orphaned imports are a red flag.
  3. Use depcheck: The depcheck CLI finds unused and missing dependencies. It will flag imports that have no corresponding installed package.
  4. Automated scanning: Tools like VibeDoctor (vibedoctor.io) automatically scan your codebase for hallucinated imports and flag specific file paths and line numbers. Free to sign up.
  5. Lock your registry: Add a .npmrc with audit=true and consider using a private registry proxy like Verdaccio or Artifactory, which will reject package names not in your allowlist.
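Step 2 above can be sketched as a small script. This is a minimal illustration, assuming ES-module import syntax and a parsed package.json object; a real scanner such as depcheck walks the AST instead of using a regex, and this sketch does not special-case bare Node builtins like fs:

```javascript
// Sketch: flag bare import specifiers with no matching entry in
// package.json. A regex is enough for illustration; real tools
// (e.g. depcheck) parse the AST instead.
function findUnlistedImports(sourceText, packageJson) {
  const declared = new Set([
    ...Object.keys(packageJson.dependencies ?? {}),
    ...Object.keys(packageJson.devDependencies ?? {}),
  ]);
  const importRe = /from\s+['"]([^'"]+)['"]/g;
  const unlisted = new Set();
  for (const match of sourceText.matchAll(importRe)) {
    const spec = match[1];
    // Relative paths and explicit Node builtins are not npm packages.
    if (spec.startsWith('.') || spec.startsWith('/') || spec.startsWith('node:')) continue;
    // Trim deep imports to the package name ('@scope/name/x' -> '@scope/name').
    const name = spec.startsWith('@')
      ? spec.split('/').slice(0, 2).join('/')
      : spec.split('/')[0];
    if (!declared.has(name)) unlisted.add(name);
  }
  return [...unlisted];
}
```

Anything this returns is either missing from package.json or a hallucinated name; either way it deserves a look before npm install runs in CI.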

According to Veracode's 2024 State of Software Security report, 68% of applications have at least one open-source vulnerability introduced through a dependency. Hallucinated imports that get registered as malicious packages are a direct path into that statistic.

Comparison: AI Tools and Their Hallucination Risk

| Tool | Code Generation Style | Hallucination Risk | Mitigation |
|---|---|---|---|
| Cursor | In-editor, context-aware | Medium - uses open files as context | Runs in your local env; install fails fast |
| Bolt | Full project generation | High - generates package.json from scratch | Preview runs npm install; errors visible |
| Lovable | Full project generation | High - similar to Bolt | Build log shows failed installs |
| v0 (Vercel) | Component snippets | Medium - typically scoped to UI libraries | Review shadcn/ui and Radix imports carefully |
| GitHub Copilot | Autocomplete | Low-Medium - context from workspace | Workspace context reduces but does not eliminate risk |

FAQ

Can a hallucinated package name become a real security threat?

Yes. If an AI generates an import for a package name that does not exist and you commit that to a public repository, any attacker can register that name on npm with a malicious postinstall script. The next npm install anyone runs will execute that script. This is the dependency confusion attack vector.

How is this different from a typosquatting attack?

Typosquatting involves an attacker registering a slightly misspelled version of a popular package (e.g., lodahs instead of lodash). Hallucinated imports are different: the AI invents a completely new name that sounds plausible, which is typically not yet registered at all. Both create risk, but hallucinated imports create a race condition between discovery and registration.

Does npm audit catch hallucinated packages?

No. npm audit checks installed packages against a known vulnerability database. If the package does not exist and was never installed, audit has nothing to scan. You need to catch hallucinated imports before or during the install step.

Are scoped packages safer from hallucination?

Scoped packages (those starting with @org/) are somewhat safer because an attacker would need to own or create the organization scope on npm to register a malicious version. However, AI still hallucinates within scopes - generating @tanstack/query-utils instead of the real @tanstack/react-query, for example.

What is the HALLU check and what does it detect?

VibeDoctor's HALLU check cross-references every import and require statement in your JavaScript and TypeScript files against the npm registry. It flags any package name that resolves to a 404, a deprecated package that has been renamed, or a package that is not listed in your package.json. Results include the exact file path and line number.

Scan your codebase for this issue - free

VibeDoctor checks for HALLU and 128 other issues across 15 diagnostic areas.
