VoidZero — https://voidzero.dev

Announcing Oxlint Type-Aware Linting Alpha
https://voidzero.dev/posts/announcing-oxlint-type-aware-linting-alpha — Mon, 08 Dec 2025

For more technical details, implementations, and considerations for Oxlint's type-aware linting, see the blog post on the Oxc website.

TL;DR: Oxlint's type-aware linting has reached alpha status. Type-aware rules can now be manually configured, disabled with comments, and fixed automatically. This milestone also includes more rule coverage, TypeScript diagnostics reporting, and type-checking while linting.


Less than 6 months after the technical preview release, Oxlint's type-aware linting has reached alpha status. The goal for the technical preview was to build a proof of concept for type-aware linting in Oxlint. VoidZero's goal for the alpha release is to better integrate Oxlint with tsgolint, which powers our type-aware lint rules.

What is tsgolint?

Type-aware rules like no-floating-promises can slow down linting because TypeScript may need to check every file to infer types. VoidZero accelerates this process using tsgolint to perform the underlying type checks and type-aware linting for Oxlint.
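As an illustration (a sketch, not tsgolint's internals), this is why type information is needed at all:

```typescript
// The return type is inferred, not annotated, so a purely syntactic
// linter cannot tell that this function returns a Promise.
async function persist(data: string) {
  return data.length; // inferred return type: Promise<number>
}

// `typescript/no-floating-promises` only fires here because the type
// checker knows `persist(...)` evaluates to a Promise.
persist("draft"); // would be flagged: neither awaited nor handled
```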

tsgolint is a high-performance linter backend built on top of typescript-go, the Go port of TypeScript. Originally a proof of concept by auvred (of the typescript-eslint team), it is now maintained by VoidZero. Since typescript-go doesn't export its internal packages, tsgolint provides a shim: an adapter layer on top of typescript-go that makes these internal APIs accessible. All type-aware rules are written directly against these shimmed internal APIs, keeping Oxlint fast while leveraging TypeScript's type system. tsgolint is not meant to run standalone; it is designed to be used as a backend for linters like Oxlint, providing type-aware linting capabilities.

Preliminary results show that Oxlint + tsgolint is significantly faster than ESLint + typescript-eslint (source), given the same ruleset:

  • 8x faster in vuejs/core
  • 12x faster in outline/outline

However, a major consideration for this approach is maintenance and staying up-to-date with typescript-go. We regularly update tsgolint to depend on the latest typescript-go release and address any changes. While working on tsgolint, our team also submits pull requests to improve typescript-go and the overall ecosystem.

Flow chart showing how developer uses Oxlint with `tsgolint` for type-aware linting
Oxlint is the developer's "frontend" interface that calls tsgolint as "linter backend" for type-aware linting.

Better tsgolint integration with Oxlint

The goal for the alpha status was to reduce the distinction between type-aware rules and non-type-aware rules when using Oxlint.

  1. Rule configurations. Type-aware rules can be separately configured in .oxlintrc.json:
json
// .oxlintrc.json
{
  "rules": {
    "typescript/no-floating-promises": [
      "error",
      {
        "ignoreVoid": true,
        "allowForKnownSafePromises": [
          { "from": "file", "name": "SafePromise" },
          { "from": "lib", "name": "PromiseLike" }
        ]
      }
    ]
  }
}
  2. In-line disable comment support. Type-aware rules can be disabled per file or line.
js
/* oxlint-disable typescript/no-floating-promises */

// oxlint-disable-next-line typescript/no-floating-promises
[1, 2, 3].map(async x => x + 1);
  3. Automatic fixes. Type-aware rules now support automatic fixes using the --fix flag.
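As a sketch of what such an automatic fix can look like (the exact fixer output may differ), no-floating-promises typically resolves the report by prepending the `void` operator:

```typescript
let flushed = 0;

async function flushQueue(): Promise<void> {
  flushed += 1;
}

// Before: flagged by `typescript/no-floating-promises`.
flushQueue();

// After `oxlint --fix` (sketch of a typical fix): the `void` operator
// marks the promise as intentionally unhandled, which the
// `ignoreVoid: true` option then accepts.
void flushQueue();
```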

All 3 updates bring type-aware rule features closer to parity with non-type-aware rules.

Additional improvements

In addition to more stability, Oxlint's type-aware linting alpha comes with:

  • Type-checking while linting. Oxlint can emit type-checking errors from TypeScript while linting. This means that, in some cases, it's possible to skip a separate type-check command altogether (e.g., tsc --noEmit) and reduce the total time spent linting and type-checking in CI. You can enable it via the --type-check flag.
  • More supported rules. Oxlint added support for the no-deprecated, prefer-includes, and strict-boolean-expressions rules. It now covers 43 of 59 typescript-eslint rules. All supported rules.
  • TypeScript diagnostics. Oxlint reports any TypeScript compiler issues or configuration issues in tsconfig.json files.

Next steps

VoidZero is actively working on the following improvements for the beta release:

  • Increase type-aware rules support. Expand coverage to the remaining typescript-eslint rules.
  • Memory usage optimization. In very large monorepos, tsgolint may encounter out-of-memory issues. Our team is working on improving memory usage for the next milestone. If you run into memory issues, please report them in the tsgolint repository.

Connect with us:

What’s New in ViteLand: November 2025 Recap
https://voidzero.dev/posts/whats-new-nov-2025 — Thu, 04 Dec 2025

Welcome to another edition of What’s new in ViteLand!

Regularly, we recap the project updates for Vite+, Vite, Vitest, Rolldown, Oxc, and what’s happening in our community.

Vite 8 Beta: The Rolldown-powered Vite

We are excited to announce that Vite 8 beta is now available! Vite now uses Rolldown as its bundler, replacing the previous combination of esbuild and Rollup. Rolldown is VoidZero's new Rust-based bundler that is designed for Vite use cases and brings significant performance improvement.

However, the impact of Vite’s bundler swap goes beyond performance. Rolldown uses Oxc, another project led by VoidZero, for parsing, resolving, transforming, and minifying. That makes Vite the entry point to an end-to-end toolchain maintained by the same team: The build tool (Vite), the bundler (Rolldown), and the compiler (Oxc).

This alignment brings consistent behavior, faster adoption of changes like new language features, and improvements like better tree-shaking and chunking that wouldn't be doable in Vite directly.

Want to know more details? Read our announcement blog post about the new beta release!

Project Updates

Vite+

Vite

  • Vite 7.2 was released earlier this month, bringing several smaller features and bug fixes in further patch versions.

Vitest

  • Vitest now has a diff slider and tabbed view for visual regression testing results! This makes it easier to compare visual changes side by side.
  • A new experimental file-system based cache is available in Vitest to speed up subsequent test runs by caching transformed modules on disk. It also works in the CI!
  • With Vitest 4, a new standard schema matching API was published, so you can now match against schemas of libraries like Zod, Valibot, Arktype, or Yup directly in your tests.
  • Vitest introduced Imports Breakdown, another experimental feature, in version 4.0.15. You can now see how long each module took to load in the Vitest UI, the VS Code extension, and your terminal.

Rolldown

Oxc

From The Community

Vite 8 Beta: The Rolldown-powered Vite
https://voidzero.dev/posts/announcing-vite-8-beta — Wed, 03 Dec 2025

This post was originally published on the Vite blog

TL;DR: The first beta of Vite 8, powered by Rolldown, is now available. Vite 8 ships significantly faster production builds and unlocks future improvement possibilities. You can try the new release by upgrading vite to version 8.0.0-beta.0 and reading the migration guide.


We’re excited to release the first beta of Vite 8. This release marks a major milestone towards our goal of a unified JavaScript toolchain. Vite now uses Rolldown as its bundler, replacing the previous combination of esbuild and Rollup.

A new bundler for the web

Vite previously relied on two bundlers to meet differing requirements for development and production builds:

  1. esbuild for fast compilation during development
  2. Rollup for bundling, chunking, and optimizing production builds

This approach lets Vite focus on developer experience and orchestration instead of reinventing parsing and bundling. However, maintaining two separate bundling pipelines introduced inconsistencies: separate transformation pipelines, different plugin systems, and a growing amount of glue code to keep bundling behavior aligned between development and production.

To solve this, we built Rolldown, our next-generation bundler designed for:

  • Performance: Rolldown is written in Rust and operates at native speed. It matches esbuild’s performance level and is 10–30× faster than Rollup.
  • Compatibility: Rolldown supports the same plugin API as Rollup and Vite. Most Vite plugins work out of the box with Vite 8.
  • More Features: Rolldown unlocks more advanced features for Vite, including full bundle mode, more flexible chunk split control, module-level persistent cache, Module Federation, and more.

Unifying the toolchain

The impact of Vite’s bundler swap goes beyond performance. Bundlers leverage parsers, resolvers, transformers, and minifiers. Rolldown uses Oxc, another project led by VoidZero, for these purposes.

That makes Vite the entry point to an end-to-end toolchain maintained by the same team: The build tool (Vite), the bundler (Rolldown), and the compiler (Oxc).

This alignment ensures behavior consistency across the stack and allows us to rapidly adopt and align with new language specifications as JavaScript continues to evolve. It also unlocks a wide range of improvements that previously couldn’t be done by Vite alone. For example, we can leverage Oxc’s semantic analysis to perform better tree-shaking in Rolldown.

How Vite migrated to Rolldown

The migration to a Rolldown-powered Vite is a foundational change. Therefore, our team took deliberate steps to implement it without sacrificing stability or ecosystem compatibility.

First, a separate rolldown-vite package was released as a technical preview. This allowed us to work with early adopters without affecting the stable version of Vite. Early adopters benefited from Rolldown’s performance gains while providing valuable feedback. Highlights:

  • Linear’s production build times were reduced from 46s to 6s
  • Ramp reduced their build time by 57%
  • Mercedes-Benz.io cut their build time down by up to 38%
  • Beehiiv reduced their build time by 64%

Next, we set up a test suite for validating key Vite plugins against rolldown-vite. This CI job helped us catch regressions and compatibility issues early, especially for frameworks and meta-frameworks such as SvelteKit, react-router and Storybook.

Lastly, we built a compatibility layer to help migrate developers from Rollup and esbuild options to the corresponding Rolldown options.
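In practice, this means Rollup-flavored configuration keeps working; a minimal sketch using Vite's documented `build.rollupOptions` escape hatch:

```typescript
// vite.config.ts (sketch): Rollup-flavored options keep working and are
// mapped onto their Rolldown equivalents by the compatibility layer.
const config = {
  build: {
    rollupOptions: {
      // Treat Node built-ins as external instead of bundling them.
      external: (id: string) => id.startsWith("node:"),
    },
  },
};

export default config;
```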

As a result, there is a smooth migration path to Vite 8 for everyone.

Migrating to Vite 8 Beta

Since Vite 8 touches the core build behavior, we focused on keeping the configuration API and plugin hooks unchanged. We created a migration guide to help you upgrade.

There are two available upgrade paths:

  1. Direct Upgrade: Update vite in package.json and run the usual dev and build commands.
  2. Gradual Migration: Migrate from Vite 7 to the rolldown-vite package, and then to Vite 8. This allows you to identify incompatibilities or issues isolated to Rolldown without other changes to Vite. (Recommended for larger or complex projects)

IMPORTANT

If you are relying on specific Rollup or esbuild options, you might need to make some adjustments to your Vite config. Please refer to the migration guide for detailed instructions and examples. As with all non-stable, major releases, thorough testing is recommended after upgrading to ensure everything works as expected. Please make sure to report any issues.

If you use a framework or tool that uses Vite as a dependency, for example Astro, Nuxt, or Vitest, you have to override the vite dependency in your package.json, which works slightly differently depending on your package manager:

npm:
json
{
  "overrides": {
    "vite": "8.0.0-beta.0"
  }
}
yarn:
json
{
  "resolutions": {
    "vite": "8.0.0-beta.0"
  }
}
pnpm:
json
{
  "pnpm": {
    "overrides": {
      "vite": "8.0.0-beta.0"
    }
  }
}
bun:
json
{
  "overrides": {
    "vite": "8.0.0-beta.0"
  }
}

After adding these overrides, reinstall your dependencies and start your development server or build your project as usual.

Additional Features in Vite 8

In addition to shipping with Rolldown, Vite 8 comes with:

  • Built-in tsconfig paths support: Developers can enable it by setting resolve.tsconfigPaths to true. This feature has a small performance cost and is not enabled by default.
  • emitDecoratorMetadata support: Vite 8 now has built-in automatic support for TypeScript’s emitDecoratorMetadata option.
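The tsconfig paths option is opt-in; a minimal vite.config.ts sketch (the option name follows the post):

```typescript
// vite.config.ts (sketch): the option name follows the Vite 8 post.
const config = {
  resolve: {
    // Opt in to resolving tsconfig.json `paths` aliases.
    // Off by default because reading tsconfigs has a small performance cost.
    tsconfigPaths: true,
  },
};

export default config;
```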

Looking Ahead

Speed has always been a defining feature for Vite. The integration with Rolldown and, by extension, Oxc means JavaScript developers benefit from Rust’s speed. Upgrading to Vite 8 should result in performance gains simply from using Rust.

We are also excited to ship Vite’s Full Bundle Mode soon, which drastically improves Vite’s dev server speed for large projects. Preliminary results show 3× faster dev server startup, 40% faster full reloads, and 10× fewer network requests.

Another defining Vite feature is the plugin ecosystem. We want JavaScript developers to continue extending and customizing Vite in JavaScript, the language they’re familiar with, while benefiting from Rust’s performance gains. Our team is working to accelerate JavaScript plugin usage in these Rust-based systems.

Upcoming optimizations that are currently experimental:

Connect with us

If you’ve tried Vite 8 beta, then we’d love to hear your feedback! Please report any issues or share your experience:

We appreciate all reports and reproduction cases. They help guide us towards the release of a stable 8.0.0.

Announcing Oxfmt Alpha
https://voidzero.dev/posts/announcing-oxfmt-alpha — Mon, 01 Dec 2025

For more technical details, implementations, and considerations for Oxfmt, see the blog post on the Oxc website.

TL;DR: Oxfmt is a fast Rust-based formatter that is available now in alpha stage and supports JavaScript and TypeScript files. It is 30x faster than Prettier while having >95% compatibility.


VoidZero is excited to announce the alpha release of Oxfmt (/oh-eks-for-mat/), the Rust-based code formatter. This release focuses on JavaScript and TypeScript, with support for additional languages coming soon. Oxfmt joins VoidZero’s existing projects: the build tool, bundler, test runner, and linter.

Why Oxfmt?

The first stable version of Oxlint was released half a year ago. Since then, there have been consistent requests for styling changes like sorting imports. We firmly believe that a linter checks for logic, while a formatter focuses on code style. These requests fall within the domain of a formatter.

However, existing tools don't always respect this boundary, meaning additional configuration is needed to disable overlapping rules. By building both Oxfmt and Oxlint, we're able to reduce setup configuration and create a better developer experience.

Oxfmt is designed with these goals in mind:

  • Performance: More than 30× faster than Prettier and more than 3× faster than Biome on an initial run without cache (benchmark).
  • Compatibility: Prettier-compatible, so developers can adopt Oxfmt in existing projects easily.
  • Developer Experience: Upcoming features include import sorting, expanded formatting options, and support for Prettier plugins.
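As a sketch of the Prettier-compatible behavior (expected output under Prettier's default options; not captured from an actual Oxfmt run):

```typescript
// Unformatted input (as one line):
//   const greet=(name:string)=>{return `hi ${name}`}
// Expected Prettier-compatible output (default options):
const greet = (name: string) => {
  return `hi ${name}`;
};
```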

Prettier compatibility

VoidZero designs tools to be compatible with existing solutions because:

  1. It makes the developer’s migration path smooth and painless.
  2. It ladders towards becoming a complete replacement.

Like how Vitest is compatible with Jest, Oxfmt is designed to be compatible with Prettier.

Oxfmt currently passes around 95% of Prettier’s JavaScript and TypeScript tests. There should be minimal formatting differences when migrating from Prettier. VoidZero has been actively submitting bug reports and pull requests directly to Prettier to reduce the remaining differences. Many of these improvements landed in the recent Prettier 3.7 release.

While Oxfmt does not yet support all of Prettier’s configuration options, it does support major options like singleQuote, printWidth, and more. (full list)
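For reference, the options share Prettier's shape; a sketch of such a config object (singleQuote and printWidth are named in the post, semi is one more documented Prettier option included for illustration):

```typescript
// A Prettier-style options object (the shape of a .prettierrc.json file).
const formatOptions = {
  singleQuote: true, // prefer 'single' over "double" quotes
  printWidth: 100,   // wrap lines longer than 100 columns
  semi: true,        // keep statement-ending semicolons
};

export default formatOptions;
```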

Next steps

For Oxfmt’s beta release, VoidZero is working on:

  • Stabilizing experimental options. Specifically, built-in support for import sorting and embedded-language formatting such as CSS-in-JS.
  • Enabling more Prettier plugins. Including researching how to support plugins for popular frameworks like Vue, Svelte, and Astro.
  • And more.

We’d love to hear your feedback on Oxfmt, and are excited to see how it helps improve your development workflow.

Connect with us:

Acknowledgements

VoidZero would like to extend our gratitude to:

  • @ematipico, @MichaReiser, and the entire team and community at Biome and Rome. Oxfmt builds on a fork of the biome_formatter infrastructure.
  • @fisker for triaging our reported issues for Prettier
What’s New in ViteLand: October 2025 Recap
https://voidzero.dev/posts/whats-new-oct-2025 — Mon, 03 Nov 2025

Welcome to another edition of What’s new in ViteLand!

Regularly, we recap the project updates for Vite, Vitest, Rolldown, Oxc, and what’s happening in our community.

VoidZero Raises $12.5M Series A

VoidZero closes a $12.5M Series A to build the next generation of JavaScript Tooling.

Accel led the investment, with participation from Peak XV, Sunflower, Koen Bok, and Eric Simons, to accelerate development of VoidZero's OSS projects and work on its unified JavaScript toolchain, Vite+.

The new funding significantly shortens the timeline to a public Vite+ release. Expect faster iteration cycles, more reliable native integrations, and quicker feature releases on our open source projects.

Coming up next:

  • The Vite 8 Beta, bringing you Vite powered by our Rust-based bundler Rolldown for faster builds, a unified build layer and further optimizations,
  • Vite's opt‑in Full Bundle Mode powered by Rolldown which speeds up the dev server for large projects significantly,
  • An alpha version of Oxfmt, our Prettier-compatible, Rust-based formatter that is ~45x faster than Prettier in initial benchmarks,
  • And expanded JS plugin support in Oxlint. With the first technical preview, ESLint rules already work with Oxlint but some API gaps still have to be filled.

Want to know more details? Check out the official announcement.

Project Updates

Vite+

ViteConf revealed Vite+: A superset of Vite and a CLI that integrates a suite of essential developer tools into a single, cohesive experience with first-class monorepo support and built-in caching.

Vite+ will be commercially licensed but source-available and will offer a free tier for open source projects, non-commercial use, and small businesses.

Learn more in the Vite+ announcement post.

Vite

  • The official Vite Documentary premiered at ViteConf and shows how Vite evolved from just a better Vue development server to the shared infrastructure of JavaScript tooling. A must-watch starring many community members and contributors!
  • The team released the first beta versions for the upcoming Vite 7.2. New features include an easy way to emit a license file during build and support for HTTP2 when using a proxy for the development server.

Vitest

  • We announced Vitest 4, which brings a lot of exciting new features, such as:
    • The Browser Mode which is now stable and allows running tests in an actual browser environment instead of a Node.js environment with a simulated DOM,
    • And built-in Visual Regression Testing into Vitest itself.
  • DX also improves further through the VS Code extension, for example via inlay hints for console.log output or support for debugging tests running in Browser Mode.

Rolldown

  • Rolldown's inlineConst feature is getting even better! Now, named imports from CJS libraries can also be inlined, leading to smaller bundle sizes.
  • Rolldown's codebase is using Oxc's type-aware linting internally now! This is not only a great way for us to dogfood Oxlint, but it also helps us catch more issues and improve code quality.
  • output.minifyInternalExports is now enabled by default when enabling minification or using ESM as output format. While renaming exports seems to lead to a bigger bundle at first, it will actually decrease the final bundle size due to better GZIP compression. The original names for exports are still preserved to not break any code relying on them at runtime.
  • To change how Rolldown processes JSX, projects must use transform.jsx. The deprecated top-level jsx option is removed.
  • Generating source maps can be a big build-performance hit, especially for large projects. To address this, Rolldown now supports source map generation in Rust via its own experimental MagicString implementation.
  • Rolldown can now tree-shake built-in typed array constructors (new Uint8Array(), new Int32Array(), etc.) as they are marked as pure internally.
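A sketch of what this tree-shaking enables (assumed module for illustration):

```typescript
// Unused allocation: typed array constructors are marked pure, so
// Rolldown can now drop this line from the bundle entirely.
const unusedScratch = new Uint8Array(1024);

// Used code is kept: anything reachable from an export survives.
export function checksum(bytes: Uint8Array): number {
  let sum = 0;
  for (const b of bytes) sum = (sum + b) & 0xff;
  return sum;
}
```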

Oxc

  • Oxlint now supports JS plugins! They are ESLint-compatible and can be used to run existing ESLint rules in Oxlint. While some APIs are still missing, many popular rules already work out of the box. Read more in our announcement.
  • Oxfmt is making great progress! Curious developers can try the pre-alpha version with npx oxfmt.

Upcoming Events

To catch talks and presentations from VoidZero team members, see the following events where they will present:

  • ViteConf happened in early October. A recap post and recordings on YouTube are available. Time to catch up!
  • Evan You gave a keynote on the challenges in building faster web tooling at JSConf US. The recording is available now!
  • Nov 20: JSNation. Alexander Lichter will present remotely with recent news about Rolldown and Oxc.
  • Nov 20: c't webdev. Catch Alexander Lichter's talk about reinventing JavaScript tooling.
  • Nov 28: React Advanced London. Alexander Lichter will give a remote talk on what the VoidZero tooling offers React developers.

From the Community

VoidZero Raises $12.5M Series A
https://voidzero.dev/posts/announcing-series-a — Thu, 30 Oct 2025

Last year, we announced our seed round. The funding was used to validate whether VoidZero’s vision was possible. Is it possible to create a unified JavaScript toolchain that is faster, easier to use, and has a better DX than existing solutions?

Yes.

We’re excited to announce that we raised $12.5M in Series A funding led by our long-time partner Accel, with participation from Peak XV Partners, Sunflower Capital, Koen Bok (Framer), and Eric Simons (StackBlitz). We’re grateful to have investors who believe in our vision and allow us the breadth to bring it to life.

Accelerating the next chapter

If seed funding is for research, then Series A funding is for acceleration. Our lean team has grown, increasing our bandwidth and speeding up our development cycles. New team members include the creator of napi-rs and core contributors to our OSS projects who joined full-time.

Thanks to the team’s efforts:

  • Vite surpassed Webpack in weekly downloads
  • Vitest’s Browser Mode became stable
  • Rolldown reached 1M weekly downloads
  • Oxlint added type-aware linting and custom JavaScript plugins

Earlier this month, we announced Vite+, the unified JavaScript toolchain. It’s also our first step to making VoidZero and our OSS projects sustainable long-term. Vite+ is currently in private beta as we ship features and squash bugs. The new capital shortens the timeline to a stable Vite+ release.

To our community

Thank you. At the core of VoidZero is open source and the developer community. We’re honored that you have embraced our projects, are excited about VoidZero’s vision, and are part of our journey. This is only the beginning!

ViteConf 2025 Recap
https://voidzero.dev/posts/whats-new-viteconf-2025 — Mon, 27 Oct 2025

Welcome to a special edition of What’s new in ViteLand! Regularly, we recap the project updates for Vite, Vitest, Oxc, Rolldown and what’s happening in our community.

People in the audience looking up to the stage
Attendees of ViteConf 2025 listening to the keynote

What happened at ViteConf 2025

ViteConf 2025 was truly special. For the first time ever, the community gathered in person in Amsterdam, bringing together framework developers, Vite maintainers, and enthusiasts from around the world.

Of course, ViteConf didn't pass without some major announcements, so here is a recap of the biggest news, brought to you by the VoidZero team.

PS: If you missed it in person or didn't catch the live stream, take a look at the VODs in the ViteConf playlist.

Evan You giving a talk at ViteConf 2025
Evan You showing an early version of Vite+ in action

✨ Meet Vite+: The Unified Toolchain

The biggest news from the keynote was the official unveiling of Vite+, a superset of Vite and command line interface that integrates a suite of essential developer tools into a single, cohesive experience with first-class monorepo support and built-in caching.

Vite+ extends the familiar vite CLI with new commands for testing, linting, formatting, library bundling, scaffolding and task running. It all works together out of the box with zero configuration and integrated caching.

Why it matters

Vite+ removes the need for a “tooling PhD.” By unifying all important parts to develop a production-ready application in one CLI, it dramatically simplifies project setup and reduces configuration overhead.

Teams gain a real productivity boost, while enterprises benefit from security, backed by clear SLAs and first-class caching. All without sacrificing Vite’s trademark speed and developer experience.

A standardized toolchain means better developer mobility and a shorter onboarding period for internal transfers.

Jim Dummett explains how to bridge the gap between Rust and JavaScript

⚡ Oxlint Supports JavaScript Plugins

Until now, Oxlint’s performance advantage came with a drawback: you had to keep running ESLint alongside Oxlint. Even though more than 500 linting rules had been rewritten in Rust, you couldn’t use your custom ESLint rules, or any rules that had not been ported over yet.

At ViteConf, Oxc core team member Jim Dummett presented what people had been waiting for: Oxlint’s JavaScript plugin support is now available.

This integration leverages the speed of the Rust-based linter while allowing the use of custom rules and extensions via an ESLint-compatible JavaScript API. Next steps are filling in the gaps of missing ESLint APIs and continuously improving performance.
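Custom rules keep their familiar shape. A minimal sketch of the kind of rule the ESLint-compatible API is designed to accept (the object shape follows ESLint's documented rule API; this is an illustration, not Oxlint source):

```typescript
// Minimal ESLint-compatible rule (sketch): forbid `debugger` statements.
const noDebuggerRule = {
  meta: {
    type: "problem",
    docs: { description: "Disallow `debugger` statements" },
  },
  create(context: { report: (descriptor: { node: unknown; message: string }) => void }) {
    return {
      // Called for every `debugger;` node in the AST.
      DebuggerStatement(node: unknown) {
        context.report({ node, message: "Unexpected `debugger` statement." });
      },
    };
  },
};

export default noDebuggerRule;
```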

Why it matters

Running JavaScript plugins in Oxlint unlocks the ability to use many of the existing 280,000 ESLint plugins on npm, as well as custom rules, without friction, while still benefiting from Oxc’s blazing-fast performance. Together with true type-aware linting, this makes Oxlint a full replacement for ESLint, thanks to its performance, features, and extensibility.

Anthony Fu is previewing the different parts of Vite DevTools

💻 A Sneak Peek at Vite DevTools

When building applications, developers must be able to inspect the build process to understand how different parts of their codebase and dependencies interact. To make this easier, Anthony Fu presented the new Vite DevTools at ViteConf, which include:

  1. A visual build analysis UI. It presents an overview of the files used to bundle your application, which Vite plugins are being used, and much more.
  2. The Vite DevTools plugin API. Framework and library authors, as well as users, will be able to add their own extensions to Vite DevTools. This will allow customization tailored to the user’s tech stack.

Why it matters

Vite DevTools will give you insights that were hard to retrieve before, such as which plugins are slowing your build down, how different Vite plugins transform your code, why your code is split up differently in the final bundle and how the different files are related to each other.

Less time spent on debugging, more time spent shipping features.

In addition to in-depth analysis of your application’s build step, you will get framework- and library-specific DevTools integrations, thanks to the DevTools plugin API and can even build your own extensions to meet your needs.

Pooya Parsa explains that Nitro v3 is "just a Vite plugin"

🌐 Full-stack Vite Apps with Nitro v3

If you’ve ever built a frontend app with Vite and then realized you need a backend for things like databases, user accounts, or you wanted to add server-side rendering (SSR), you know it can get complicated. Until now, you typically had three options:

  1. Go all-in on a meta-framework,
  2. manage a completely separate backend service,
  3. or wire up your own manual SSR setup.

Each path added complexity, forcing you to manage two different projects, deal with separate deployments, or do a lot of manual plumbing yourself.

With Nitro v3, this changes now! The next-gen server toolkit can turn your application into a true full-stack application with minimal friction. The best part? Nitro now comes as a Vite plugin.

Its creator, Pooya Parsa, showed how you can define API routes, easily set up SSR, and use serverless features with zero configuration. And thanks to Nitro, developers can deploy their entire Vite application to any platform, including Cloudflare Workers or Netlify.

Why it matters

You finally get the benefits of SSR and API routes while eliminating the complexity of managing and deploying separate backend services.

You can stick to your familiar, fast Vite environment and workflows for the whole stack, without opting into more features than you’d be interested in. All, while being able to deploy your application universally.

Vladimir Sheremet is giving his State of Vitest presentation at ViteConf 2025

✅ Vitest 4: No more Simulated DOM

Vitest’s journey was a major highlight at ViteConf, with lead maintainer Vladimir Sheremet outlining the current State of Vitest and its path for the future. With the recent release, Vitest 4.0 delivers a stable Browser Mode, covered in Jessica Sachs’ talk at ViteConf. In turn, you can replace simulated DOM libraries like JSDOM or happy-dom with an actual browser, all without changing your test code. Among other features, support for Visual Regression Testing has been added: Vitest can now capture screenshots of your UI components and compare them against reference images to detect unintended visual changes.

Why it matters

Browser Mode changes how you test UI components, giving you significantly higher confidence that your code will work for real users because your tests run in a real browser. You no longer have to chase down subtle inconsistencies between your DOM library and actual browser behavior. With the new Visual Regression Testing functionality, you’ll have a safety net for your UI, automatically catching unwanted visual changes before they reach production.

Jacob Groß, Performance Engineer at Framer sharing why Framer switched to Rolldown and what impact it made at ViteConf 2025

🚀 Ship Better Products Faster

As applications grow in complexity, engineering teams begin hitting walls: development slows down, CI builds drag on, and product performance suffers. VoidZero provides the tooling to help teams break through, enabling them to ship better products, faster.

Better products. Product speed is critical to user experience. For Framer, migrating to VoidZero’s Rolldown delivered a major speed improvement. By simply optimizing their configuration, they cut the Largest Contentful Paint (LCP) for large sites by up to 41%, all without touching application code. Rolldown now powers over 200,000 pages built with Framer, boosting the end-user experience.

Faster. Taking an action and immediately observing the effect is key to experimentation: a fast feedback loop to iterate on. Imagine trying to find which switch turns on a light, but there’s a 3-minute delay. Swap turning on a light with linting, and that’s what Linear’s engineers experienced: it took 3 minutes to lint Linear’s codebase locally. The exact opposite of a fast feedback loop. That’s why Linear adopted VoidZero’s Oxlint, which reduced linting times by 90%, along with Rolldown-Vite and Vitest. Fast tools matter for developer experience and developer velocity.

Why it matters

Companies like Framer and Linear are known for being fast. Their products are snappy and create delightful user experiences. Under the hood, they rely on equally fast tooling. VoidZero's products have a meaningful impact on developer velocity and the end-user experience.

]]>
Welcome to a special edition of What’s new in ViteLand! Regularly, we recap the project updates for Vite, Vitest, Oxc, Rolldown and what’s happening in our community.

People in the audience looking up to the stage
Attendees of ViteConf 2025 listening to the keynote

What happened at ViteConf 2025

ViteConf 2025 was truly special. For the first time ever, the community gathered in person in Amsterdam, bringing together framework developers, Vite maintainers, and enthusiasts from around the world.

Of course, ViteConf didn't pass without some major announcements, so here is a recap of the biggest news, brought to you by the VoidZero team.

PS: If you've missed it in person or didn't catch the live stream, take a look at the ​VODs in the ViteConf playlist​.

Evan You giving a talk at ViteConf 2025
Evan You showing an early version of Vite+ in action

✨ Meet Vite+: The Unified Toolchain

The biggest news from the keynote was the official unveiling of Vite+, a superset of Vite and a command-line interface that integrates a suite of essential developer tools into a single, cohesive experience, with first-class monorepo support and built-in caching.

Vite+ extends the familiar vite CLI with new commands for testing, linting, formatting, library bundling, scaffolding and task running. It all works together out of the box with zero configuration and integrated caching.

Why it matters

Vite+ removes the need for a “tooling PhD.” By unifying all the important parts of developing a production-ready application in one CLI, it dramatically simplifies project setup and reduces configuration overhead.

Teams gain a real productivity boost, while enterprises benefit from security, backed by clear SLAs and first-class caching. All without sacrificing Vite’s trademark speed and developer experience.

A standardized toolchain means better developer mobility and a shorter onboarding period for internal transfers.

Jim Dummett explains how to bridge the gap between Rust and JavaScript

⚡ Oxlint Supports JavaScript Plugins

Until now, Oxlint’s performance advantage came with a drawback: you had to continue running ESLint alongside Oxlint. Even though more than 500 linting rules were rewritten in Rust, you couldn’t use your custom ESLint rules, or any rules that had not yet been ported over.

Oxc Core Team member Jim Dummett presented at ViteConf what people had been waiting for: Oxlint’s JavaScript Plugin support is now available.

This integration leverages the speed of the Rust-based linter while allowing the use of custom rules and extensions via an ESLint-compatible JavaScript API. Next steps are filling in the gaps of missing ESLint APIs and continuously improving performance.

Why it matters

Running JavaScript plugins in Oxlint unlocks the ability to use many of the existing 280,000 ESLint plugins on npm, as well as custom rules, without friction, while still benefiting from Oxc’s blazing-fast performance. Together with true type-aware linting, Oxlint is becoming a full replacement for ESLint, thanks to its performance, features, and extensibility.

Anthony Fu is previewing the different parts of Vite DevTools

💻 A Sneak Peek at Vite DevTools

When building applications, developers must be able to inspect the build process to understand how different parts of their codebase and dependencies interact. To make this easier, Anthony Fu presents the new Vite DevTools at ViteConf, which include:

  1. A visual build analysis UI. It presents an overview of the files used to bundle your application, which Vite plugins are being used, and much more.
  2. The Vite DevTools plugin API. Framework and library authors, as well as users, will be able to add their own extensions to Vite DevTools. This will allow customization tailored to the user’s tech stack.

Why it matters

Vite DevTools will give you insights that were hard to retrieve before, such as which plugins are slowing your build down, how different Vite plugins transform your code, why your code is split up differently in the final bundle and how the different files are related to each other.

Less time spent on debugging, more time spent shipping features

In addition to in-depth analysis of your application’s build step, you will get framework- and library-specific DevTools integrations thanks to the DevTools plugin API, and you can even build your own extensions to meet your needs.

Pooya Parsa explains that Nitro v3 is "just a Vite plugin"

🌐 Full-stack Vite Apps with Nitro v3

If you’ve ever built a frontend app with Vite and then realized you need a backend for things like databases, user accounts, or you wanted to add server-side rendering (SSR), you know it can get complicated. Until now, you typically had three options:

  1. Go all-in on a meta-framework,
  2. manage a completely separate backend service,
  3. or wire up your own manual SSR setup.

Each path added complexity: managing two separate projects, dealing with separate deployments, or taking on a lot of manual plumbing and specialized knowledge.

With Nitro v3, this changes! The next-gen server toolkit turns your application into a true full-stack application with minimal friction. The best part? Nitro now comes as a Vite plugin.

The creator, Pooya Parsa, shows how you can define API routes, easily set up SSR, and utilize serverless features with zero configuration. And thanks to Nitro, developers can deploy their entire Vite application to any platform, including Cloudflare Workers or Netlify.
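As a rough sketch, the setup can be a single Vite config entry. The `nitro/vite` import path follows Nitro v3's Vite integration as presented at ViteConf, but treat it as an assumption and verify it against the current Nitro documentation.

```typescript
// vite.config.ts — minimal sketch of a full-stack Vite app with Nitro v3.
// Assumption: the plugin is exported as `nitro` from 'nitro/vite' per the
// Nitro v3 Vite integration; check the current Nitro docs before relying on it.
import { defineConfig } from 'vite';
import { nitro } from 'nitro/vite';

export default defineConfig({
  plugins: [
    // Nitro turns this Vite app into a full-stack app: file-based API routes
    // (e.g. server/api/hello.ts) and SSR with zero extra configuration.
    nitro(),
  ],
});
```

With this in place, the same project can be deployed to platforms like Cloudflare Workers or Netlify, since Nitro handles the platform-specific output.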

Why it matters

You finally get the benefits of SSR and API routes while eliminating the complexity of managing and deploying separate backend services.

You can stick to your familiar, fast Vite environment and workflows for the whole stack, without opting into more features than you need, all while being able to deploy your application universally.

Vladimir Sheremet is giving his State of Vitest presentation at ViteConf 2025

✅ Vitest 4: No more Simulated DOM

Vitest’s journey was a major highlight at ViteConf, with lead maintainer Vladimir Sheremet outlining the current State of Vitest and its path for the future. With the recent release, Vitest 4.0 delivers a stable Browser Mode, covered in Jessica Sachs’ talk at ViteConf. In turn, you can replace simulated DOM libraries like JSDOM or happy-dom with an actual browser, all without changing your test code. Among other features, support for Visual Regression Testing has been added: Vitest can now capture screenshots of your UI components and compare them against reference images to detect unintended visual changes.

Why it matters

Browser Mode changes how you test UI components, giving you significantly higher confidence that your code will work for real users because your tests run in a real browser. You no longer have to chase down subtle inconsistencies between your DOM library and actual browser behavior. With the new Visual Regression Testing functionality, you’ll have a safety net for your UI, automatically catching unwanted visual changes before they reach production.

Jacob Groß, Performance Engineer at Framer sharing why Framer switched to Rolldown and what impact it made at ViteConf 2025

🚀 Ship Better Products Faster

As applications grow in complexity, engineering teams begin hitting walls: development slows down, CI builds drag on, and product performance suffers. VoidZero provides the tooling to help teams break through, enabling them to ship better products, faster.

Better products. Product speed is critical to user experience. For Framer, migrating to VoidZero’s Rolldown delivered a major speed improvement. By simply optimizing their configuration, they cut the Largest Contentful Paint (LCP) for large sites by up to 41%, all without touching application code. Rolldown now powers over 200,000 pages built with Framer, boosting the end-user experience.

Faster. Taking an action and immediately observing the effect is key to experimentation: a fast feedback loop to iterate on. Imagine trying to find which switch turns on a light, but there’s a 3-minute delay. Swap turning on a light with linting, and that’s what Linear’s engineers experienced: it took 3 minutes to lint Linear’s codebase locally. The exact opposite of a fast feedback loop. That’s why Linear adopted VoidZero’s Oxlint, which reduced linting times by 90%, along with Rolldown-Vite and Vitest. Fast tools matter for developer experience and developer velocity.

Why it matters

Companies like Framer and Linear are known for being fast. Their products are snappy and create delightful user experiences. Under the hood, they rely on equally fast tooling. VoidZero's products have a meaningful impact on developer velocity and the end-user experience.


Thank you to everyone who joined us in Amsterdam, watched the live streams or the ​VODs on YouTube​!

The VoidZero Team

]]>
<![CDATA[Announcing Vitest 4.0]]> https://voidzero.dev/posts/announcing-vitest-4 https://voidzero.dev/posts/announcing-vitest-4 Wed, 22 Oct 2025 00:00:00 GMT

For more technical details about Vitest 4.0, see the blog post on the Vitest website or watch Vladimir's and Jessica's talk at ViteConf.

TL;DR: Vitest 4.0 is released with Browser Mode being marked stable, Visual Regression testing support, and Playwright Trace support. The Vitest team will focus on performance improvement in the upcoming quarter. This major release includes breaking changes.

]]>

For more technical details about Vitest 4.0, see the blog post on the Vitest website or watch Vladimir's and Jessica's talk at ViteConf.

TL;DR: Vitest 4.0 is released with Browser Mode being marked stable, Visual Regression testing support, and Playwright Trace support. The Vitest team will focus on performance improvement in the upcoming quarter. This major release includes breaking changes.


Vitest’s usage has grown from 7M to 17M weekly downloads in the past year. Almost one year after the last major release, we’re excited to announce Vitest 4.0 is now available.

Browser Mode is stable

Browser Mode is now stable as part of Vitest 4.0 after being in beta. This has been one of Vitest’s longest-requested features. Browser Mode allows developers to test their UI components in a real browser environment instead of a simulated one like JSDOM.

Because tests run in the browser natively, it provides much higher confidence that components will look and behave correctly for actual users. All while using the same familiar Vitest API without any code changes.

Vitest Browser Mode uses providers like Playwright to run your tests in a real browser. It isn't a new testing framework and doesn't replace E2E tools; it just changes the environment where your existing tests run.
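Enabling this looks roughly like the config below. The option names follow Vitest 4's documented `test.browser` configuration with the Playwright provider, but verify the exact shape against the current Vitest documentation.

```typescript
// vitest.config.ts — minimal Browser Mode setup, a sketch based on
// Vitest 4's documented `test.browser` config; confirm option names
// against the current Vitest docs.
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    browser: {
      enabled: true,
      // Playwright drives a real browser instead of a simulated DOM.
      provider: 'playwright',
      instances: [{ browser: 'chromium' }],
    },
  },
});
```

Existing tests keep the same Vitest API; only the environment they run in changes.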

Additional updates

In tandem with Browser Mode, Vitest 4.0 adds support for Visual Regression Testing and Playwright Traces:

  • Visual Regression Testing in Browser Mode takes screenshots of components and compares them against reference images to catch unintended visual changes during tests.
  • Playwright Traces generates detailed trace files that can be analyzed in Playwright’s Trace Viewer for easier debugging.

Other features include reporter updates, type-aware hooks, and more. Read more here.

Breaking changes

Vitest 4.0 is a major release and comes with breaking changes. Please review the migration guide before upgrading.

What’s next

The team will continue to polish Browser Mode while improving overall performance for Vitest.

We would love to hear your feedback on Vitest 4.0, and are excited to see how it helps improve your development workflow.

Connect with us:

]]>
<![CDATA[Announcing Oxlint JavaScript Plugin Support]]> https://voidzero.dev/posts/announcing-oxlint-js-plugins https://voidzero.dev/posts/announcing-oxlint-js-plugins Mon, 20 Oct 2025 00:00:00 GMT

For more technical details, implementations, and considerations for Oxlint's support for JavaScript plugins, see the blog post on the Oxc website or watch the ViteConf talk.

TL;DR: Oxlint now supports plugins written in JavaScript. Developers can customize and extend Oxlint using JavaScript, but at a speed approaching Rust, due to 'raw transfer' between Rust and JS, and other breakthroughs. Many ESLint plugins can run without any modification.

]]>

For more technical details, implementations, and considerations for Oxlint's support for JavaScript plugins, see the blog post on the Oxc website or watch the ViteConf talk.

TL;DR: Oxlint now supports plugins written in JavaScript. Developers can customize and extend Oxlint using JavaScript, but at a speed approaching Rust, due to 'raw transfer' between Rust and JS, and other breakthroughs. Many ESLint plugins can run without any modification.


VoidZero set out with an ambitious goal: to improve DX and build a unified toolchain that is fast. Developers have been eagerly waiting for a linter that runs at native speed and is a complete replacement for existing solutions. Oxlint, written in Rust, is VoidZero’s answer. The v1.0 stable version was released in June 2025, and type-aware linting followed in August 2025. Today, we’re excited to announce another major milestone:

Oxlint supports plugins written in JavaScript!

Why JavaScript plugins matter

Taking a step back, why did tools like Vite, ESLint and Rollup become so popular? While there are many contributing factors, we believe a major reason is customizability. Their plugin ecosystems empowered developers to author plugins, tailor behaviors to their specific needs, and then share those plugins via npm, building an ecosystem. This makes sense for tools written in JavaScript and used by JavaScript developers.

But what happens when the tool is written in Rust and used by JavaScript developers? How will JavaScript developers customize Oxlint? There are two naive solutions:

  1. Everyone learns Rust.
  2. Tooling authors rewrite every single plugin in Rust.

Hopefully, no explanation is needed as to why solution one is unrealistic. While VoidZero has already rewritten many commonly used linting plugins in Rust, it’s unrealistic to rewrite and maintain the entire universe of JavaScript plugins, including the “long tail” of plugins that have a small user base but are essential, as well as project-specific plugins.

Oxlint is stuck between a rock and a hard place. The Rust architecture that drastically improved performance also makes it less accessible to developers who only “speak” JavaScript: a direct departure from the company’s goal to improve DX.

Is there another solution?

JavaScript-Rust Transfer

A middle ground would let JavaScript developers continue using the language they know best while enjoying Rust’s exceptional speed. The conventional method involves doing the heavy processing in Rust and then transferring the resulting processed code over to JavaScript, where additional tweaks can be applied quickly.

Simple in theory. Disastrous in practice. The issue is the cost of the transfer between the "two worlds" of Rust and JavaScript. If the main reason to switch to Oxlint is speed, then the transfer’s overhead would largely negate the reason for switching.

What if, instead, the processed code was transparently shared between JavaScript and Rust rather than transferred? What if there was no transfer, and almost no cost at all?

Over the past year, this is exactly what the Oxc team has achieved: a unique architecture designed to eliminate the traditional language barrier. This, along with other optimizations, enabled Oxlint plugins written in JavaScript to run 86% faster, at a speed comparable to Rust. More implementation details here.

Bar chart showing Oxlint speed going from 1,360ms to 189ms with JavaScript plugins
Oxlint speed going from 1,360ms to 189ms with JavaScript plugins

Next steps

Oxlint’s JavaScript plugin functionality is in its early stages, with further optimizations and implementations in the works. Because Oxlint plugins are ESLint-compatible, users will soon be able to fully switch from ESLint to Oxlint in under an hour. Everything will simply work just as before, but an order of magnitude faster. We believe we can continue to reduce the transfer overhead to the point where it becomes almost negligible.

If a fast linter is something you are interested in, we ask that you test it. It’s easy to get started, since Oxlint can run many ESLint plugins without any modifications. An alternative API is also available to unlock even better performance. The more feedback we get, both positive and negative, the more robust Oxlint’s JavaScript plugin support becomes.
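For illustration, here is what a minimal ESLint-compatible plugin looks like. The `meta`/`rules`/`create`/`context.report` shape is the standard ESLint plugin API that Oxlint's JavaScript plugin support targets; the plugin and rule names below are hypothetical, and how you register the plugin in your Oxlint config is covered in the Oxlint documentation.

```javascript
// my-plugin.js — a minimal ESLint-compatible plugin sketch. The object
// shape follows the standard ESLint plugin API; plugin and rule names
// are made up for this example.
const noDebugger = {
  meta: {
    type: 'problem',
    docs: { description: 'Disallow `debugger` statements' },
  },
  // `create` returns an AST visitor; the linter calls each handler
  // as it walks the parsed source.
  create(context) {
    return {
      DebuggerStatement(node) {
        context.report({ node, message: 'Unexpected debugger statement.' });
      },
    };
  },
};

const plugin = {
  meta: { name: 'my-plugin' },
  rules: { 'no-debugger': noDebugger },
};

export default plugin;
```

A rule file like this is exactly what many existing ESLint plugins already contain, which is why they can run under Oxlint without modification.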

Connect with us:

]]>
<![CDATA[Announcing Vite+]]> https://voidzero.dev/posts/announcing-vite-plus https://voidzero.dev/posts/announcing-vite-plus Mon, 13 Oct 2025 00:00:00 GMT Announcing Vite+: a unified toolchain for JavaScript

Last week, we unveiled Vite+ at the first-ever in-person ViteConf in Amsterdam. In this post, we’ll share more details about what it is and the motivation behind it.

What is Vite+?

Vite+ is a command-line developer tool you can install from npm, just like Vite itself. It’s a drop-in upgrade to Vite with additional features. Imagine that, in addition to vite dev and vite build, you can now also run:

  • vite new — for scaffolding new projects, especially monorepos, with recommended structure that works best with Vite+. It also supports generating code, for example adding a new package to the monorepo, or invoking custom generators.

  • vite test — run unit tests powered by Vitest. It provides a Jest-compatible API, works out of the box with your main application, and includes comprehensive features like browser mode, sharding, visual regression testing, and more.

  • vite lint — lint your code with Oxlint, which ships with 600+ ESLint-compatible rules and is up to 100× faster than ESLint. It also supports type-aware linting, and plugins written in JavaScript with an ESLint-compatible API.

  • vite fmt — format your code with Oxfmt (to be released soon), which aims for 99%+ Prettier compatibility while offering more control and flexibility, such as finer control over line wrapping.

  • vite lib — bundle libraries with best practices baked in, powered by tsdown and Rolldown. It includes blazing-fast bundled DTS generation via the isolatedDeclarations transform.

  • vite run — a built-in monorepo task runner with intelligent caching. We’ve implemented sophisticated task input inference so that most tasks can be cached without explicit configuration—often with even better granularity than manual setups. Think Turborepo, but without having to tell the system how to invalidate the cache.

  • vite ui — GUI devtools that offer insights into module resolve / transform behavior, bundle size / tree-shaking analysis, and integration with framework-specific devtools.

All these commands work seamlessly together out of the box, without the need for complex configuration or compatibility plumbing. Vite+ inherits Vite's thriving ecosystem so it is compatible with mainstream frameworks like React and Vue, and also fullstack meta frameworks like Tanstack Start and SvelteKit. And because each command is built on or compatible with widely adopted tools, adopting Vite+ doesn’t require massive refactoring if you are already using those tools.

The entire suite of commands is built on a shared foundation to ensure cohesion and consistency. We’ve implemented the full compiler toolchain in Rust—from the parser to the resolver, transformer, minifier, and bundler — with obsessive performance tuning at every level. All of this infrastructure is open source and already widely adopted by companies like Framer (case study), Linear, Atlassian, Shopify, and more. Utilities such as parse and transform are also exported from Vite+ as APIs to support custom tooling.

You can watch the demo from Evan’s talk at ViteConf to get a sneak peek of Vite+ in action.

The problem Vite+ is trying to solve

The JavaScript tooling ecosystem has seen its fair share of fragmentation and churn over the years. As a language originally designed in 10 days, no one could have imagined we’d be building apps of today’s scale and complexity with JavaScript. Tooling complexity and performance have become real bottlenecks for companies facing ever-larger web projects but limited internal tooling resources.

These bottlenecks are more severe for companies with multiple teams, each using their own unique tooling choices. Tasks like dependency management or security reviews must be handled separately for each team. Dependency combinations drift out of sync between projects and get harder and harder to reconcile over time. If teams or projects are merged, developer time must be spent migrating tools, or the result is a convoluted, Frankensteined tool stack.

With Vite+, we aim to offer a unified solution for JavaScript tooling — so teams facing these challenges can focus on shipping products instead of wasting time evaluating, bike-shedding, configuring, and debugging their tool stacks.

Licensing

Sustainability has always been a challenge for open source developer tooling. Our goal with Vite+ is to capture a portion of the value it creates at scale for larger organizations and reinvest it into the open source projects that power Vite+.

To ensure the wider community benefits as well, Vite+ will be free for individuals, open source projects, and small businesses. We plan to offer flat annual license pricing for startups and custom pricing for enterprises. Though Vite+ will be commercially licensed, it will be source-available. Exact tier thresholds and licensing details will be announced closer to public launch.

We understand that commercialization on top of open source can raise concerns within the community. Through our years of open source work, we know this balance is delicate and built on trust. We’re committed to pursuing a commercialization path that serves the best interests of both our OSS and commercial users.

It’s important to emphasize that Vite+ is a separate, additive layer built on top of the open source projects we maintain. All existing projects — Vite, Vitest, Rolldown, and Oxc — will remain open source under MIT forever. Since Vite+ relies on these open source projects, improving Vite+ requires improving them as well. You are welcome to hold us accountable to that commitment.

Help us shape Vite+!

Vite+ is still in development, and we’re targeting a public preview in early 2026. We’re currently looking for early adopters to help test-drive it in production environments.

If you’re interested, visit viteplus.dev and get in touch!

]]>
Announcing Vite+: a unified toolchain for JavaScript

Last week, we unveiled Vite+ at the first-ever in-person ViteConf in Amsterdam. In this post, we’ll share more details about what it is and the motivation behind it.

What is Vite+?

Vite+ is a command-line developer tool you can install from npm, just like Vite itself. It’s a drop-in upgrade to Vite with additional features. Imagine that, in addition to vite dev and vite build, you can now also run:

  • vite new — for scaffolding new projects, especially monorepos, with recommended structure that works best with Vite+. It also supports generating code, for example adding a new package to the monorepo, or invoking custom generators.

  • vite test — run unit tests powered by Vitest. It provides a Jest-compatible API, works out of the box with your main application, and includes comprehensive features like browser mode, sharding, visual regression testing, and more.

  • vite lint — lint your code with Oxlint, which ships with 600+ ESLint-compatible rules and is up to 100× faster than ESLint. It also supports type-aware linting, and plugins written in JavaScript with an ESLint-compatible API.

  • vite fmt — format your code with Oxfmt (to be released soon), which aims for 99%+ Prettier compatibility while offering more control and flexibility, such as finer control over line wrapping.

  • vite lib — bundle libraries with best practices baked in, powered by tsdown and Rolldown. It includes blazing-fast bundled DTS generation via the isolatedDeclarations transform.

  • vite run — a built-in monorepo task runner with intelligent caching. We’ve implemented sophisticated task input inference so that most tasks can be cached without explicit configuration—often with even better granularity than manual setups. Think Turborepo, but without having to tell the system how to invalidate the cache.

  • vite ui — GUI devtools that offer insights into module resolve / transform behavior, bundle size / tree-shaking analysis, and integration with framework-specific devtools.

All these commands work seamlessly together out of the box, without the need for complex configuration or compatibility plumbing. Vite+ inherits Vite's thriving ecosystem so it is compatible with mainstream frameworks like React and Vue, and also fullstack meta frameworks like Tanstack Start and SvelteKit. And because each command is built on or compatible with widely adopted tools, adopting Vite+ doesn’t require massive refactoring if you are already using those tools.

The entire suite of commands is built on a shared foundation to ensure cohesion and consistency. We’ve implemented the full compiler toolchain in Rust—from the parser to the resolver, transformer, minifier, and bundler — with obsessive performance tuning at every level. All of this infrastructure is open source and already widely adopted by companies like Framer (case study), Linear, Atlassian, Shopify, and more. Utilities such as parse and transform are also exported from Vite+ as APIs to support custom tooling.

You can watch the demo from Evan’s talk at ViteConf to get a sneak peek of Vite+ in action.

The problem Vite+ is trying to solve

The JavaScript tooling ecosystem has seen its fair share of fragmentation and churn over the years. As a language originally designed in 10 days, no one could have imagined we’d be building apps of today’s scale and complexity with JavaScript. Tooling complexity and performance have become real bottlenecks for companies facing ever-larger web projects but limited internal tooling resources.

These bottlenecks are more severe for companies with multiple teams, each using their own unique tooling choices. Tasks like dependency management or security reviews must be handled separately for each team. Dependency combinations drift out of sync between projects and get harder and harder to reconcile over time. If teams or projects are merged, developer time must be spent migrating tools, or the result is a convoluted, Frankensteined tool stack.

With Vite+, we aim to offer a unified solution for JavaScript tooling — so teams facing these challenges can focus on shipping products instead of wasting time evaluating, bike-shedding, configuring, and debugging their tool stacks.

Licensing

Sustainability has always been a challenge for open source developer tooling. Our goal with Vite+ is to capture a portion of the value it creates at scale for larger organizations and reinvest it into the open source projects that power Vite+.

To ensure the wider community benefits as well, Vite+ will be free for individuals, open source projects, and small businesses. We plan to offer flat annual license pricing for startups and custom pricing for enterprises. Though Vite+ will be commercially licensed, it will be source-available. Exact tier thresholds and licensing details will be announced closer to public launch.

We understand that commercialization on top of open source can raise concerns within the community. Through our years of open source work, we know this balance is delicate and built on trust. We’re committed to pursuing a commercialization path that serves the best interests of both our OSS and commercial users.

It’s important to emphasize that Vite+ is a separate, additive layer built on top of the open source projects we maintain. All existing projects — Vite, Vitest, Rolldown, and Oxc — will remain open source under MIT forever. Since Vite+ relies on these open source projects, improving Vite+ requires improving them as well. You are welcome to hold us accountable to that commitment.

Help us shape Vite+!

Vite+ is still in development, and we’re targeting a public preview in early 2026. We’re currently looking for early adopters to help test-drive it in production environments.

If you’re interested, visit viteplus.dev and get in touch!


FAQs

  • How do frameworks benefit from Vite+?

    Vite+ is an integration on the application-level. Frameworks don't need to rewrite their internals for developers to benefit. They can provide Vite+ plugins for a smoother developer experience (DX) and tighter integration, but they won't have to switch to Vite+ themselves.

    Open-source framework repositories can use Vite+ themselves for free in their own development workflows.

  • Is Vite paid software now?

    No. Vite remains free and open-source under the MIT license. VoidZero stated from the beginning that all code and packages released as open-source will remain open-source.

    Vite+ is a separate product that provides additional features and services as a superset of Vite to provide an end-to-end JavaScript tooling solution.

  • What impact will Vite+ have on Vite and other open-source projects?

    Vite+ is built on top of the existing open-source ecosystem, and can only function if that ecosystem is healthy. As a paid product, Vite+ supports the sustainability of the underlying open-source projects it builds on.

  • Can I use Vite+ with my own tooling?

    Yes. Vite+ aims to give the best DX and performance but does not force you to use only Vite+ tools and commands. Its task runner and caching will work with arbitrary tasks, not just with built-in Vite+ commands.

]]>
<![CDATA[How Framer reduced LCP using Rolldown]]> https://voidzero.dev/posts/case-study-framer-rolldown https://voidzero.dev/posts/case-study-framer-rolldown Tue, 07 Oct 2025 00:00:00 GMT
Framer is a no-code website builder that lets customers like Miro, Perplexity, and Scale AI rapidly create and update dynamic websites without requiring developer support. The key to their growth is speed: Framer lets their customers ship sites as fast as or faster than those built by a team of the best frontend developers. That means constantly looking for ways to improve their customers’ Core Web Vitals, like Largest Contentful Paint (LCP).

The challenge: Improving bundling and chunk splitting

Like most modern websites, Framer bundles their customers’ websites to make the websites faster. The purpose of bundling is to reduce the total number of network requests. But a single big JavaScript file isn’t ideal either - in order to achieve the best possible performance, bundles are often split into “chunks”, and an optimal balance between chunk size and the number of chunks is needed. This relies on a bundler feature called “chunk splitting”.

Chunks are files of optimized code created by bundlers and loaded on demand by browsers.

Framer used an open-source bundler called esbuild, which redefined performance for JavaScript bundlers. Esbuild was significantly faster than everything else at that time, and it allowed Framer to provide a fast publishing experience when their customers made changes to their websites.

However, esbuild has limitations when it comes to chunk splitting. It sometimes produces too many chunks, hurting the website’s LCP. Framer’s performance engineers were looking for ways to improve LCP with a better chunk-splitting strategy. The team had explored alternative bundlers like Rollup, Parcel, and farm-fe - but none matched esbuild’s bundling speed, and switching would have negatively impacted their customers’ publishing speed. This was when Rolldown came into consideration.

The solution: Rolldown’s advancedChunks unlocked flexibility

Rolldown was the new kid on the block. Written in Rust, it promised bundling speed to match esbuild’s. Just as important was a Rolldown configuration feature called advancedChunks, which gives users the option to fine-tune how chunks are created. Instead of relying on a standard algorithm to bundle and split their code, users can set chunking rules to better fit their code repository. This flexibility is a powerful feature and was exactly what Framer was looking for.
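Configuration-wise, an advancedChunks setup might look like the following sketch (the group names, patterns, and thresholds are illustrative assumptions, not Framer's actual rules):

```javascript
// rolldown.config.js - hypothetical sketch, not Framer's configuration
export default {
  input: 'src/main.js',
  output: {
    dir: 'dist',
    advancedChunks: {
      groups: [
        // Pull modules imported by 3+ other modules into a shared chunk;
        // this is the kind of rule the minShareCount option enables.
        { name: 'shared', minShareCount: 3 },
        // Keep third-party dependencies together in a vendor chunk.
        { name: 'vendor', test: /node_modules/ },
      ],
    },
  },
};
```

Each group acts as a matching rule; modules that hit a rule are steered into that chunk instead of being split by the default algorithm.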

The Framer team made a bet. They became a VoidZero customer and started working directly with VoidZero’s Rolldown team. The goal was to fine-tune chunking and improve LCP. The partnership launched a series of rapid Rolldown iterations.

VoidZero prioritized Framer's feature requests and issues. When new versions of Rolldown were released, Framer quickly provided feedback on their impact on code chunks. For instance, Framer requested a rule to place code shared by multiple files into separate chunks, leading Rolldown to add the minShareCount option.

The results: Chunks decreased by 67% and LCP improved

Framer’s Rolldown implementation decreased chunks by 67% for the majority of Framer’s websites. But how did Framer progress against their ultimate goal of faster LCP? Did the optimized chunks improve caching and make load times faster for returning visitors? Let’s take a look.

| Website size | LCP* |
| --- | --- |
| Small (<=1 MB) | ~0% |
| Medium (1-2 MB) | -4% |
| Large (2+ MB) | -41% |

The bigger the website, the more room for optimization. Small Framer websites, like hobby projects or simple landing pages, already load quickly and have little overhead left to reduce. Large Framer websites, on the other hand, reduced LCP by a whopping 41%! Framer’s customers were delighted.

"I’ve been optimising every single page with small incremental gains and then suddenly BANG after the Rolldown drop. Thank you performance team!"

Tristan Owen - UX/UI Web Designer & Framer Expert

"Site performance and publish speed seem significantly better … I just didn’t expect such a visible increase in perf."

Ben Gold - Freelancer & Framer Expert

Next steps

Framer has two distinct build artifacts:

  1. Framer website, the website customers publish
  2. Framer canvas, the web app customers design their website in

All the Framer websites now use Rolldown in the build process. The next step is to migrate the Framer canvas build process to Rolldown to load faster, improve publishing times, and even reduce storage costs.

In the meantime, Framer is also looking to adopt other VoidZero projects, like the native-speed linter Oxlint and the formatter, when available.

* Metrics are based on the 75th percentile, which is how Core Web Vitals, like LCP, are calculated. Numbers are TTFB adjusted.

]]>
<![CDATA[VoidZero's 2025 Open Source Pledge Report]]> https://voidzero.dev/posts/oss-pledge-2025 https://voidzero.dev/posts/oss-pledge-2025 Mon, 06 Oct 2025 00:00:00 GMT TLDR: VoidZero is continuing our commitment to the Open Source Pledge and donating $48,360 ($3,454 per VoidZero developer) to external open source projects.

]]>


A year ago, VoidZero joined the Open Source Pledge, an initiative to encourage companies to contribute to the OSS community and ecosystem. Coming from a team of prolific open source maintainers and contributors, we understand how critical it is for companies like VoidZero to sponsor projects:

  • OSS project donations create outsized value and impact due to wide adoption.
  • Donations from individuals do not scale and have less predictability for maintainers.
  • Cultural and bureaucratic barriers prevent more companies from donating to OSS.

We’re proud to continue our commitment and increase our sponsorship in lockstep with the company's growth.

Who Are We Sponsoring?

Our funding selection criteria remain unchanged since last year. We start with the dependencies our projects directly rely on, and individual developers who we believe are producing meaningful impact for the entire JavaScript ecosystem.

In the first year, we donated $31,488 to projects across the JavaScript and Rust ecosystems. As our team has grown, so have our sponsorships. We are donating a total of $48,360 per year, which is $3,454 per VoidZero developer.

Notable projects we are now sponsoring include ESLint and typescript-eslint, whose work we rely on heavily. It is thanks to them that VoidZero can build features like Oxlint’s type-aware linting.

Here is a detailed breakdown of who we are funding:

| Project(s) | Developer | Monthly Recurring Amount |
| --- | --- | --- |
| Multiple | Anthony Fu (@antfu)* | $1,012 |
| Rollup | ** | $800 |
| UnJS | Pooya Parsa (@pi0) | $512 |
| ESLint | ** | $500 |
| pnpm | ** | $500 |
| typescript-eslint | ** | $300 |
| Lightning CSS | Devon Govett (@devongovett) | $256 |
| PostCSS | ** | $150 |

* Anthony contributes to Vite and Vitest, which are projects related to VoidZero's ecosystem, but also maintains many other projects. We are sponsoring Anthony directly for $1024/month and counting half of that towards the pledge. Additionally, we sponsor Anthony’s Open Collective account for $500/month, which is then re-distributed to underlying dependencies across the ecosystem.

**Donations made to projects' respective Open Collective accounts instead of individuals.

]]>
<![CDATA[What’s New in ViteLand: September 2025 Recap]]> https://voidzero.dev/posts/whats-new-sep-2025 https://voidzero.dev/posts/whats-new-sep-2025 Wed, 01 Oct 2025 00:00:00 GMT Welcome to another edition of What’s new in ViteLand! Every month, we recap the project updates for Vite, Vitest, Oxc, Rolldown and what’s happening in our community.

Rolldown gets up to 45% faster, and also smaller!

Rolldown is the fastest JavaScript bundler, but that does not stop us from striving for even better performance. In the last month, three significant improvements were made!

  1. Re-implemented the file metadata retrieval function for Windows. Rust's standard library implementation runs at suboptimal speed on Windows, and since Rolldown processes large numbers of files during bundling, the new retrieval function improved performance by 10-30% for Rolldown builds on Windows.
  2. Optimized multi-threaded I/O operations specifically for macOS. Counterintuitively, opening files from a large number of threads on a multi-core macOS system degrades performance instead of improving it. As a result, we reduced the number of threads to four, which made bundling 10-45% faster on macOS.
  3. Improved the source map ignore list. Building with source maps enabled is now 20-30% faster thanks to fewer Rust-to-JS function calls.

These performance improvements translate directly into faster build times and a smoother development workflow for Rolldown and Oxc users.

Further, the Rolldown team worked on reducing Rolldown's binary size. A new way of integrating Oxc into Rolldown shaved off 200KB: instead of depending on the @oxc/runtime package directly, the package's helpers are now embedded into the Rolldown binary during compilation.

Project Updates

Vite

Vitest

Rolldown

Oxc

  • Oxlint is becoming faster and faster! Oxc core team member Cam McHenry submitted two PRs which improve linting speed in real-world codebases by 5-50%.
  • Mentioning every performance improvement for Oxc would make this recap post too long. So, we've compiled a list of all the performance-related PRs, grouped by topic and performance gains.
  • Oxlint now supports the popular preserve-caught-error rule, and provides an auto-fix for it!
  • Boshen tackled and resolved a nasty memory leak issue with Oxlint when using the import plugin.
  • Oxlint can be used within various setups. For all Next.js users, there is an official Oxlint example now!
  • The technical preview for Oxlint's custom JavaScript plugins will be released in the next week. Stay tuned! A sneak peek can be found in the tweet below.

Upcoming Events

Want to catch talks and presentations from VoidZero team members? Then take a look at the following events where they will be sharing insights and giving talks:

  • Oct 2: JetBrains JavaScript Day. Want to know more about faster builds with fewer headaches? Then join Alex's talk.
  • Oct 9–10: ViteConf 2025. The first in-person ViteConf with talks from many of the VoidZero team!
  • Oct 14–16: JSConf North America. Evan shares insights on overcoming challenges in building faster JavaScript tooling.
  • Oct 25: VueFes Japan. Evan gives the event's keynote, Hiroshi prepared a deep dive into Vitest, and Yuji shares his OSS journey with Oxc.
  • Nov 20: JSNation. Meet Alex remotely to catch the news about Rolldown and Oxc.
  • Nov 28: React Advanced London. Wondering what our tooling has to offer as a React developer? Time to find out in Alex's remote talk!
  • Missed SquiggleConf this year? Check out Alex's talk about Rolldown.

From the Community

]]>
<![CDATA[What’s New in ViteLand: August 2025 Recap]]> https://voidzero.dev/posts/whats-new-aug-2025 https://voidzero.dev/posts/whats-new-aug-2025 Mon, 01 Sep 2025 00:00:00 GMT Welcome to another edition of What’s new in ViteLand! Every month, we recap the project updates for Vite, Vitest, Oxc, Rolldown and what’s happening in our community.

Oxlint: Type-Aware Linting & Custom JS Plugins

Oxlint is meant to be a full-fledged linting replacement that runs at native speed. In other words, it has to cover existing linting rules, plugins, and use cases. Otherwise, users will need both Oxlint and another linter. This month, Oxlint took two big steps towards being a comprehensive replacement:

  1. Type-aware linting released. Type-aware linting has been a big feature gap for native linters because they require reading multiple files to infer types, which negates performance gains. However, Oxlint was able to maintain fast performance by building on TypeScript’s native Go port and tsgolint. The official preview of Oxlint’s type-aware linting supports 40 type-aware rules, including no-floating-promises.

  2. Custom JS plugins support roadmap. Oxlint’s custom JS plugin support is a "have your cake and eat it too" solution that provides an ESLint-compatible API and fast performance. After months of research and prototyping, the team has found a way to run existing ESLint plugins from NPM and offer an ESLint-compatible API for custom rules and plugins. In the future, almost all ESLint plugins will work with Oxlint without modification, while maintaining the strong performance Oxlint is known for.

Project Updates

Vite

  • React Server Component support in Vite has landed via @vitejs/plugin-rsc. The goal is to offer a unified solution for every vite-based React framework.
  • @vitejs/plugin-react version 5 has been released. It now integrates @vitejs/plugin-react-oxc directly when rolldown-vite is detected, so a separate plugin is no longer needed.
  • Nobody wants their source code leaked or stolen. Due to a dev server vulnerability in various tools (including Vite), this was a real threat. Read Sapphi's retrospective blog post to learn more about the process of responding to and fixing it across the whole JS ecosystem.
  • Plugin Hooks for vite-plugin-pwa (and other Vite plugins) are in place now, speeding them up when using rolldown-vite.

Vitest

Rolldown

  • Rolldown-Vite enables native plugins out of the box. After improving them behind a native flag, and resolving all ecosystem-ci issues, the first set is considered stable enough to be enabled by default, giving all builds a speed boost without the need for any configuration.
  • Dead Code Elimination and treeshaking is key for a small bundle. In the recent Rolldown versions, multiple improvements have been made to keep your bundle size even lower.
    • Including the inlineConst feature, which inlines imported constant values during bundling (instead of referencing them). It reduces bundle size and improves runtime performance due to fewer variable lookups. This optimization will be applied by default from version 1.0.0-beta.35 on.
  • Rolldown now has a top-level tsconfig option. You can point it to your project's tsconfig path, allowing the resolver to respect aliases from compilerOptions.paths and setting defaults for transform settings. This supersedes the previously introduced resolve.tsconfigFilename option.
  • Our first case study is out: Read how PLAID Inc. moved to Rolldown and decreased their build times by 97%
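The top-level tsconfig option mentioned above might be wired up like this minimal sketch (the paths are illustrative):

```javascript
// rolldown.config.js - hypothetical minimal example
export default {
  input: 'src/main.ts',
  // Point the resolver at your project's tsconfig so aliases from
  // compilerOptions.paths are respected and transform defaults are set.
  tsconfig: './tsconfig.json',
};
```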

Oxc

  • The Rolldown team isn't the only one working on smaller bundles. Oxc's minifier now runs dead code elimination multiple times, similar to Rollup. This can reduce the bundle size even further, while adding only minimal overhead.
  • If you are using React and styled-components, your builds can become even faster, as Oxc now supports most of its features as a native transform. It can easily be enabled in Rolldown too, as this example shows.
  • Working on the performance of tsgolint can benefit everyone! Team member Cameron sent multiple PRs to the typescript-go repository to improve its performance in various cases, helping the whole ecosystem.

Upcoming Events

Stay tuned for these exciting events where VoidZero team members will be sharing insights and giving talks:

  • Sep 10: Vue Paris. Alex presents on Oxc and Rolldown at the community meetup.
  • Sep 18–19: SquiggleConf 2025. VoidZero is sponsoring, and Alex speaks about Rolldown.
    Win a ticket until Sep 5th by trying out Oxlint!
  • Sep 23: PragVue. Alex gives the keynote and talks about modern tooling.
  • Oct 2: JetBrains JavaScript Day. Want to know more about faster builds with fewer headaches? Then join Alex's talk.
  • Oct 9–10: ViteConf 2025. The first in-person ViteConf with talks from many of the VoidZero team!
  • Oct 14–16: JSConf North America. Evan shares insights on overcoming challenges in building faster JavaScript tooling.
  • Oct 25: VueFes Japan. Evan gives the event's keynote, Hiroshi prepared a deep dive into Vitest, and Yuji shares his OSS journey with Oxc.

From the Community

]]>
<![CDATA[How PLAID Cut Build Times by 97% Migrating From Rollup To Rolldown]]> https://voidzero.dev/posts/case-study-plaid-rolldown https://voidzero.dev/posts/case-study-plaid-rolldown Mon, 25 Aug 2025 00:00:00 GMT

For more details on the PLAID developer experience team’s Rolldown implementation, see the blog post on PLAID's website.

TL;DR:

  • PLAID’s server-side bundling time dropped 97% during benchmarking after switching from Rollup + Terser to Rolldown + Oxc-minify
  • Rollup’s single-threaded JS architecture was limiting performance and improvement attempts like caching were ineffective
  • Rolldown’s Rust architecture directly addressed Rollup’s constraints by enabling parallel processing and accelerating bundler operation
  • Average production wait times decreased from 5-20 seconds to 200ms
]]>

PLAID, Inc. is a Japan-based company that developed KARTE, a customer experience (CX) platform. It enables businesses to create web campaigns, personalize communications, and improve overall customer experience.

KARTE features a no-code editor called “Flex Editor”, which lets the campaign manager customize pop-up campaigns on their website without coding. Pop-up changes are stored in the database and deployed to the campaign manager’s website. The latest pop-up version is loaded when a user visits the website.

Karte’s Flex Editor

The challenge: Slow server-side bundling

The Flex Editor’s biggest issue was performance. The same Rollup-powered compiler bundles the pop-up code in two separate cases:

  1. When the campaign manager made changes in the Flex Editor
  2. When the campaign manager deployed changes to their website

The PLAID developer experience team made client-side bundling optimizations to the Rollup-powered compiler for a snappy Flex Editor preview experience. However, the same optimizations could not be used for the pop-up’s server-side bundling because of different requirements. For example, Flex Editor client-side bundling had minification disabled because the code was already on the client.

The Rollup-powered compiler could take 5-20 seconds to bundle on the server before deploying the pop-up code. In other words, the campaign manager might be watching a deployment spinner for up to 20 seconds while their changes deploy! A far cry from the Core Web Vitals recommendation of <200ms interaction delays.

The solution: Rolldown’s Rust architecture

The PLAID developer experience team first tried caching to improve Rollup’s performance. Rollup’s moduleParsed hook was used to cache the AST each time a module was fully parsed. However, response times remained 5 - 20 seconds even with caching. It turns out campaign managers were frequently clicking save or using the Ctrl+S shortcut, which starts the server-side bundling process and bypasses the cached AST.
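A minimal sketch of that caching approach, assuming Rollup's standard plugin shape (the cache keying is illustrative, not PLAID's actual code):

```javascript
// Hypothetical AST cache built on Rollup's moduleParsed hook.
const astCache = new Map();

function astCachePlugin() {
  return {
    name: 'ast-cache',
    // Rollup calls this once per module after parsing completes.
    moduleParsed(moduleInfo) {
      // moduleInfo.ast holds the parsed AST when available.
      astCache.set(moduleInfo.id, moduleInfo.ast);
    },
  };
}
```

A later build could hand a cached AST back to Rollup (for example, by returning { code, ast } from a transform hook) to skip re-parsing. As described above, frequent saves bypassed the cache in practice.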

Another solution was needed. Rollup’s single-threaded JavaScript architecture was the performance bottleneck. The bundler was not designed with parallel bundling in mind. CPU-intensive operations, like minification, further degraded performance.

The PLAID developer experience team started exploring alternatives to Rollup for server-side bundling. Rolldown emerged as a possible solution because it specifically addressed Rollup’s performance challenges. Rolldown’s Rust architecture enables parallel processing and accelerates bundler operations like parsing JavaScript/TypeScript and removing dead code.

"I verified Rolldown’s bundling behavior in my OSS projects, confirming proper operation and faster bundling than Rollup"

Kazupon, Engineer, PLAID

For most projects, migrating from Rollup to Rolldown is a straightforward, drop-in replacement. Rolldown maintains API compatibility with the major existing Rollup plugins for easy migration. However, KARTE uses the same Rollup-powered compiler for both the client and server. They wanted to keep Rollup for the client-side bundling and continue to use a single source, avoiding code complexity and higher maintenance load.

The team accomplished this with a plugin that replaces import { rollup } from 'rollup' with import { rolldown as rollup } from 'rolldown' based on the environment. The custom switching logic eliminated the need for Rolldown-specific code in the compiler and worked because of Rolldown’s compatibility with Rollup.
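The switching logic can be sketched as a small source transform (the function name and regex are illustrative; PLAID's actual plugin may differ):

```javascript
// Rewrite `import { rollup } from 'rollup'` to the rolldown-backed
// equivalent when the environment targets server-side bundling.
function rewriteRollupImport(source, useRolldown) {
  if (!useRolldown) return source;
  return source.replace(
    /import\s*\{\s*rollup\s*\}\s*from\s*['"]rollup['"]/g,
    "import { rolldown as rollup } from 'rolldown'"
  );
}

console.log(rewriteRollupImport("import { rollup } from 'rollup';", true));
// → import { rolldown as rollup } from 'rolldown';
```

Because the rewritten binding keeps the name `rollup`, the rest of the compiler needs no Rolldown-specific code.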

The results: 97% faster build times

After switching from Rollup to Rolldown, the build times improved dramatically. Switching from Rollup + Terser to Rolldown + Oxc-minify dropped the average build time from 1150ms to 40ms, or a 97% decrease, during benchmarking.

However, the size of the bundle increased when using Rolldown + Oxc-minify. The team had to consider trade-offs. Their primary goal was to improve user experience. Even though the bundle would be slightly larger, the improved speed and reduced infrastructure costs drove the decision to adopt Rolldown.

The results were consistent in production. The previous 5-20 second waiting time dropped to less than 200ms, meeting the Core Web Vitals recommendation of 200ms interaction delays.

Next steps

Rolldown is in beta, and the PLAID developer experience team continues to monitor its progress and make updates. Improvements to Rolldown + Oxc-minify have already been made, which reduced bundle sizes.

In addition to using Rolldown in their compiler, the PLAID developer experience team replaced Vite with Rolldown-Vite in the Flex Editor project. Frontend build time got 16x faster!

]]>
<![CDATA[Announcing Oxlint Type-Aware Linting]]> https://voidzero.dev/posts/announcing-oxlint-type-aware-linting https://voidzero.dev/posts/announcing-oxlint-type-aware-linting Fri, 22 Aug 2025 00:00:00 GMT

For more technical details, implementations, and considerations for Oxlint's type-aware linting, see the blog post on the Oxc website.

TL;DR: Oxlint with type-aware linting is now available and supports 40 long-awaited rules including no-floating-promises. Oxlint uses tsgolint, which @auvred initially prototyped as typescript-eslint/tsgolint and generously offered to continue its development under the Oxc organization.

]]>



The VoidZero team is excited to announce that oxlint has taken a big step towards being a full-fledged Rust-based linting replacement with type-aware linting. This release unlocks 40 long-awaited rules including no-floating-promises.

Unlocking type-aware linting

Type-aware linting has been one of the biggest feature gaps for native linters like oxlint. Traditional lint rules only review one file at a time, which is fast and parallelizable. However, type-aware lint rules call TypeScript’s types API to infer types, which may require reviewing every file. As a result, type-aware lint rules are more capable than traditional lint rules, but slower.
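As a hypothetical illustration (not code from the announcement), here is the kind of bug a type-aware rule such as no-floating-promises flags, and why it needs type information:

```typescript
let written = false;

// Imagine this function lives in another file: the linter must resolve its
// inferred return type (Promise<void>) to know the call below is "floating".
async function save(): Promise<void> {
  written = true;
}

function onClick(): void {
  save(); // no-floating-promises would flag this: never awaited or handled
  void save(); // OK: the promise is explicitly discarded with `void`
}
```

A purely syntactic linter cannot tell save() apart from a synchronous function; only the type checker knows it returns a Promise.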

oxlint’s first approach to solving type-aware linting’s slow performance was implementing its own type-checker. After multiple attempts, this approach was abandoned because maintaining a type-checker on par with a fast-moving target like TypeScript is not feasible.

Other approaches, like TypeScript inter-process communication, were also considered and abandoned. The key to unlocking type-aware lint rules turned out to be TypeScript’s native Go port and tsgolint.

Building on tsgolint

The tsgolint project is an experimental type-aware linter written in Go and initially prototyped by @auvred as typescript-eslint/tsgolint. However, the typescript-eslint team decided not to allocate development resources to this prototype, as they plan to continue their work on typescript-eslint for typed linting with ESLint.

The VoidZero team contacted @auvred about a forked, scoped-down version adapted for oxlint. In a true “standing on the shoulders of giants” moment, @auvred generously offered to continue its development under oxc-project/tsgolint.

Performance

Initial tests of oxlint with type-aware linting show that repositories that previously took 1 minute to lint with typescript-eslint now finish in <10 seconds.

Using projects from oxlint-ecosystem-ci:

| Project | Files | Time |
| --- | --- | --- |
| napi-rs | 144 | 1.0s |
| preact | 245 | 2.7s |
| rolldown | 314 | 1.5s |
| bluesky | 1152 | 7.0s |

Next steps

The VoidZero team will continue to develop and improve upon tsgolint. For v1.0 release, we will:

  • Address performance issues for large monorepos
  • Add the ability to configure individual rules
  • Further validate the correctness of each individual rule
  • Add IDE support
  • Ensure overall stability

Acknowledgements

The VoidZero team would like to extend our gratitude to:

  • The TypeScript team for creating typescript-go.
  • The typescript-eslint team for their heartwarming support.
  • @auvred for creating tsgolint.
  • @camchenry for the oxlint + tsgolint integration.
  • @camc314 for work on performance issues.

Join the community

The VoidZero team would love to hear your feedback on oxlint and type-aware linting, and is excited to see how it helps improve your development workflow.

Connect with us:

]]>
<![CDATA[What’s New in ViteLand: July 2025 Recap]]> https://voidzero.dev/posts/whats-new-jul-2025 https://voidzero.dev/posts/whats-new-jul-2025 Tue, 05 Aug 2025 00:00:00 GMT Welcome to another edition of What’s new in ViteLand! Every month, we recap the project updates for Vite, Vitest, Oxc, Rolldown and what’s happening in our community.

Vite+ at Vue China
Evan You talking about the big picture and hinting at Vite+ at VueConf China

What’s Vite+?

The Vite Ecosystem will gather in Amsterdam on October 9-10 for the first-ever in-person ViteConf. We had 3 incredible ViteConf online editions, and we can’t wait to meet in real life.

It’s also where we’ll be dropping all the details about Vite+. Be the first to discover what it is and how it can improve your team’s DX.

There’s a packed lineup of speakers, including Eric Simons, CEO of Bolt.new/StackBlitz, and Mathias Biilmann, CEO of Netlify. We’ll be discussing the future of web development through topics like next-generation tooling, agent experience, and more.

Not to mention the world premiere of CultRepo’s Vite Documentary. The never-before-told backstory of Vite. Featuring the Vite Team and authors of Svelte, Solid, Astro, and more. If you want a sneak peek, check out the trailer that just got released.

Stay tuned.

Project Updates

Vite

  • Vite 7 is out with continued Environment API developments, including the new buildApp hook that lets plugins coordinate the building of environments. Vite also bumps its required Node version, as Node.js 18 has reached EOL. This also means that Vite now ships as an ESM-only package. So, no excuse not to do the same for your own libraries!
  • Are you on the Vite Land Discord already? If not, you should join now and get your official Vite Discord server tag and mingle with like-minded devs.
  • For the first time in history, Vite's weekly downloads surpassed those of Webpack. Who would've expected that 5 years ago when Vite started as a side project?

Rolldown

  • Rolldown-Vite supports tsconfig path resolution out of the box. Use the path resolutions defined in your tsconfig by setting resolve.tsconfigPaths. No extra plugin needed anymore!
  • Using Yarn Plug-and-Play? Then you'll be happy to hear that Rolldown supports it out of the box now! No need to use nodeLinker: node-modules anymore.
  • vite-plugin-node-polyfills got an update which ensures that it will use Rolldown's built-in features if it runs via rolldown-vite, yielding better performance while not compromising on compatibility.
  • Rolldown now allows you to transform your top-level variables into var declarations. Sounds like legacy code? Not really: this can lead to better performance if you have many top-level variables that are not used until later in the code. The downside: it doesn't work well with circular references, which is why it is opt-in.
  • Can't get enough of the speed improvements? Thanks to some in-depth JavaScript engine optimization trickery, Rolldown's startup time decreased by 2.1x. This leads to a faster Time to Interactive, better cold-starts for serverless and also benefits your dev server startup speed.
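The tsconfig path resolution mentioned above is a one-line opt-in. A minimal config sketch (assuming your path aliases are already defined in tsconfig.json; consult the rolldown-vite docs for the exact option semantics):

```js
// vite.config.js (with rolldown-vite)
export default {
  resolve: {
    // Resolve bare imports against compilerOptions.paths from tsconfig.json,
    // replacing extra plugins such as vite-tsconfig-paths.
    tsconfigPaths: true,
  },
}
```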

Oxc

  • Type-aware linting comes to Oxlint! This is the result of a collaboration between the Oxc team and @auvred from the typescript-eslint team. Type-aware rules, such as no-floating-promises, can be fully covered and integrated like other Oxlint rules for a great DX. The upcoming first version will include two rules to start with; the next version will then include all type-aware rules from typescript-eslint. The best news: no slowdown was observed when it was tested on large repositories such as the VS Code repo.
  • JS custom rules are in the works, with an API that aims to be ESLint-compatible. We invested heavily in the underlying implementation to make it fast without the typical data-passing drawback of JS-in-native plugins. The first prototype looks promising, but more testing is needed before we can release it.
  • Multiple minor versions of Oxlint were released. These include newly ported rules, additional auto-fixers, dozens of bug fixes, and the groundwork for custom JS plugins as mentioned above.

Vitest

  • Visual Regression Testing is now available in the latest Vitest beta! You can now write tests that compare screenshots of your components, ensuring that visual changes are intentional rather than accidental, without any extra tools. This wouldn't be possible without Vitest's browser mode.
  • Speaking of @vitest/browser: the experimental feature reached the milestone of 1 million weekly npm downloads, while simplifying developer setups around the world. Time to make it stable? Soon™

From the Community

]]>
<![CDATA[Announcing Oxlint 1.0]]> https://voidzero.dev/posts/announcing-oxlint-1-stable https://voidzero.dev/posts/announcing-oxlint-1-stable Tue, 10 Jun 2025 00:00:00 GMT TL;DR: The first stable version of Oxlint has been released! With a 50–100x performance improvement over ESLint, support for over 500 ESLint rules, and adoption by major companies like Shopify, Airbnb, and Mercedes-Benz, you should give it a try. Get started now.

]]>
TL;DR: The first stable version of Oxlint has been released! With a 50–100x performance improvement over ESLint, support for over 500 ESLint rules, and adoption by major companies like Shopify, Airbnb, and Mercedes-Benz, you should give it a try. Get started now.


Oxlint is a Rust-powered linter for JavaScript and TypeScript, designed to be fast and simple to adopt. Since its first announcement back in December 2023, Oxlint has undergone significant improvements and is now shipping its first stable version, 1.0. In addition to the stable release, we also want to announce that Oxlint has a dedicated full-time maintainer, Cameron, and a growing core team working on maintaining and improving the linter.

Real-World Impact

We are extremely proud of the performance of Oxlint and its impact on real, large-scale codebases, which has led to reduced CI costs.

We are thankful for our 5,200 early adopters and for companies and projects such as:

  • Shopify, where the front-end platform team uses Oxlint in the Shopify admin console.
  • Airbnb, where they use multi-file analysis oxc/no-barrel-file and import/no-cycle on their 126,000+ files, which completes in 7s on CI. ESLint's implementation of these rules times out.
  • Mercedes-Benz, where they observed a 71% decrease in lint time when swapping from ESLint to Oxlint, with some projects seeing up to a 97% speedup.
  • Large open source projects, from runtimes like Bun to frameworks like Preact.

On the largest repository we found, Oxlint reported:

Finished in 22.5s on 264925 files with 101 rules using 10 threads.

Based on real-world cases posted on X and Bluesky, Oxlint runs at approximately 10,000 files per second, depending on the total number of threads used.

Quick Start

Oxlint is perfect for developers who want to start linting their code without spending hours configuring tools. With zero setup required, you can start catching issues immediately:

Run it, no config required.

```sh
# npm
$ npx oxlint@latest

# pnpm
$ pnpm dlx oxlint@latest

# yarn
$ yarn dlx oxlint@latest

# bun
$ bunx oxlint@latest

# deno
$ deno run npm:oxlint@latest
```

While no setup or configuration is needed, Oxlint is configurable via an .oxlintrc.json file, which is useful for larger projects or projects that require more customization. This configuration format is based on ESLint v8's JSON config, making migration easy and familiar. Each source file is linted with the nearest applicable configuration, and you can use overrides to target specific glob patterns. You can also extend shared configs to keep teams consistent.
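For illustration, a small .oxlintrc.json sketch showing rules, a glob-based override, and plugin selection (the specific plugins and rules here are arbitrary examples, not a recommended baseline):

```json
{
  "plugins": ["typescript", "react"],
  "rules": {
    "eqeqeq": "error",
    "no-console": "warn"
  },
  "overrides": [
    {
      "files": ["**/*.test.ts"],
      "rules": { "no-console": "off" }
    }
  ]
}
```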

For projects already using ESLint, oxlint-migrate can be used to migrate an existing ESLint flat-config file to Oxlint. Additionally, eslint-plugin-oxlint can disable overlapping ESLint rules while both linters are used together. It is recommended to run oxlint && eslint to benefit from Oxlint's faster feedback cycle.
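In package.json, that recommendation could look like the following (the script name and ESLint invocation are illustrative). Because && short-circuits, ESLint only runs once the much faster Oxlint pass comes back clean:

```json
{
  "scripts": {
    "lint": "oxlint && eslint ."
  }
}
```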

For more detailed instructions on how to use Oxlint and integrate it with your project or editor, check out the installation guide.

Versioning

Unlike libraries that ship runtime code, a linter only changes the diagnostics it returns. Oxlint adheres to semantic versioning:

  • Patch releases: Bug fixes only.
  • Minor releases: Expand rule coverage and diagnostics, without requiring configuration changes.
  • Major releases: CLI or configuration changes that may require migration. Note that minor releases can still break your CI if newly added rules uncover previously hidden issues. Learn more in our Versioning guide.

Highlights

Comprehensive Rule Coverage

Oxlint includes over 500 rules from various sources:

  • Complete ESLint rule set, including TypeScript-specific rules from typescript-eslint (excluding type-checked rules).
  • Popular plugin rules from eslint-plugin-unicorn, eslint-plugin-jsdoc, eslint-plugin-react, eslint-plugin-react-hooks, eslint-plugin-jest, and eslint-plugin-import
  • Unique Oxlint rules like bad comparison sequence, const comparisons, and only used in recursion

Flexible Configuration

Configure Oxlint through .oxlintrc.json files with support for:

  • Nested configurations that apply to specific directories
  • Override patterns for targeting specific file types or locations
  • Shared configuration extending for team consistency

Editor Integration

First-class editor support with extensions for:

Helpful Diagnostics

Oxlint is built to deliver clear, actionable error messages - not just describing the issue, but visualizing it and suggesting how to fix it.

CLI Demo Oxlint running in the terminal with detailed error reporting

Benchmark

Our benchmark reveals that Oxlint is around 50–100 times faster than ESLint with the same setup.

Tool                    Time
oxlint (multi thread)   615.3 ms
oxlint (single thread)  1.840 s
eslint                  33.481 s

Roadmap

Oxlint 1.0 is just the beginning! While it is stable, we still have important features and improvements planned for future releases:

Custom Rules – JavaScript plugin support is coming soon, enabling teams to write custom rules that integrate seamlessly with Oxlint's architecture.

Performance Optimizations – Continued improvements to parsing and analysis speed.

Fine-grained (per-glob) Configuration – support for ESLint v9-style per-glob configuration.

Acknowledgements

Oxlint 1.0 represents the collective effort of over 200 contributors who have shaped this project. We're grateful for every bug report, feature request, and code contribution.

Special recognition goes to:

  • @branchseer for implementing the multi-file analysis runtime.
  • @camc314, @mysteryven, and @shulaoda for implementing many sophisticated lint rules, testing, and constantly improving everything.
  • @camchenry for implementing nested configuration support.
  • @DonIsaac for improving configuration, documentation and website, and for representing Oxc at SquiggleConf 2024.
  • @leaysgur for the RegExp parser and JSDoc plugin.
  • @Sysix for maintaining eslint-plugin-oxlint and significant contributions to the language server and VSCode extension.
  • @u9g and @rzvxa for implementing control flow graph analysis.

Join the Community

We'd love to hear your feedback on Oxlint and are excited to see how it helps improve your development workflow. Connect with us:

Your feedback drives Oxlint's evolution.

Give It a Try

To get started, follow the installation guide, or learn more about the Oxc project.

]]>
<![CDATA[Announcing Rolldown-Vite]]> https://voidzero.dev/posts/announcing-rolldown-vite https://voidzero.dev/posts/announcing-rolldown-vite Fri, 30 May 2025 00:00:00 GMT TL;DR: Try out the Rolldown-powered Vite today by using the rolldown-vite package instead of the default vite package. It is a drop-in replacement, as Rolldown will become the default bundler for Vite in the future. Switching should reduce your build time, especially for larger projects. Reach out to discover production use cases!

]]>
TL;DR: Try out the Rolldown-powered Vite today by using the rolldown-vite package instead of the default vite package. It is a drop-in replacement, as Rolldown will become the default bundler for Vite in the future. Switching should reduce your build time, especially for larger projects. Reach out to discover production use cases!


Over the last year, we've been working on Rolldown, a Rust-based next-generation bundler, as part of a broader effort to modernize Vite's core. Alongside Rolldown, we've developed Oxc, a collection of high-performance JavaScript tools that includes a parser, transformer, resolver, and minifier, as well as a linter and, soon, a formatter. It acts as a foundational layer for Rolldown, providing the necessary building blocks for efficient JavaScript and TypeScript processing.

Today, we're excited to announce that the Rolldown-powered Vite version has reached initial feature parity with today's Vite. This means you can try it as a drop-in replacement and experience the benefits right away as part of a technical preview.

Thanks to early adopters, we were able to test rolldown-vite with a range of projects, from basic setups to large-scale enterprise apps. The results have been impressive, with production build time reductions from 3x to 16x, and memory usage during the build process cut by up to 100x. Take a look at the real-world impact section for more details.

We encourage you to try out rolldown-vite and share your feedback to contribute to the future development of Vite's bundling infrastructure.

Using rolldown-vite

Getting started with rolldown-vite is straightforward. If you have an existing Vite project, you can replace the vite package with rolldown-vite by using an alias in your package.json:

```json
{
  "dependencies": {
    "vite": "npm:rolldown-vite@latest"
  }
}
```

If you use VitePress, or a meta-framework that lists Vite as a peer dependency, you can use overrides to replace the vite package with rolldown-vite in your project:

For npm:

```json
{
  "overrides": {
    "vite": "npm:rolldown-vite@latest"
  }
}
```

For pnpm:

```json
{
  "pnpm": {
    "overrides": {
      "vite": "npm:rolldown-vite@latest"
    }
  }
}
```

For Yarn:

```json
{
  "resolutions": {
    "vite": "npm:rolldown-vite@latest"
  }
}
```

For Bun:

```json
{
  "overrides": {
    "vite": "npm:rolldown-vite@latest"
  }
}
```

That's it! Now you can continue using Vite as you normally would, but with the added performance benefits of Rolldown. It's likely that you will see some warning messages about not-yet-supported options or deprecated APIs - we will continue to smooth those out across the ecosystem.

rolldown-vite is currently distributed as a separate package to allow for rapid iteration and to keep feedback and issues separate from the main vite package. This separation helps ensure stability for existing users while the Rolldown integration matures. Once rolldown-vite is stable, its changes will be merged into Vite and the separate package will be deprecated.

It also follows Vite's major and minor version numbers to maintain compatibility, but its patch versions are independent and may introduce breaking changes as development continues. For the latest updates and details, refer to the rolldown-vite changelog.


Ensuring Compatibility

Compatibility is a top priority for rolldown-vite. To ensure a smooth experience, we created a forked version of Vite's ecosystem CI and ran it against rolldown-vite. We have the tests passing for most frameworks and plugins, but note that some frameworks or advanced use cases may still have compatibility gaps. We also recommend checking the Rolldown migration guide for the latest compatibility notes, known issues, and migration tips. If you run into problems, please report them so we can improve support across the ecosystem.

esbuild as optional dependency

In the current stable version of Vite, esbuild is a core dependency used for tasks like transforming and minifying production builds, as well as powering parts of the development server. Many Vite plugins also rely on esbuild through utility functions provided by Vite, such as file transformations.

With rolldown-vite, esbuild is no longer required. Instead, all internal transformations and minification are handled by Oxc, leading to improved performance using a single foundational layer. This means you don't need to install esbuild as a dependency unless you're using a Vite plugin that explicitly requires it and doesn't yet support Oxc transforms.

We are actively collaborating with plugin and framework authors to ensure that Vite plugins automatically leverage Oxc transforms when using rolldown-vite, resulting in faster builds.

Plugin Authors

If you are a plugin author, whether of a Vite or a Rollup plugin, you can start testing your plugins with rolldown-vite right away. A lot of plugins should work out of the box, but some might need tweaking, either for compatibility or performance reasons. Consult our plugin author guide for more details.
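To make "work out of the box" concrete, here is a minimal plugin sketch (a hypothetical example, not taken from the docs) that relies only on the standard transform hook, so the same object runs unchanged under Rollup, Vite, and rolldown-vite:

```javascript
// Hypothetical plugin for illustration: turn imported .txt files into
// JS modules that export the file contents as a string.
function txtPlugin() {
  return {
    name: 'txt-plugin',
    transform(code, id) {
      // Only handle .txt modules; returning null defers to other plugins.
      if (!id.endsWith('.txt')) return null
      return {
        code: `export default ${JSON.stringify(code)}`,
        map: null, // no source map needed for trivially generated code
      }
    },
  }
}
```

Plugins that instead reach into bundler internals or rely on esbuild-based transforms are the ones more likely to need the tweaks described in the plugin author guide.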


Real-World Impact

rolldown-vite is still in development, but early adopters, ranging from side projects to large-scale enterprise apps, are already seeing remarkable results.

Some highlights:

  • GitLab reduced build time from 2.5 minutes to just 40 seconds and cut their memory usage by 100x.
  • Excalidraw's build dropped from 22.9 seconds to 1.4 seconds (16x faster).
  • PLAID Inc. saw one frontend's build time fall from 1 minute 20 seconds to 5 seconds (16x faster).
  • Appwrite builds went from over 12 minutes to just 3 minutes, with memory usage being slashed by 4x.
  • Particl achieved a nearly 10x speedup compared to Vite (and almost 29x compared to Next.js), with builds dropping from over a minute to just 6 seconds.

These results show not only much faster builds, but in some cases, orders of magnitude less memory usage. For more details or to share your own results, visit the vitejs/rolldown-vite-perf-wins repository.

Oh and fun fact - this blog post you are reading right now is built with VitePress running on top of Rolldown-Vite, and the production build takes only 1.8s on Netlify.


What Comes Next?

Vite is generally known for its unbundled native ESM dev server, which is responsible for its fast startup time and almost instant feedback. Nevertheless, we've seen limitations of this approach for projects at unconventional scale, especially in enterprise setups. To address these, we are working on a full-bundle mode for the dev server. With Rolldown's performance, this mode aims to improve dev server startup times, especially for large projects, while maintaining or even enhancing startup speed for small and medium projects.

Alongside this, we plan to "rustify" more of Vite’s internals to reduce communication overhead and unlock even greater performance gains.

Roadmap for Rolldown in Vite

We've planned three phases for the transition to Rolldown, each designed to ensure a smooth migration while gathering valuable feedback from the community:

  1. Phase One (Current): rolldown-vite is available as a separate package for early adopters. This allows us to gather feedback and make improvements based on real-world usage.
  2. Phase Two: All changes from rolldown-vite will be merged into the main Vite codebase when considered stable. They will also come with an opt-in full-bundle mode for development. rolldown-vite will be deprecated at this point.
  3. Phase Three: The full-bundle mode will become the default for Vite.

At the time of writing, we expect each phase to last several months. Keep in mind that the exact duration will be determined by various factors, most importantly feedback from the community and real-world usage, as well as stability and compatibility.


Give It a Try

We encourage you to try rolldown-vite in your projects today! Your feedback will help shape the future of Vite's bundling infrastructure. In case you encounter any issues, such as missing or broken features, unclear error messages or performance degradations, please make sure to report these in the rolldown-vite repository (not the main Vite repo). If you want to pose questions or discuss in real time, make sure to join the Rolldown Discord.

]]>
<![CDATA[VoidZero and NuxtLabs join forces on Vite Devtools]]> https://voidzero.dev/posts/voidzero-nuxtlabs-vite-devtools https://voidzero.dev/posts/voidzero-nuxtlabs-vite-devtools Thu, 03 Apr 2025 00:00:00 GMT TL;DR: Through our partnership with NuxtLabs, Anthony Fu will work on creating Vite DevTools, a tool that will offer deeper and more insightful debugging and analysis for all projects and frameworks built on top of Vite.

]]>
TL;DR: Through our partnership with NuxtLabs, Anthony Fu will work on creating Vite DevTools, a tool that will offer deeper and more insightful debugging and analysis for all projects and frameworks built on top of Vite.


At VoidZero, we place a strong emphasis on developer experience. While Vite and Rolldown already deliver outstanding performance and modern features, we believe that transparency in the bundling process is just as essential. By making this process easy to inspect and understand, we aim to empower developers with the knowledge and tools they need to optimize their applications and enhance the overall end-user experience.

To achieve that, VoidZero is partnering with NuxtLabs to build Vite DevTools - a powerful debugging and analysis toolkit aimed at improving both developer workflows and application performance. Designed for universal compatibility across all frameworks built on Vite, Vite DevTools will deliver deeper visibility into the build process, helping developers produce leaner, more efficient frontend bundles that result in faster, more seamless experiences for end users.

As part of this collaboration, Anthony Fu, who previously created Nuxt DevTools and vite-plugin-inspect, will be working part-time on Vite DevTools. vite-plugin-inspect is a powerful tool that allows developers to inspect the internal workings of Vite, providing insights into how their applications are being built and optimized. Anthony's extensive experience in the Vite ecosystem, including his work on multiple devtools, will be instrumental in advancing this project.

Vite DevTools aims to deliver a rich set of features that provide actionable insights. Initial capabilities under consideration include:

  • Plugin pipeline inspection (adapted from vite-plugin-inspect)
  • Bundle analysis, leveraging Rolldown's metadata
    • Tree-shaking visualization
    • Barrel files detection
    • CJS/ESM adoption visualization
  • Visualization of Vite environment configurations
  • Actionable suggestions to improve bundle size and front-end performance
  • Time-based bundle reporting
  • Availability during both development and build phases via a command-line flag or interface

We invite the community to follow the progress and contribute ideas via this discussion.

We’re thrilled to collaborate with NuxtLabs and Anthony to push the boundaries of what modern tooling can offer, and we look forward to making Vite DevTools an indispensable part of every developer's workflow.

]]>
<![CDATA[VoidZero joins the Open Source Pledge]]> https://voidzero.dev/posts/oss-pledge https://voidzero.dev/posts/oss-pledge Mon, 14 Oct 2024 00:00:00 GMT TL;DR: VoidZero has joined the Open Source Pledge and is donating $3,000 per developer at VoidZero per year to external open source projects.

]]>
TL;DR: VoidZero has joined the Open Source Pledge and is donating $3,000 per developer at VoidZero per year to external open source projects.


Open source is at the heart of VoidZero. The company is founded on the open source projects we created and maintain, and our team members have all been prolific open source contributors and maintainers.

Outside of VoidZero, I have also maintained independent projects that are sustainable largely thanks to donations. From that experience a few things became clear to me:

  • For most OSS projects, donations capture a disproportionately small fraction of the value they create.
  • Donations from individual users do not scale.
  • There are cultural and bureaucratic barriers that are preventing more companies from donating to OSS.

This is why I am excited to see companies like Sentry and the others who have already joined the pledge actively working to change the norm.

Today, we are happy to announce that VoidZero is joining this effort too.

Who Are We Sponsoring?

VoidZero is already sponsoring many contributors to our projects, e.g. Vite and Oxc. For the OSS pledge, however, donations are only eligible when made to projects that are not controlled by VoidZero. We are starting with the dependencies that our projects directly rely on, and individual developers that we believe are producing meaningful impact for the entire JavaScript ecosystem. As we scale as a company in the future, we will increase our donations accordingly to cover a wider range of projects and foundations.

We are donating a total of $31,488 per year, which is $3,148 per developer given our current team size. Here is a detailed breakdown of who we are funding:

Project(s)     Developer                      Monthly Recurring Amount
NAPI-RS        LongYinan (@Brooooooklyn)      $1,000
Multiple       Anthony Fu (@antfu)*           $512
UnJS           Pooya Parsa (@pi0)             $256
Lightning CSS  Devon Govett (@devongovett)    $256
Rollup         **                             $300
pnpm           **                             $150
PostCSS        **                             $150

*Anthony contributes to Vite and Vitest, which are projects related to VoidZero's ecosystem, but also maintains many other projects. We are sponsoring Anthony for $1024/month and counting half of that towards the pledge.

**Donations made to the projects' respective Open Collective accounts instead of to individuals.


This is just the beginning of our long-term commitment to open source. We look forward to improving the process, growing the budget, and supporting more projects in the future. We also hope that this will inspire more companies to join the pledge. Let's collectively change how businesses interact with open source for the better.

]]>
<![CDATA[Announcing VoidZero - Next Generation Toolchain for JavaScript]]> https://voidzero.dev/posts/announcing-voidzero-inc https://voidzero.dev/posts/announcing-voidzero-inc Tue, 01 Oct 2024 00:00:00 GMT TL;DR: I have founded VoidZero Inc., a company dedicated to building an open-source, high-performance, and unified development toolchain for the JavaScript ecosystem. We have raised $4.6 million in seed funding, led by Accel.

]]>
TL;DR: I have founded VoidZero Inc., a company dedicated to building an open-source, high-performance, and unified development toolchain for the JavaScript ecosystem. We have raised $4.6 million in seed funding, led by Accel.


Fifteen years ago, when I started building apps with JavaScript, it was mostly a browser-based scripting language. Today, it has evolved into the most widely used language in the world, powering everything from web and mobile apps to game development and even IoT.

Over the years, many excellent tools have emerged to address the increasing scale and complexity of JavaScript applications. However, the ecosystem has always been fragmented: every application relies on a myriad of third-party dependencies, and configuring them to work well together remains one of the most daunting tasks in the development cycle.

As the author of one of the most widely used frontend frameworks, I’ve spent significant effort researching every layer of the JavaScript tooling stack, assembling hundreds of dependencies, and designing complex abstractions on top of them. The goal was always to give end users a cohesive, out-of-the-box development experience. These efforts eventually led to the creation of Vite in 2020.

Fast forward four years, Vite is now one of the most popular build tools for web development, with over 15 million downloads per week and a vast ecosystem. In addition to being the go-to choice for single-page applications built with React and Vue, Vite is also powering meta frameworks like Remix, Nuxt, Astro, SvelteKit, SolidStart, Qwik, Redwood, and more. It has clearly established itself as the shared infrastructure layer for the next generation of web frameworks.

vite download graph

Vite weekly NPM downloads since 2020

The trust the community has placed in Vite made me reflect deeply on its future. While Vite has greatly improved the high-level developer experience, internally, it still relies on various dependencies, with abstractions and workarounds to smooth over inconsistencies. Performance-wise, it remains bottlenecked by duplicated parsing and serialization costs across different tools, and it can't fully leverage native tooling like esbuild due to feature constraints and limited customizability.

We started to design a new bundler, Rolldown, tailored to Vite's needs. But as I ventured deeper into the layers beneath the bundler, I came to the realization that the challenges Vite is facing are a reflection of the JavaScript ecosystem at large: fragmentation, incompatibilities, and inefficiency. To fundamentally change this, a unified toolchain is needed.

Imagine a toolchain that is:

  • Unified: using the same AST, resolver, and module interop for all tasks (parsing, transforming, linting, formatting, bundling, minification, testing), eliminating inconsistencies and reducing redundant parsing costs.
  • High Performance: written in a compile-to-native language, designed from the ground up for speed, with maximum parallelization and low-overhead JS plugin support. The performance budget unlocks more ambitious features that improve not only developer experience, but end user experience as well.
  • Composable: each component of the toolchain is independently consumable, offering building blocks for advanced customization.
  • Runtime Agnostic: not tied to any specific JavaScript runtime—delivering the same developer experience across all environments.

Such a toolchain will not only enhance Vite but also drive significant improvements throughout the JavaScript ecosystem. This is an ambitious vision, and achieving it requires a full-time, dedicated team—something that wasn’t possible under the independent sustainability model of my past projects. This is why VoidZero was founded.

I'm excited to announce that we've raised $4.6 million in seed funding to pursue this vision. Our seed round was led by Accel, with participation from Amplify Partners, Preston-Werner Ventures, BGZ, Eric Simons (StackBlitz), Paul Copplestone (Supabase), David Cramer (Sentry), Matt Biilmann & Christian Bach (Netlify), Dafeng Guo (Strikingly), Sebastien Chopin (NuxtLabs), Johannes Schickling (Prisma), and Zeno Rocha (Resend).

The progress so far

Over the past year, we’ve built a team with deep expertise in JavaScript tooling, including creators and core contributors to widely adopted open-source projects like Vite, Vitest, Oxc, and former core contributors to Rspack.

We’ve been hard at work developing the foundational elements of our envisioned toolchain. In addition to ongoing improvements to Vite, we’ve also delivered:

  • The fastest and most spec-compliant JavaScript parser (oxc-parser), 3x faster than SWC - benchmark
  • The fastest Node.js-compatible resolver (oxc-resolver), 28x faster than enhanced-resolve - benchmark
  • The fastest TypeScript / JSX transformer (oxc-transform), 4x faster than SWC - benchmark
  • The fastest linter (oxlint), 50–100x faster than ESLint - benchmark
  • The most feature-complete test runner for web applications (Vitest)
  • The fastest bundler (Rolldown), built on top of Oxc. Faster than esbuild and all other Rust bundlers - benchmark (currently in alpha)

While it’s still early days, our open-source projects are already being used by some of the world’s leading engineering teams, including those at OpenAI (ChatGPT web client), Google, Apple, Microsoft, Visa, Shopify, Cloudflare, Atlassian, Reddit, HuggingFace, Linear, and many more.

What's next

Our primary goal for the coming months is to stabilize Rolldown and make it the unified bundler for Vite in both development and production. We have already made great progress and are aiming for an alpha release of Rolldown-powered Vite later this year.

In 2025, we will continue to complete the other planned features of Oxc (minification, formatting) and gradually migrate the entire Vite ecosystem to be powered by Rolldown and Oxc. We will work closely with ecosystem partners and stakeholders to ensure a smooth transition for end users.

Everything we’ve open-sourced will remain open source. On top of our open-source projects, we will offer an end-to-end JavaScript tooling solution specifically designed to meet the scale and security requirements of enterprise environments.

Get in touch!

Follow us on X to stay updated on our progress. If you have requirements for these tools at scale, get in touch! If you're interested in contributing to or building on top of our projects, join us on our Discord servers (Vite, Vitest, Oxc, Rolldown). Finally, don't forget to tune in to ViteConf this week, where we'll share more details about our progress and future plans.


FAQs

  • What is the relationship between these open source projects and VoidZero?

    Vite and Vitest's team-based governance remains the same as before. Both core teams include members employed by multiple different organizations (VoidZero, StackBlitz, NuxtLabs, Astro). VoidZero Inc. employs or sponsors multiple core contributors to both Vite and Vitest.

    VoidZero Inc. holds the copyrights, funds the development, and controls the direction of both Oxc and Rolldown.

  • What about Vue?

    VoidZero as a business is entirely separate from Vue. Vue will continue as an independent project but will receive first-class support from the new tooling developed by VoidZero.

  • Why Oxc instead of SWC?

    Many of our team members have made non-trivial contributions to SWC in the past. Beyond raw performance benefits, Oxc has some fundamental design differences from SWC that make it a better foundation for the end-to-end toolchain we are building. We will share more technical insights on this topic in future blog posts. Stay tuned!

  • Why Rolldown instead of esbuild / Rollup?

    We need a bundler that is extremely fast, well-suited for application bundling, and fully compatible with Vite's plugin ecosystem. This is discussed in detail in the Rolldown documentation. Building Rolldown on top of Oxc also unlocks the ability to perform more AST-related tasks in parallel during the bundling phase, e.g. emitting and bundling dts with isolatedDeclarations: true.

  • Why will this be different from previous attempts to create a unified JS toolchain?

    The biggest challenge of a unified toolchain is the zero-to-one problem: it needs to gain critical mass for exponential adoption to justify continued development, but it is hard to cross the chasm before it actually fulfills the vision. VoidZero does not have this problem, because Vite is already the fastest growing toolchain in the JavaScript ecosystem.

]]>