
This Week In JavaScript

9 new JavaScript Features from TC39, Angular's official AI prompting guide, Vite 7, and more

Hello JavaScript Enthusiasts!

Welcome to a new edition of "This Week in JavaScript"!

This week, we’re tracking the latest ECMAScript proposals in motion, looking at Angular’s plans to bring LLMs into the developer workflow, unpacking the long-awaited release of Vite 7.0, and exploring V8’s cutting-edge WebAssembly optimizations. Plus, we’ve got some powerful new tools for your development workflow!


TC39 Moves 9 JavaScript Advancements Across 4 Stages

At the 108th TC39 meeting, nine proposals advanced across the four official standardization stages, a sign of how quickly JavaScript is evolving in performance, safety, and usability.

Stage 1 explores early ideas, Stage 2 defines their shape, Stage 3 signals implementation readiness, and Stage 4 finalizes them for the JavaScript standard.

  • Stage 4 now includes three finalized features. First is Explicit Resource Management, which introduces using and await using—declarations that automatically clean up resources like file handles or streams when their block ends. Inspired by patterns from C# and Python, this brings safer and more predictable memory handling to JavaScript. Alongside this, Array.fromAsync offers a clean way to collect async iterable values into an array by returning a promise, making asynchronous data handling more intuitive. Finally, Error.isError lets you reliably check if a value is truly an error object, even across realms or subclassed types—filling a long-standing gap in error-checking behavior. All three are already supported in Chrome 134, Node 22, and Deno 2+.

  • Stage 3 brings Immutable ArrayBuffer, a proposal that introduces two methods—transferToImmutable() and sliceToImmutable()—which allow binary data to be frozen and safely shared between threads or runtimes. Once a buffer is made immutable, it can’t be detached or modified, which helps prevent accidental changes and improves performance in scenarios like streaming, file writing, or HTTP responses.

  • Stage 2 focuses on predictability and precision. Random.Seeded lets developers create deterministic sequences of random values using a seed, solving long-standing issues in reproducible simulations, tests, and procedural content generation. Meanwhile, Number.prototype.clamp adds a simple and expressive way to restrict any number within a defined min and max range, avoiding the verbosity of Math.min(Math.max(...)) patterns.

  • Stage 1 introduces early drafts of three forward-looking ideas. Keep Trailing Zeros adds fine-grained control over how formatted numbers display trailing decimal places—useful in financial apps or when consistency matters. Comparisons proposes a standardized, cross-environment way to print JavaScript values, making test outputs and debugging logs more uniform. Lastly, the proposed Random namespace would unify a range of utilities—like Random.int, Random.sample, Random.shuffle, and more—into a consistent, safer API for generating random numbers and selecting data.
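Of the Stage 2 ideas, Number.prototype.clamp is easy to picture today. The `clamp` helper below is a stand-in mirroring the proposal's intent, not its final API (the proposal is only at Stage 2), but it shows the nested-Math pattern it would replace:

```javascript
// Today: clamping a value needs nested Math calls, and it's easy
// to swap the min/max arguments by accident.
const verbose = Math.min(Math.max(150, 0), 100);
console.log(verbose); // 100

// A stand-in helper mirroring what Number.prototype.clamp proposes.
// (Hypothetical shape; the real method would live on Number.prototype.)
function clamp(value, min, max) {
  return Math.min(Math.max(value, min), max);
}

console.log(clamp(150, 0, 100)); // 100 (clamped to max)
console.log(clamp(-20, 0, 100)); // 0   (clamped to min)
console.log(clamp(42, 0, 100));  // 42  (already in range)
```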

These updates signal a thoughtful evolution of JavaScript, balancing power, safety, and simplicity while giving developers better tools to write cleaner, more predictable code.


Angular’s Official AI Prompting Standards Are Here

As AI-generated code becomes more common, frameworks like Angular are stepping up to guide it in the right direction. Angular’s team has now introduced a dedicated set of LLM instructions, prompt templates, and context files to help large language models generate code that actually follows Angular best practices.

  • It starts with the basics: Angular now provides a system prompt that defines what good Angular code should look like. This includes TypeScript best practices like avoiding any, using strict type checking, and preferring type inference. On the Angular side, the prompt encourages standalone components over NgModules, NgOptimizedImage for image performance, and signals for local state management. Even in templates, native control flow constructs like @if and @for are recommended over their structural directive equivalents. There’s also a push to avoid common anti-patterns like ngClass and ngStyle, instead using class and style bindings directly.

  • Beyond prompts, Angular now supports rules files tailored for specific editors and AI IDEs — including Copilot, Cursor, Firebase Studio, JetBrains IDEs, and even Windsurf. These rules help fine-tune LLM behavior inside your development environment by setting guardrails that align with Angular’s evolving standards.

  • And to tie it all together, the Angular team is experimenting with llms.txt, a proposed web standard designed to expose structured resources that LLMs can use to better understand a site or framework. Think of it like robots.txt, but for AI models instead of crawlers. There’s a base version that points to prompt files and references, and an extended llms-full.txt with deeper documentation on Angular’s inner workings.
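For a sense of what this looks like, here is an illustrative mock-up following the llms.txt proposal's general shape (an H1 title, a blockquote summary, then H2 sections of annotated links). This is not Angular's actual file, and every URL is a placeholder:

```
# Angular

> One-sentence summary of the framework for LLM consumers.

## Prompts

- [System prompt](https://example.com/prompts/system.md): rules for generating idiomatic Angular code
- [Best practices](https://example.com/prompts/best-practices.md): TypeScript and template conventions

## Reference

- [Full docs](https://example.com/llms-full.txt): extended documentation for deeper context
```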

Taken together, these changes don’t just help LLMs write better Angular — they also signal that AI-assisted coding is being treated seriously by framework maintainers.


Vite 7 Is Out With Major Internal Upgrades

Vite 7 marks a major milestone in the evolution of modern frontend tooling. Over the past five years, Vite has grown into a foundational part of the JavaScript ecosystem, powering many frameworks and tools. With over 31 million downloads per week — up by 14 million in just seven months — the growth reflects its central role in enabling faster, modular frontend development.

  • One of the most impactful changes in this release is the introduction of a new Rust-based bundler, Rolldown, now available as a drop-in replacement through a separate package. Built to eventually become the default, this bundler significantly improves build performance, especially for large-scale applications. It’s part of a broader effort to modernize the internal architecture and push toward a faster, more efficient toolchain.

  • Vite 7 also updates its browser target defaults to a new standard called baseline-widely-available. This shift ensures better compatibility with features that have been supported across major browsers for at least 30 months. With this update, minimum versions are now Chrome 107, Firefox 104, Safari 16.0, and Edge 107 — offering more predictability for future browser support.

  • On the Node.js compatibility side, Vite now requires Node 20.19 or 22.12 and drops support for Node 18, which has reached its end of life. These newer versions allow ESM-only distribution while maintaining interoperability with CommonJS modules, simplifying API usage and resolving long-standing compatibility issues.

  • The experimental Environment API continues to evolve. Vite 7 introduces a new buildApp hook to help plugins better coordinate environment setup during build processes. While still in review, the API is already showing potential in runtime-specific tooling and integrations. Developers are encouraged to test it out and provide feedback as it moves toward stabilization.

  • For testing setups, Vite 7 is fully compatible with the latest versions of Vitest. The update also removes deprecated features like the Sass legacy API and splitVendorChunkPlugin, aiming to keep the codebase lean while preserving backward compatibility for most projects.

  • Upgrading to Vite 7 should be relatively straightforward for anyone already on Vite 6. Migration guides are available to help smooth the process, and the complete changelog documents all updates in detail.
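Projects that must support browsers older than the new defaults can still pin explicit targets in their config. The sketch below shows the general shape; the version strings are examples, not recommendations:

```javascript
// vite.config.js
// Vite 7 defaults to the 'baseline-widely-available' target set.
// Override build.target only if you must support older browsers;
// the versions below are illustrative placeholders.
export default {
  build: {
    target: ['chrome107', 'firefox104', 'safari16', 'edge107'],
  },
};
```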


V8 Gets Smarter: Speculative Inlining & Deoptimizations

The V8 team has rolled out two major performance upgrades for WebAssembly: speculative inlining and deoptimization support — now shipping in Chrome M137. Together, they significantly improve execution speed, especially for apps compiled from higher-level languages like Dart, Java, or Kotlin using the WasmGC model.

  • Previously, inlining was limited to direct or well-known function calls. But with call_indirect — where the callee is only determined at runtime — inlining was tricky. Now, V8 uses runtime profiling to identify likely targets at hot call sites and speculatively inlines them. If a function is called frequently via call_indirect, V8 tracks the target using feedback vectors and replaces the call with inlined code for up to four targets. This reduces call overhead and opens the door for further compiler optimizations like constant folding and dead code elimination.

  • Speculative optimizations rely on assumptions — and when those assumptions break, V8 needs a safe fallback. That’s where deopts come in. If the inlined assumption is wrong (for example, a different target function is called), V8 exits the optimized path and jumps back into baseline (unoptimized) code without breaking execution. This new deopt support for WebAssembly mirrors what’s long been used in JavaScript JITs and is now a key foundation for even deeper optimizations in future WasmGC workloads.

  • Combined, these changes show big wins. On Dart microbenchmarks, V8 saw over 50% speedups, and real-world WasmGC apps (like databases and UI engines) saw 1 to 8% performance boosts. More importantly, this brings WebAssembly performance optimization closer to what we've long had with JavaScript — making Wasm a truly competitive target for modern web and app development.
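V8's feedback vectors live inside the engine, but the core idea can be sketched in plain JavaScript: remember the hot target of an indirect call, take a fast path while the guess holds, and fall back (the "deopt") when it misses. Everything below is an illustrative analogy, not V8's actual mechanism:

```javascript
// A call site that dispatches through a table -- a rough JS analogue
// of WebAssembly's call_indirect, where the callee is only known at runtime.
const table = {
  square: (x) => x * x,
  double: (x) => x * 2,
};

function makeCallSite() {
  let seenTarget = null; // one-entry "feedback vector" for this call site

  return function call(name, arg) {
    const target = table[name];
    if (target === seenTarget) {
      // Fast path: the speculation held, so a real engine could have
      // inlined the target's body here and optimized around it.
      return target(arg);
    }
    // "Deopt": first call or speculation miss. Record the new target
    // and take the generic, unoptimized dispatch path instead.
    seenTarget = target;
    return target(arg);
  };
}

const call = makeCallSite();
console.log(call('square', 4)); // 16 (records 'square' as the hot target)
console.log(call('square', 5)); // 25 (fast path: speculation holds)
console.log(call('double', 5)); // 10 (miss: fall back and re-record)
```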


Tools & Releases You Should Know About

Transformers.js Adds Gemma 3n, Qwen3 Embeddings, and Llava-Qwen2

Transformers.js now supports several new models across NLP, vision, audio, and multimodal tasks. The biggest update is Gemma 3n — a local-first model built for multimodal inputs (text, images, audio, video), optimized to run efficiently with only a 2B footprint despite its 6B parameter design. It also introduces MatFormer, allowing model nesting and custom sub-models via mix-and-match. The Qwen3 Embedding models are also live, purpose-built for dense, multilingual embeddings and reranking. Llava-Qwen2 support is in progress, pairing vision with Qwen2’s language backbone.

zx 8.6: Write Shell Scripts in JavaScript

zx simplifies writing shell scripts using JavaScript, offering a cleaner alternative to bash. Version 8.6 brings improved defaults, cross-platform behavior, and less boilerplate for scripting tasks. Perfect for developers who want to script automation without leaving the comfort of Node.js.

Node.js 22.17.0 LTS Brings Clarity to APIs

The latest Node LTS version discourages instantiating core HTTP classes like IncomingMessage without new, aligning with standard JavaScript expectations. Setting options.shell to an empty string now warns, urging explicit configurations. HTTP/2’s prioritization API is deprecated due to poor ecosystem support. Also stabilized: assert.partialDeepStrictEqual(), a handy utility for partial object testing.

SVGO Continues to Optimize the Web

SVGO remains a trusted tool for minimizing SVG file sizes without affecting rendering. Whether via CLI or Node.js API, SVGO strips away unnecessary metadata and default values. It's widely integrated into tools like PostCSS, webpack, and Docusaurus, helping teams deliver lighter and faster web experiences by default.


And that's it for the forty-first issue of "This Week in JavaScript", brought to you by jam.dev—the tool that makes it impossible for your team to send you bad bug reports.

Feel free to share this newsletter with a fellow developer, and make sure you're following for more weekly updates.

Until next time, happy coding!
