[Expression of support]: Support from WebLLM, an in-Browser LLM inference engine #31

@CharlieFRuan

Description

What are the primary cross-origin storage use cases in your apps?

  • AI models
  • SQLite databases
  • Offline storage archives
  • Wasm modules
  • Other

Would you use this API in production as a progressive enhancement?

Yes

Why is cross-origin storage important for your use case?

WebLLM users must download large LLM weights and Wasm modules (which include artifacts such as compiled WebGPU kernels) for each AI model they intend to use. Without cross-origin storage (COS), the same-origin policy means that every website or extension caches its own copy of a model, even when it is identical to one that another origin has already downloaded.

With careful consideration of privacy and security, the COS initiative could eliminate this duplication, saving both storage space (no redundant caches) and time (no repeated downloads), and improving the experience of end users.

What company do you work for?

MLC community

Concerns or reservations

No response

Additional features or improvements

No response

Any other feedback

No response
