This release includes WASM code-splitting/lazy-loading support, in tandem with the latest cargo-leptos release. You can use the `lazy_routes` example to see what this means in practice. Essentially, there are two patterns:
- Use the `#[lazy]` macro to make any given function lazy
- Use the `#[lazy_route]` macro to designate a route with a lazy-loaded view, which is loaded concurrently with the route's data
`#[lazy]` converts a (sync or async) function into a lazy-loaded async function:

```rust
#[lazy]
fn deserialize_comments(data: &str) -> Vec<Comment> {
    serde_json::from_str(data).unwrap()
}
```
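Because the function becomes async, call sites now need to `.await` it. As a rough plain-Rust analogy for the lazy-loading shape (illustrative only, not the actual macro expansion, and synchronous where the real thing is async), the cost of loading the split module is paid once, on first call:

```rust
use std::sync::OnceLock;

// Stand-in for the one-time load of a split module (illustrative only).
static MODULE: OnceLock<()> = OnceLock::new();

fn ensure_module_loaded() {
    MODULE.get_or_init(|| {
        // In the real feature this is fetching/instantiating a split .wasm chunk.
        println!("loading split module (happens once)");
    });
}

// Hypothetical function: first call triggers the load; later calls are free.
fn deserialize_comments(data: &str) -> Vec<String> {
    ensure_module_loaded();
    data.split(',').map(|s| s.trim().to_string()).collect()
}

fn main() {
    let first = deserialize_comments("great post, thanks");
    let second = deserialize_comments("nice"); // no load message this time
    println!("{first:?} {second:?}");
}
```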
`#[lazy_route]` lets you split routes into a "data" half and a "view" half, which the router loads concurrently. This works with nested routing: if you have nested routes ViewD and ViewE, the router will concurrently load D's data, D's (lazy) view, E's data, and E's (lazy) view before navigating to the page:
```rust
struct ViewD {
    data: Resource<Result<Vec<i32>, ServerFnError>>,
}

#[lazy_route]
impl LazyRoute for ViewD {
    fn data() -> Self {
        Self {
            data: Resource::new(|| (), |_| d_data()),
        }
    }

    fn view(this: Self) -> AnyView {
        let items = move || {
            Suspend::new(async move {
                this.data
                    .await
                    .unwrap_or_default()
                    .into_iter()
                    .map(|item| view! { <li>{item}</li> })
                    .collect::<Vec<_>>()
            })
        };

        view! {
            <p id="page">"View D"</p>
            <hr/>
            <Suspense fallback=|| view! { <p id="loading">"Loading..."</p> }>
                <ul>{items}</ul>
            </Suspense>
            <Outlet/>
        }
        .into_any()
    }
}

#[server]
async fn d_data() -> Result<Vec<i32>, ServerFnError> {
    tokio::time::sleep(std::time::Duration::from_millis(250)).await;
    Ok(vec![1, 1, 2, 3, 5, 8, 13])
}
```
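The concurrent loading the router performs can be sketched in plain Rust with threads (the names and timings below are illustrative stand-ins, not Leptos APIs): all four loads start at once, so the total wait is roughly the slowest single load rather than the sum of all four.

```rust
use std::thread;
use std::time::{Duration, Instant};

// Hypothetical stand-in for fetching one piece of a route (data or view chunk).
fn load(label: &str, ms: u64) -> String {
    thread::sleep(Duration::from_millis(ms));
    format!("{label} ready")
}

fn main() {
    let start = Instant::now();
    // Spawn all four loads at once, mirroring the router's concurrent fetches
    // for a nested route: D's data, D's view, E's data, E's view.
    let handles = [
        thread::spawn(|| load("D data", 100)),
        thread::spawn(|| load("D view", 100)),
        thread::spawn(|| load("E data", 100)),
        thread::spawn(|| load("E view", 100)),
    ];
    for h in handles {
        println!("{}", h.join().unwrap());
    }
    // Concurrent: total wait is ~100ms (the slowest load), not ~400ms (the sum).
    assert!(start.elapsed() < Duration::from_millis(380));
}
```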
Our whole July stream was dedicated to the topic, if you want a more in-depth discussion.
## What's Changed
- Preparing to publish oco_ref 0.2.1 by @martinfrances107 in #4168
- feat: wasm-splitting library support for future cargo-leptos integration by @gbj in #3988
Full Changelog: v0.8.4...v0.8.5