No-Refresh Nav
When I started putting this site together, I wanted to put an audio player on the page—but I realized after, like, a hot second that if people were clicking around to all the pages, the audio player would reload on every navigation, so they probably wouldn't even get through one song.
There are a couple of solutions to this. A lot of people have an iframe-based layout on Neocities, which means they can load content into a frame and never navigate away from the same page; however, that means you can't easily link people to a specific page without doing some URL query stuff that ends up looking kind of messy. Therefore: a bit of sleight of hand using asynchronous loading in combination with the History API.
First of all, I set up a regular expression that matches URLs against the current site's host to determine whether a link is internal, and I also create a dummy html element to shove the response into for DOM parsing.

Then I create an asynchronous function that prevents navigation if it's an internal link, and uses the Fetch API to get the content of the referenced page. I load that into the dummy element so that I can query it for the main tag and grab its contents to replace the current contents of the main tag on the page as-is. Also, if there are any script elements within the main tag, I snip them out and apply them to the page by appending them to the body—scripts inserted via innerHTML don't execute on their own, so this makes sure they run.
const siteRegex = new RegExp(`^${window.location.protocol}//${window.location.host}`);
let dummy = document.createElement('html');

const getPage = async (u, e) => {
  const url = new URL(u);
  if (`${url.protocol}//${url.host}`.match(siteRegex)) {
    if (e.type === 'click') {
      e.preventDefault();
    }
    const request = await fetch(url.href);
    dummy.innerHTML = await request.text();
    document.querySelector('main').innerHTML = dummy.querySelector('main').innerHTML;
    // Scripts inserted via innerHTML don't run, so rebuild each one
    // and append it to the body.
    dummy.querySelectorAll('main script').forEach((item) => {
      const resource = item.getAttribute('src');
      const newScript = document.createElement('script');
      if (resource) {
        newScript.setAttribute('src', resource);
      }
      // Only copy the type attribute when it exists; otherwise
      // setAttribute would write the literal string "null".
      if (item.getAttribute('type')) {
        newScript.setAttribute('type', item.getAttribute('type'));
      }
      newScript.innerHTML = item.innerHTML;
      document.querySelector('body').append(newScript);
      item.remove();
    });
  }
}
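As a quick sanity check, the internal-link test at the top of getPage can be exercised outside the browser by stubbing the window.location values—the host below is made up for illustration:

```javascript
// Stand-alone sketch of the internal-link check; window.location is
// stubbed (with a made-up host) so this also runs in Node.
const loc = { protocol: 'https:', host: 'example.neocities.org' };
const siteRegex = new RegExp(`^${loc.protocol}//${loc.host}`);

const isInternal = (href) => {
  const url = new URL(href);
  return siteRegex.test(`${url.protocol}//${url.host}`);
};

console.log(isInternal('https://example.neocities.org/music.html')); // true
console.log(isInternal('https://somewhere-else.example/page.html')); // false
```

Only links with a matching protocol and host get intercepted; everything else falls through to normal browser navigation.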
Okay, now that I've got that, it's time to actually, you know, do the routing, because on its own the above does nothing!
I create a function called routingInit that registers a click event listener, checks whether the target element is a link that doesn't have the target "_blank", and then sends it to getPage, the function I made above.

Then we do something a little interesting, which is push a new state to window.history, giving it the title and the URL of the page we pulled in. We also need to listen for the popstate event, which fires when the user navigates through their page history, and get the URL of the new state and load that in. This is why we pass the event to the getPage function—since we're using it in both the click and popstate handlers, we need to know what's triggering it.
const routingInit = async () => {
  document.addEventListener('click', async (e) => {
    const link = e.target.closest('a');
    // Ignore clicks that aren't on a link, and links that open a new tab.
    if (link && link.href && link.target !== '_blank') {
      const url = new URL(link.href);
      // Wait for the fetch, so dummy holds the new page's title
      // before we push it into the history.
      await getPage(url, e);
      window.history.pushState({}, dummy.querySelector('title').text, url.pathname);
    }
  });
  window.addEventListener('popstate', async (e) => {
    const url = new URL(document.location);
    getPage(url, e);
  });
}
The handy part is that if someone navigates to the URL we pushed by entering it in their address bar—well, the page is there, because it's a real file that exists! That also means the site stays crawlable by search engines.
The other thing that's good to do is to add the aria-live="assertive" attribute to your main tag (or whatever element you use to hold your content). This tells screen readers to watch the element for updates and to read them out. (You can also set this to "polite" if you'd rather the new content wait for whatever is currently being read out to finish, instead of superseding it.)