The web is becoming more complex every day. User experience and process flow are at the forefront of many front-end developers' minds when designing apps and features for web pages. One of the best ways to improve user experience is to have things work well and work fast. Users hate waiting. Fact.
One super easy way to make sure users aren't waiting is to load only what they want to see. While that's simple enough from a user experience perspective, what about the architecture that makes it happen? What if we could load just one page and have everything update in real time? Luckily, we can do exactly that. An SPA (Single Page Application) achieves this perfectly.
As the name suggests, an SPA is an application that loads only one page. Easy, right? While it seems simple on the surface, a lot goes into an SPA to make it work the way it does. Front-end JS frameworks like React, Angular, and Vue.js take this burden off the developer and let you lean on awesome libraries and functionality to achieve it. But I wanted to understand this better, and in a bid to do so, decided to build the architecture myself for a project. So what actually goes into an SPA? Well, let's take a look.
The first, very obvious requirement is that only one page should be loaded when the website is first visited. In this project, I could have limited myself to the browser's built-in XMLHttpRequest object. As I valued my sanity and my time, I decided to use Express.js to handle requests and routing. So, let's take a look at the lines in the server file that make the magic happen.
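The sketch below shows roughly what those server lines look like. The folder names (`frontend`, `public`) and the port are my assumptions, not necessarily the project's actual layout:

```js
const express = require("express");
const path = require("path");

const app = express();

// Serve static assets (scripts, styles, images) from the front-end's public folder
app.use("/public", express.static(path.resolve(__dirname, "frontend", "public")));

// Return the same index.html for every GET request -- the "single page" of the SPA
app.get("/*", (req, res) => {
  res.sendFile(path.resolve(__dirname, "frontend", "public", "index.html"));
});

app.listen(3000, () => console.log("Server running on port 3000"));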
One thing to note immediately is my use of path. I'm using path here to resolve path names from my project root more easily. The first line to start everything off is app.use. This lets me serve my public folder, with express.static pointed at the folder in my front-end directory. On its own this achieves some of what I'm looking for, as only assets from this folder will be loaded on the page. However, the real magic is in the app.get line. This line forces all page loads to return index.html from the public folder, satisfying the "single page" requirement: no other page will ever be loaded, because the same HTML file is sent for every request made from the root. Clever, right?
The next piece of the puzzle is handling content switching. To start explaining this, let's take a look at the HTML file.
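A minimal version of that file might look like this (the title, nav links, and script path are illustrative; the important parts are the app container and the data-link attributes that come up later):

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>Single Page App</title>
  </head>
  <body>
    <nav>
      <a href="/" data-link>Home</a>
      <a href="/about" data-link>About</a>
    </nav>
    <!-- All view markup is injected into this container -->
    <div id="app"></div>
    <script type="module" src="/public/index.js"></script>
  </body>
</html>
```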
If you're familiar with React, you'll immediately spot something similar in this markup: the div tag with the app id. What made the most sense to me in achieving the SPA behaviour other frameworks showcase was to imitate them directly. They all utilise JS to update the DOM while on the page. On the surface, that seems simple. But the decisions that go into containerising and modularising the markup you want to add, as well as the process of actually adding it, can vary greatly. The best way to explain my approach is to step through the router function I wrote.
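Here is a sketch of that router in the shape the article describes. The view-class names are hypothetical, and I've split the matching step into its own helper for readability; the real function may well inline it:

```javascript
// Hypothetical view classes; a real one would return the full markup for a page
class Home {
  async getHtml() {
    return "<h1>Home</h1>";
  }
}

class About {
  async getHtml() {
    return "<h1>About</h1>";
  }
}

// Each route pairs a path (matched against the URL) with a view class
const routes = [
  { path: "/", view: Home },
  { path: "/about", view: About },
];

// Pure matching step: record whether each route matches, then pick the winner
const matchRoute = (pathname) => {
  const potentialMatches = routes.map((route) => ({
    route,
    isMatch: pathname === route.path,
  }));
  let match = potentialMatches.find((p) => p.isMatch);
  // The ternary fallback: unknown URLs resolve to the home route
  match = match ? match : { route: routes[0], isMatch: true };
  return match;
};

const router = async () => {
  const match = matchRoute(location.pathname);
  // Build the view object and inject its markup -- no page load involved
  const view = new match.route.view();
  document.querySelector("#app").innerHTML = await view.getHtml();
};
```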
Seems weird, right? Roughly 20 lines of code handling how the entire website deals with requests. Each piece of this function handles an integral part of the routing process; without any one of them, the architecture would fail to get off the ground. So let's step through it.
Firstly, the function is async to manage how the view is resolved. Markup is injected into the DOM via an object that is created later in the function, so the rest of the function's operations need to complete before we can actually resolve this markup successfully.
Next, the routes that can be navigated to are defined. Each route contains a path, which the URL will be matched against, and a class that will be used to create a view object later in the function.
Now that we can match our URL against a set of potential paths, we should do exactly that. The potentialMatches constant holds an object for each route recording whether it matched. We then use it to find the matching route and assign it to the match variable.
But what if no match is found? To ensure we don't just get stuck, the function assigns a default match object pointing at the home route. This ternary is a powerful line: it prevents users from malforming URLs to reach places that don't exist. If a path isn't defined in the routes constant, you can't navigate to it and break the app.
The next two lines are how the HTML markup gets into the DOM. The view is the object created from the class that contains our markup. The markup is the return value of its getHtml method, which is added to the DOM via a querySelector targeting the div with the id of app. Then we're done. No page loads; just markup being written to the DOM. This router function is called in a few places.
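As a concrete illustration, a view class only has to expose a getHtml method; everything about this example (the class name and the markup) is hypothetical:

```javascript
// A hypothetical view class -- whatever getHtml returns is injected into #app
class AboutView {
  async getHtml() {
    return `
      <h1>About</h1>
      <p>This content was swapped into the DOM without a page load.</p>
    `;
  }
}
```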
The navTo function abstracts navigating to a URL without a page load. By passing a null state and title to the history.pushState call, we can change the URL without the page loading, then call the router function to render the new DOM content.
The second place is where the SPA behaviour is tied together. Once the DOM content has loaded, an event listener is added for all items that have the data-link custom attribute, allowing us to trigger the router function when the URL changes. The navTo function is called to prevent the page load, and the new DOM content overwrites the old.
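Pieced together, the navTo helper and the DOMContentLoaded wiring might look like the sketch below. The event delegation through a single body listener is one way to catch clicks on data-link elements; router is stubbed here so the sketch stands alone:

```javascript
// `router` is the routing function discussed above; a stub stands in here
const router = async () => {};

const navTo = (url) => {
  // Passing null for state and title changes the URL without a page load
  history.pushState(null, null, url);
  router();
};

// Guarded so this sketch can also be loaded outside a browser
if (typeof window !== "undefined") {
  window.addEventListener("DOMContentLoaded", () => {
    // Delegate clicks from any element carrying the data-link attribute
    document.body.addEventListener("click", (e) => {
      if (e.target.matches("[data-link]")) {
        e.preventDefault(); // stop the normal page load
        navTo(e.target.href); // swap the DOM content in instead
      }
    });
    router(); // render the view for the initial URL
  });
}
```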
In carrying out this learning exercise, I learned a huge amount about websites and about front-end and back-end architecture. I now really appreciate how awesome React and other JS frameworks truly are for handling all of this logic. One problem I ran into was malformed URLs causing the app to break: content in a new view would not load even if the URL matched a defined route. The one-line hack I came up with to get around this was the following.
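The exact expression is my guess at what the hack looks like; the idea is the window check plus a popstate listener that re-runs the router:

```javascript
// `router` is the routing function from the article; a stub stands in here
const router = async () => {};

// The one-line fix: once we know a window exists, re-run the router on popstate
if (typeof window !== "undefined") {
  window.addEventListener("popstate", router);
}
```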
By making sure the window object actually exists (I know, very silly given I'm already in my browser), I can trigger a call to the router function on popstate. This way, I'm always on a valid path, or on the home page if no route is matched.
And that’s it. I’ve learned what makes an SPA tick and how to truly leverage parts of the browser API to gain real control of what happens for my user’s experience.