Re-Architecting a Firebase app to work with Node.js and MongoDB


In the last few days, I have seen people worrying about two major problems. One is what is going to happen when Angular 2.0 releases, and the other is how to migrate an existing Firebase codebase to a Node.js/MongoDB platform.

I am not going to answer the first, as I don’t know what is going to happen (ahem, chaos), but I will try to propose an experimental solution to the second issue: how you can re-architect a Firebase app to work with Node.js and MongoDB.

I have met a few people who used Firebase to develop their product and hit the market in a short span of time. Now that the product is doing well, they want to stabilize it further and add more features. And they have realized that a few things may not be that easy to achieve with Firebase.

So they have built another layer with Node.js and MongoDB that more or less replicates the data and performs actions on it. Another solution I have seen is apibase, a Node.js module to communicate with Firebase. I was not really satisfied with either solution.

So, I went back and asked: why Firebase, and why are you fitting Firebase into your application architecture? Here are the two major answers:

  • Real time data sync
  • Offline storage

After giving the above reasons a good thought, I have come up with the below architecture, which will “emulate” Firebase using WebSockets and local storage on a Node.js/MongoDB stack.

If this solution gets stable over a period of time, I may create a yeoman/slush generator to scaffold a Node.js/Express.js and a MongoDB app with real time data sync and offline storage capabilities.

For this post, I will use a simple Todo app as an example. You can find the completed code here.

So let us get architecting!

Firebase App

If you are reading this post, I am assuming that you know what Firebase is and how to work with it. If not, refer to Getting started with Firebase.


Now, you have an app that performs CRUD operations using Firebase, and you want to port the same app to Node.js and MongoDB.

Below is an experimental architecture to achieve this.

On the server side, we have Express.js, which will take care of traditional requests like dispatching pages or exposing HTTP REST API endpoints. And we have a WebSocket server running on the same port that will take care of the real time sync endpoints.

So, the above server can act both as a traditional server, where you use HTTP to log in users or serve static pages, and as a WebSocket endpoint, which you use only for real time sync features like chat or a timeline feed.

We will implement the above server architecture in this post so that you can get an idea of how things work. We will also see how you can integrate MongoDB with WebSockets to fetch and update data in real time.

This sorts out the real time sync scenario. Coming to offline storage: we will have a simple wrapper around the WebSocket client. This wrapper checks the connection; if the connection is alive, the data is pushed to the server, else it is pushed to local storage.

And once the connection is re-established, we push the locally stored data to the server, keeping all the layers in sync.
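As a rough sketch of that wrapper logic (the names `createPublisher`, `publish` and `flush` are illustrative, not from the final code, and a plain storage object stands in for the browser's local storage):

```javascript
// Sketch of the connection-aware publish logic described above.
// The socket and storage objects are supplied by the caller; in the
// real app they are the socket.io client and window.localStorage.
function createPublisher(socket, storage) {
  return {
    // Push data to the server if connected, otherwise queue it locally
    publish: function (event, data) {
      if (socket.connected) {
        socket.emit(event, data);
        return 'server';
      }
      var queue = JSON.parse(storage.getItem('queue') || '[]');
      queue.push({ event: event, data: data });
      storage.setItem('queue', JSON.stringify(queue));
      return 'local';
    },
    // On reconnect, flush everything queued while offline
    flush: function () {
      var queue = JSON.parse(storage.getItem('queue') || '[]');
      queue.forEach(function (item) {
        socket.emit(item.event, item.data);
      });
      storage.setItem('queue', '[]');
    }
  };
}
```

In the browser, `storage` would be `window.localStorage`, and `flush()` would be wired to the socket's reconnect event.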

Developing the Proposed Solution

Now that we have an idea of what needs to be done, let us build it.

First, create a new folder named todoApp and open a new terminal/prompt here. We will init a new Node.js project. Run

npm init

Fill it up as applicable. Next, we will install the dependencies. Execute

npm install express socket.io mongojs ejs cookie-parser body-parser --save

The final package.json will look like the following.
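Roughly along these lines (the version numbers are illustrative; yours will differ, and socket.io appears because the app uses it):

```json
{
  "name": "todoApp",
  "version": "0.0.1",
  "main": "server.js",
  "dependencies": {
    "body-parser": "^1.12.0",
    "cookie-parser": "^1.3.4",
    "ejs": "^2.3.1",
    "express": "^4.12.3",
    "mongojs": "^0.18.1",
    "socket.io": "^1.3.5"
  }
}
```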

Next, we will create the Express server. Create a new file named server.js at the root of the todoApp folder and update as below

Things to notice

Line 1-4 : Require modules

Line 6 : Init a new Express app.

Line 8 : Create a new HTTP server instance from the Express app

Line 9 : Initialize the socket server passing in the HTTP server instance as a param. This step is where we configure all the real time end points for your live feed or chat service. (we will add this soon)

Line 12-21 : Standard Express config

Line 23 : Include all the Express routes

Line 32 : Start the server
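A sketch of server.js consistent with the notes above (the exact line numbers will differ; port 2000 matches the URL used when we run the app, and the ./sockets and ./routes paths are the folders we create below):

```javascript
// server.js (sketch)
var express = require('express');
var path = require('path');
var cookieParser = require('cookie-parser');
var bodyParser = require('body-parser');

var app = express();

var server = require('http').Server(app);
require('./sockets')(server); // configure all the real time endpoints

// standard Express config
app.set('views', path.join(__dirname, 'views'));
app.engine('html', require('ejs').renderFile);
app.set('view engine', 'html');
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: false }));
app.use(cookieParser());
app.use(express.static(path.join(__dirname, 'public')));

require('./routes')(app); // include all the Express routes

server.listen(2000, function () {
  console.log('Server listening at http://localhost:2000');
});
```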

Next, we will define an Express route to dispatch the home page. Create a new folder named routes. Inside this folder, create a new file named index.js. Update it as below

Next, we will be working with the sockets. We are using Socket.IO to manage our server and client side WebSockets.

In a traditional REST API, we have endpoints like (for example) GET /todos, POST /todos, PUT /todos/:id and DELETE /todos/:id.

As you can see, each REST endpoint has a unique URI. We are going to replicate the same idea in WebSockets using a concept called Namespaces.

Socket.IO allows you to “namespace” your sockets, which essentially means assigning different endpoints or paths.

This is a useful feature to minimize the number of resources (TCP connections) and at the same time separate concerns within your application by introducing separation between communication channels.

So, this way we keep only one socket open and at the same time have unique endpoints.

To implement this, create a new folder named sockets at the root of the project. Create another file named index.js inside this folder and update it as below

On line 2, we create a new instance of the socket server. On line 4, we have the callback for the first connection from the client; you can use this to perform validations etc. And on line 8, we register the other WebSocket namespaces, which we are going to add next.
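A sketch of that file (the filename of the todos socket module, required at the bottom, is an assumption):

```javascript
// sockets/index.js (sketch)
module.exports = function (server) {
  var io = require('socket.io')(server);

  io.on('connection', function (socket) {
    // first connection from the client; perform validations etc. here
    console.log('Connected to the base socket');
  });

  require('./todos.socket.js')(io); // register the todos namespace
};
```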

Create a new file for the todos namespace inside the sockets folder and update it as below

Quite a few things are happening here. We have created four “endpoints” that act like GET, POST, PUT and DELETE.

Line 1 : We require the DB API layer. This layer will talk to MongoDB (we will create this next).

Line 4 : We define a new namespace

Line 6 : Gets triggered when a connection to this namespace is made by a client.

Line 8 : When the socket receives a getAllTodos event, it calls dispatchAll() on line 39. This in turn calls the DB and returns the response to the client.

Line 12 : When the socket receives a saveTodo event, it sends the received data to the DB layer. And upon successful insertion, it calls dispatchAll(), triggering a DB call to get the latest set of todos and emit them to the client.

Line 19 : Works as above, but the DB layer updates the record instead of inserting one.

Line 26 : Invokes the delete method in the DB layer.
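A condensed sketch of those handlers. To keep it self-contained, the DB layer is passed in as a parameter (the version described above requires it at the top of the file), and the updateTodo/deleteTodo event names are assumptions; getAllTodos and saveTodo come from the notes above.

```javascript
// Sketch of the /todos namespace handlers. `db` stands in for the DB
// API layer (todos.db.js); it is injected as a parameter here so the
// sketch is self-contained.
function registerTodoSockets(io, db) {
  var todos = io.of('/todos'); // define the namespace

  todos.on('connection', function (socket) {
    // fetch the latest set of todos and emit them to the client
    function dispatchAll() {
      db.getAll(function (err, docs) {
        if (!err) socket.emit('getAllTodos', docs);
      });
    }

    // GET: the client asks for all todos
    socket.on('getAllTodos', dispatchAll);

    // POST: insert a new todo, then push the fresh list out
    socket.on('saveTodo', function (data) {
      db.save(data, function (err) {
        if (!err) dispatchAll();
      });
    });

    // PUT: update an existing record (event name assumed)
    socket.on('updateTodo', function (data) {
      db.update(data, function (err) {
        if (!err) dispatchAll();
      });
    });

    // DELETE: remove a record (event name assumed)
    socket.on('deleteTodo', function (data) {
      db.remove(data, function (err) {
        if (!err) dispatchAll();
      });
    });
  });
}
```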

A simple emulation of the REST API using WebSockets. Now, we will create the DB layer.

Note: You can copy the above file as a template and update the namespace and methods for new endpoints.

Create a new file named todos.db.js inside the sockets folder and update it as below

Standard DB interaction methods. I have used MongoLab for this example rather than a local instance. You can use whichever suits your preference and needs.
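A sketch of what that layer can look like with mongojs (the connection string placeholder and the text/isCompleted field names are assumptions):

```javascript
// sockets/todos.db.js (sketch). Replace the placeholder with your own
// local or hosted (e.g. MongoLab) connection string.
var mongojs = require('mongojs');
var db = mongojs('<your-mongodb-connection-string>/todoDB', ['todos']);

module.exports = {
  // fetch all todos
  getAll: function (cb) {
    db.todos.find({}, cb);
  },
  // insert a new todo
  save: function (doc, cb) {
    db.todos.save(doc, cb);
  },
  // update an existing todo by id
  update: function (doc, cb) {
    db.todos.update(
      { _id: mongojs.ObjectId(doc._id) },
      { $set: { text: doc.text, isCompleted: doc.isCompleted } },
      cb
    );
  },
  // delete a todo by id
  remove: function (doc, cb) {
    db.todos.remove({ _id: mongojs.ObjectId(doc._id) }, cb);
  }
};
```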

This wraps up our server side implementation. We will move to the client next. Create a new folder named public at the root of the project. This folder will host all our client-side assets.

For this project we will use jQuery for DOM manipulation and a DataManager wrapper, which takes care of client side data management.

Inside the public folder, create a new file named DataManager.js. Update it as below

DataManager is the wrapper around the sockets that takes care of managing the data on your behalf. This file is somewhat like the firebase.js you would include in your HTML, but a lot more raw and experimental.

When dealing with sockets, there are two types of events: publishers, which send data to the server, and subscribers, which receive data when some other publisher updates it. We will classify all our endpoints as either publishers or subscribers and call them accordingly.

The DataManager class takes in the below options

  • connection – The base socket endpoint
  • connectCB – The callback to be fired when the connection/reconnection is made
  • collections – A list of collections/endpoints/namespaces and their subscriber methods with callbacks.

The last option takes care of registering the event on that namespace and calling the callback when the event is triggered on that socket. We will take a look at this in a moment.
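For illustration, an options object can look like this (the property values and event names are examples; the authoritative names live in DataManager.js and index.html):

```javascript
// Illustrative options object for the DataManager wrapper.
var options = {
  connection: 'http://localhost:2000', // the base socket endpoint
  connectCB: function () {
    // fired on connection/reconnection; a good place to refresh the UI
    console.log('connected');
  },
  collections: {
    '/todos': { // the namespace
      getAllTodos: function (todos) {
        // subscriber callback: re-render the list with the latest todos
        console.log('received', todos.length, 'todos');
      }
    }
  }
};
```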

When you want to save data to your server, you call pubData() on line 41. This takes care of checking the connection and managing the data accordingly.

Finally our view. Create a new folder named views at the root of the project. Create a new file named index.html inside it and update it as below

Things to notice

Line 15 : Placeholder to populate all todos

Line 18 : Reference the jQuery source

Line 19 : Reference the Socket.IO client script (served dynamically by the server)

Line 20 : Reference the DataManager wrapper

Line 23 : Build the options object

Line 34 : Create a new instance of the DataManager class passing in the options

Line 40 : Will be triggered when the server emits getAllTodos event.

Line 55, 71 and 81 : When we want to save/update/delete a todo, we call pubData() and pass in the namespace, endpoint name, data and a callback. The callback is invoked to notify you whether the data was saved locally or on the server, so you can update the UI accordingly.

Simple and easy! Well, not quite. But this should get both the real time sync and the offline storage going.

Save all the files and run

node server.js

Navigate to http://localhost:2000 to see the app. Add a few todos to check the setup. Next, kill the server and try updating the data. In the above example, the data is simply pushed to local storage and the UI is not updated. You could check whether the data was saved locally and update the UI manually, perhaps with a red background indicating that the changes were saved only locally.

When the connection is re-established, or when you restart the server, you can see that all the data in local storage is pushed to the server and the UI is finally updated.

Hope this post gave you a possible solution for achieving real time sync and implementing offline storage alongside it.

If you have tried this solution or want to share your solution feel free to comment.

Thanks for reading! Do comment.

  • Franklin Rincon

    Thanks for sharing, Arvind, you’re a crack!

  • William Vega

    Can this be done using loopback.js and strongloop arc?

  • Getulio Romão C. Jr

    You could just go with LoopBack, since your backend mostly needs an API, and you would have offline sync and real time.

  • rory cawley

    love this

  • wbarahona

    This is simple and sweet, I followed the steps et voila it works, now I have a foundation on how to build a real time app, and detach from firebase sir. Many kudos for you from Honduras :)

    • Arvind Ravulavaru

      Thanks wbarahona. But do keep in mind that this setup is experimental. I have not used this setup in a live project, so use with caution.

      Also, feel free to report any issues and bottlenecks with this setup. We can probably enhance it.


      • wbarahona

        Yes, I will keep that in mind, I just wanted a functional example to expand… I will report back any issues, thanks.

  • fares

    how would you deal with authentication
    thank you for this tutorial. it helped me a lot!

    • Arvind Ravulavaru

      Thanks fares. You mean the initial login to the app when the network is down? or How do we maintain a session on the client side?

      • fares

        i mean maintaining the session with jwt token / cookie.

        • Arvind Ravulavaru

          You can check out this post: it shows how to manage JWT on the client side as well as creation on the server side.

          Let me know if you are looking for some other solution. Thanks.

          • fares

            Thanks, that is what I was looking for.
            And it works.

          • Arvind Ravulavaru

            Great! Thanks.

  • Michel

    You should take a look at rethinkdb. It would make things simpler/nicer.

    • Arvind Ravulavaru

      Thanks Michel. I did look at RethinkDB a few days ago. But does it have a client-side counterpart?

  • kevin jose

    can you implement this datamanger class in angularjs ?

    • Arvind Ravulavaru

      I think you can. You can create a service that will emulate the Data manager class and you can have another service that interacts with sockets. And the data manager service can check before making a call to the REST endpoint if the network is available and deal with it accordingly.

      • kevin jose

        Thanks arvind .
        How will the service know as to which data is the recent one. Do i need to out a timestamp field in my data model and check for the latest one each time or do i have any other method . i haven’t used as of now

        • Arvind Ravulavaru

          So, in that case, I would recommend implementing the above JavaScript version first and then trying the Angular version after that. If you still have issues, do let me know.

          • kevin jose

            Thanks a lot :)

          • Arvind Ravulavaru

            Also checkout to work with websockets from Angular js

          • kevin jose

            Thanks Arvind .. also I found something called .. it is built using Express.js, here is the link to the webpage I found
            I will be following this method .. please suggest the cons if any .. that will be great .. I am quite new to the whole JavaScript stack.

          • Arvind Ravulavaru

            Interesting solution. The first impression is good. Let me try and use this in some project. Then may be I can comment. Thanks for sharing!