Build & Publish a Node Package Module (NPM)

In this post, we are going to build and publish a Node Package Module (NPM). The module we are going to build is a simple utility that takes a web-supported color name as an argument and spits out the HEX & RGB values for it. We are going to use a Test Driven Development (TDD) style of coding to build this module.

I have taken the color names from htmlcolorcodes.com – Thanks to them!

Before we get started, here is a quick preview of the CLI version in action

The story behind rebeccapurple. RIP Rebecca Meyer.

Getting Started

Below are the steps we are going to follow

  1. Setup Yeoman
  2. Install Generator
  3. TDD Colorcode
  4. Validate Code Coverage
  5. Push to Github
  6. Setup Travis CI
  7. Setup Coveralls
  8. Publish to NPM

Setup Yeoman

If you are new to Yeoman, here is a quick introduction.

What is Yeoman?

Quoting from yeoman.io

Yeoman helps you to kickstart new projects, prescribing best practices and tools to help you stay productive.

To do so, we provide a generator ecosystem. A generator is basically a plugin that can be run with the yo command to scaffold complete projects or useful parts.

Through our official Generators, we promote the “Yeoman workflow”. This workflow is a robust and opinionated client-side stack, comprising tools and frameworks that can help developers quickly build beautiful web applications. We take care of providing everything needed to get started without any of the normal headaches associated with a manual setup.

With a modular architecture that can scale out of the box, we leverage the success and lessons learned from several open-source communities to ensure the stack developers use is as intelligent as possible.

As firm believers in good documentation and well thought out build processes, Yeoman includes support for linting, testing, minification and much more, so developers can focus on solutions rather than worrying about the little things.

So ya, Yeoman is like your File > New > NPM Project: fill in the project information and boom! An NPM project gets generated – but from the command line.

To install Yeoman globally, run
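    npm install -g yo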

And to validate, run
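    yo --version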

Install Node.js

Before we go further, make sure you have Node.js installed on your machine. You can follow this post, Hello Node, to do so. I am using the following versions

generator-np

In this post, we are going to use a Yeoman generator named generator-np to scaffold our base for building the NPM. To install the generator globally, run
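    npm install -g generator-np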

Once the installation is done, we are good to scaffold our app

Scaffold color2code

Anywhere on your machine, create a folder named color2code  and open a new terminal/prompt there. Now, run
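    yo np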

And you can answer the questions as shown below

This will take a moment to complete scaffolding.

Once the project is scaffolded, we should see something like

In this project, we are going to follow a Test Driven Development approach. We will first update the  test/index.js  file with all possible scenarios and then get started with the code.

Next, we are going to work in src  folder to update the cli.js  and the  lib/index.js  files.

Before we get started, let’s quickly look at the solution.

Color2Code Solution

The app we are going to build is a simple utility that takes a popular web-supported color name and spits out the RGB and HEX values for it.

We are going to maintain a collection of all the web-supported colors and their HEX values. In our code, we are going to read the color name and fetch the HEX value. We then use the HEX value to generate the RGB version of the same.

Sounds simple right?

So, let’s see how to work with this solution & build a NPM.

Update project configuration

I have made a few changes to the project so that things are a bit easier to manage.

Open  .eslintrc  and make the changes to the properties shown below

Next, we are going to update  package.json  as highlighted below

Finally, update  .travis.yml as shown below
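A minimal .travis.yml along these lines would cover the Node.js versions (4 through 10) that the build matrix later reports on, plus a hand-off to Coveralls; treat the exact script names as assumptions rather than the generator's actual output:

    language: node_js
    node_js:
      - "4"
      - "6"
      - "8"
      - "10"
    after_success:
      # assumed hand-off: pipe the lcov report produced by the test run to Coveralls
      - cat ./coverage/lcov.info | coveralls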

That is all!

Test Driven Development – TDD

To work with the code base we have just scaffolded, we are going to use TDD or Test Driven Development. We are going to write the test cases first and then develop our code.

TDD is not for the weak hearted. Seeing your application fail even before writing a single line of code takes courage.

You can read more about TDD here: Test Driven Development by Example

Write Test Cases

To get started, we are going to write the test cases first and see our app fail and then start making the required changes to our code to fix it.

Open  test/index.js and update it as shown below
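The scaffolded test file is not reproduced here; below is a minimal sketch of what it could look like, assuming Mocha as the test runner and a library entry point that accepts an array of color names (both assumptions on my part):

    const assert = require('assert');
    const color2code = require('../src/lib'); // assumed entry point into the library

    describe('color2code', () => {
      it('responds with a prompt when no color name is provided', () => {
        assert.ok(/provide a color name/i.test(color2code([])));
      });

      it('returns the HEX & RGB values for a single color name', () => {
        assert.ok(color2code(['red']).includes('#FF0000'));
      });

      it('handles more than one color name', () => {
        const output = color2code(['red', 'blue']);
        assert.ok(output.includes('#FF0000') && output.includes('#0000FF'));
      });

      it('flags an invalid color name', () => {
        assert.ok(/invalid/i.test(color2code(['notacolor'])));
      });

      it('handles an invalid color name mixed with valid ones', () => {
        const output = color2code(['red', 'notacolor']);
        assert.ok(output.includes('#FF0000') && /invalid/i.test(output));
      });
    });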

The above file outlines 5 scenarios

  1. If there is no color name provided to our NPM
  2. If one color name is provided to our NPM
  3. If more than one color name is provided to our NPM
  4. If an invalid color name is provided to our NPM
  5. If an invalid color name is provided along with valid color names to our NPM

The code we are writing should satisfy the above 5 scenarios.

Now, let’s run the NPM and see it fail.

Run the NPM

We are going to use nodemon to watch our app for changes and rebuild our application.

To install nodemon, run
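    npm install -g nodemon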

Once nodemon is installed, from inside color2code  folder, run

If the command ran successfully, we should see an error blob stating that tests have failed.

Yay! we did it.

Now, let’s get cracking on fixing these cases.

PS: Keep the nodemon running in the terminal/prompt. As we keep making changes and saving files, the tests will be executed again.

Develop the NPM

First, we are going to get a dump of web colors. Thanks to htmlcolorcodes.com, we have a list of 148 color names.

Create a file named  colors.json inside  src/lib folder and update it as shown below
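The real file maps all 148 color names to their HEX codes; here are a few sample entries to show the assumed shape of the data (a hash of lowercase names to HEX strings):

    {
      "red": "#FF0000",
      "blue": "#0000FF",
      "rebeccapurple": "#663399",
      "tomato": "#FF6347"
    }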

Save the file and you should see that nodemon  has detected changes to the src  folder and is running the prepare  task again. The tests are still failing.

Next, inside  src/lib folder, create a file named  utils.js and update it as shown below

On Line 1, we have required the colors hash.  hexToRGB() takes a HEX code and returns the RGB hash for the same.  processColor() takes in the color name and returns the final result.
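A rough sketch of what that utils.js could look like, assuming the colors hash maps lowercase names to HEX strings as in the sample above:

    const colors = require('./colors.json');

    // Convert a "#RRGGBB" string into an { r, g, b } hash
    function hexToRGB(hex) {
      const value = hex.replace('#', '');
      return {
        r: parseInt(value.substring(0, 2), 16),
        g: parseInt(value.substring(2, 4), 16),
        b: parseInt(value.substring(4, 6), 16)
      };
    }

    // Look up a color name and return its HEX & RGB values as a string
    function processColor(name) {
      const hex = colors[name.toLowerCase()];
      if (!hex) {
        return `"${name}" is an invalid color name`;
      }
      const rgb = hexToRGB(hex);
      return `${name} : HEX ${hex} | RGB (${rgb.r}, ${rgb.g}, ${rgb.b})`;
    }

    module.exports = { hexToRGB, processColor };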

Save the file and still the tests do not pass.

Awesome! let’s continue

Update  src/lib/index.js as shown below
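The real file depends on what generator-np scaffolds; here is a rough sketch that satisfies the five scenarios above, assuming the module receives an array of color names:

    const utils = require('./utils');

    // Accepts an array of color names and returns one result string
    module.exports = function color2code(colorNames) {
      if (!colorNames || colorNames.length === 0) {
        return 'Please provide a color name';
      }
      return colorNames.map(utils.processColor).join('\n');
    };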

Here we are processing the options and responding accordingly.

Save the file and Boom!

All the tests pass! And our NPM works!

Now, we will update  src/cli.js as follows

This concludes our development.

Simulate NPM install & Test

If you would like to simulate an NPM global install and test your app before publishing, you can do so by running the below command from inside the  color2code folder
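One common way to simulate a global install is npm link (an alternative, not necessarily what was used here, is npm pack followed by installing the generated tarball globally):

    npm link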

And we should see something like

And from anywhere on your machine you can run
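Assuming the scaffolded package exposes a color2code binary, usage would look something like:

    color2code red blue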

Awesome right!

Code Coverage

We have written code and we have written test cases. Now, let’s take a look at the code coverage.

From inside  color2code folder, run

And we should see something like

You can also open  coverage/index.html to view the report in the browser. It should look something like this

So we are good to move our code to Github.

Push color2code to Github

You can push your fresh NPM to Github to maintain it and let other people look at/contribute to your code. For that, we will follow the below steps

  1. Create a new Github repo: https://github.com/new
  2. Run the following commands
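A typical sequence (assuming the scaffolded project is not yet a git repo) would be:

    git init
    git add .
    git commit -m "Initial commit"
    git remote add origin https://github.com/arvindr21/color2code.git
    git push -u origin master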

PS: Update your repo URL as applicable

Voila! The code is deployed to Github. You can check out my code here: arvindr21/color2code

Now that this is done, let’s setup Travis CI and Coveralls.

Setup Travis CI

Now that the repo is set up on Github, we will have a CI system in place. A Continuous Integration system keeps tabs on your repo, runs the test cases on new changes and lets you know if anything is broken.

Travis CI is a popular open source solution. You can read more about Travis CI here. We are going to use the same for managing our NPM.

If you open your Github repo, at the top of your readme, you should find a section as below

Click on “build | unknown” and you should be taken to Travis CI. In Travis CI, you should see a page like below

Activate repository.

From next time onwards, whenever there is a push to the repo, the CI will kick in and check if the build passes.

Before we trigger a build, we will setup Coveralls.

Setup Coveralls

Quoting from coveralls.io/continuous-integration

Coveralls takes the build data from whichever CI service your project uses, parses it, and provides constant updates and statistics on your projects’ code coverage to show you how coverage has changed with the new build, and what isn’t covered by tests. Coveralls even breaks down the test coverage on a file by file basis. You can see the relevant coverage, covered and missed lines, and the hits per line for each file, as well as quickly browse through individuals files that have changed in a new commit, and see exactly what changed in the build coverage.

Back to our repo and at the top of readme, click on “coverage | unknown” and you should be redirected to coveralls.io. Sign in and Click on the “+” sign in the menu on left hand side and search your repo here. Once you find it, enable coveralls for this repo.

And that is all, we are done.

Trigger Build

Now, let’s go back to Travis and trigger build from “More options” menu on the right hand side. And you should be shown a popup, fill it as applicable – trigger custom build

And once the build is completed, you should see something like this

Sweet right! Our NPM works on Node.js versions 4 to 10. Yay!!

This trigger will happen automagically whenever there is a new commit/pull request.

The final step, if the build passes, is to update Coveralls. If we navigate to Coveralls for our repo, we should see something like

Travis arvindr21/color2code

We have a code coverage of 90%, which is good, considering we have only written a few lines of code.

We can also find out which files have what amount of coverage, as shown below

You can drill down further to analyse the issues.

With this we are done with our CI and Code coverage setup. If we go back to our Repo’s readme file, we should now see

Now, we are going to publish the color2code to npmjs.com.

Publish color2code to NPM

To push color2code to NPM, we need to have an account with npmjs.com. Once you have signed up and activated your account, you can log in to it from your command prompt/terminal.

Run
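    npm login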

Once logged in, make sure you are at the root of  color2code folder and run
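    npm publish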

And boom your color2code  NPM is published.

If you get an error, it is most likely because you have used the same name as mine – color2code. You need to use another name for your module in the package.json file and then try publishing again.

Your NPM is published. Navigate to npmjs.com/package/color2code to see the module we have just built.

With this our repo badges would look like

Hope you have learned how to build, publish and manage your own NPM!


Thanks for reading! Do comment.
@arvindr21

Deploying a Node.js Docker container to Heroku

In the last post, titled Developing Node.js applications in Docker, we saw how we can work with Docker as our development environment.

In this post, we are going to take the app that we have built named docker-node-app and deploy it to Heroku as a Docker container.

The sample code used in this post can be found here: docker-node-app.

So, let’s get started.

Setup Heroku

The first thing we need to do is download and install the Heroku Toolbelt on our local machine.

Download Toolbelt

You can follow the instructions given here: Install Heroku Toolbelt to install Heroku tool belt for your Operating System.

Setup Account

If you do not have a Heroku account, you can signup here.

Login

Now that we have Heroku installed, we will log in. Open a new command prompt/terminal and run
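    heroku login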

Next, we need to login to the container registry. Run
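    heroku container:login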

Once we have successfully logged in to Heroku & Heroku container registry, let’s work with Docker.

Setup Docker

The first thing we are going to do is download Docker for our Operating System.

Download

To Install Docker on our machine, navigate to here & download the software for your OS.

Test Download

Once Installed, you can open up a command prompt/terminal and run  docker ps and you should see something like

At this point, we have both Docker as well as Heroku installed on our machine.

Next, we are going to containerize our app and deploy it to Heroku.

Setup docker-node-app

If you have not already done so, please go through Developing Node.js applications in Docker, where we see how to develop Node.js applications using Docker. We are going to pick up where that post left off.

Clone docker-node-app

Now, we are going to clone docker-node-app from https://github.com/arvindr21/docker-node-app. From anywhere on your machine, run

And then  cd docker-node-app .

Fix Port

We need to make a change to the Dockerfile  to fix the port. When deploying to Heroku, the port is assigned by Heroku.

Open Dockerfile  and update it as shown below
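The exact edit is not reproduced here. Heroku ignores EXPOSE and instead injects the port to bind to through the PORT environment variable, so the essence of the fix (stated here as an assumption, not the literal diff) is to make sure the containerized app listens on that variable:

    // Bind to the port Heroku assigns, falling back to 3000 for local runs
    const port = process.env.PORT || 3000;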

That is it, we are all set to deploy our awesome container app to Heroku.

Deploy to Heroku

Now, we are going to deploy to Heroku. We are going to follow the below steps

  1. Heroku create new app (one time)
  2. Heroku container push – Create and push the container to Heroku Container Registry
  3. Heroku container release – Release/run the container app
  4. Heroku open – Launch the app

From inside  docker-node-app run
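    heroku create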

This will create a new Heroku app.

Next, we will push the container
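Here web is the process type under which the container will run:

    heroku container:push web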

Now, we will run the container
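    heroku container:release web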

And finally, to open the deployed app, run
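    heroku open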

And we should see

Voila! The Node.js app that we containerized during development is now deployed to Heroku with ease.

Development & Deployment simplified!

PS: I am shutting down the above Heroku app, as it does not have anything specific to showcase.

Hope you got an idea as to how we can deploy a containerized Node.js application to Heroku.


Thanks for reading! Do comment.
@arvindr21

Developing Node.js applications in Docker

In this post, we are going to look at developing a Node.js application using Docker as the development environment.

Why Dockerize my development?

You may ask, what is wrong with my current setup? Why should I “Dockerize my Node app”?

Well for starters, your development environment need not be the machine on which you are coding.

Let’s say that you are new to a project and you are getting started with your environment setup and you are installing an awesome plugin like grunt-contrib-imagemin and you find that its dependency libpng is missing on your machine. You Google and try to solve the problem yourself because a. you are the new member in the team and you want to prove yourself and b. you don’t know anyone in the team yet.

Everything is good so far: you were able to fix the libpng issue, and now you run npm install again and notice that grunt-yellowlabtools needs phantomjs, and for some reason the download fails during installation. Again, for the above-said reasons, you are too shy to approach someone, so you spend the first day on the project installing the dependencies and finally setting up your environment and running your project successfully.

Very productive day one.

Now, imagine, on your first day, all you would need to do is

  1. Install Docker
  2. Run docker run -it -p 3000:3000 myproject/my-app:0.1.0

And voila, the app is running on your machine. And you are all set to get started. How about that?

4 reasons why?

Here are a few reasons why you need a “Dockerized” development environment:

  1. Your production environment will almost never be the same as your development environment. With Docker, you can work against your production environment setup directly on your local machine to avoid "surprising" deployment issues.
  2. It is easy for your devops team to work with and scale containers in environments higher than your local
  3. One developer machine can run multiple (Node.js) applications in an isolated way
  4. Docker Compose lets us run dependent pieces of software in a microservice architecture easily

Now that you feel it is a good idea to Dockerize your development environment, let's look at how.

How to Dockerize a Node.js app?

We are going to follow below steps

  1. Setup Docker
  2. Build a simple Node.js application
  3. Build a Docker image
  4. Create a container from the Image and start developing.

Before we actually get started, let’s look at what is Docker.

What is Docker?

Docker is a computer program that performs operating-system-level virtualization also known as containerization.

To understand the above jibber-jabber, take a look at this video

Quite a simple and powerful concept.

If you would like to dig deeper into the world of containers, check out this playlist from VMware

Now that we understand the why, what and how; let’s get our hands dirty by building a Dockerized Node.js app.

Getting Started

First we are going to install Docker

Install Docker

For this tutorial, we are going to use Docker Community Edition (CE). To Install Docker on our machine, navigate to here & download the software for your OS.

Once Installed, you can open up a command prompt/terminal and run  docker ps and you should see something like

The above output shows that there are no containers running.

Now, let's quickly test drive Docker.

Run  docker run busybox echo "hello from busybox"  and you should see that the busybox image gets downloaded locally and the echo command that we provided is run. The output would be as follows.

Now that we got the Docker installation done & verified, let’s move on.

Build a Node.js App

Now, we will set up a simple Node.js app. If you are planning to run this application on your local machine, make sure Node.js is set up on your machine. You can follow this post: Hello Node. I am using the following versions

If you are going to run the app directly in a docker container, you can skip local installation of Node.js.

Anywhere on your machine, create a folder named  docker-node-app and open a new terminal/prompt there.

First, we are going to initialise a new Node.js project. Run
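    npm init --yes

The --yes flag accepts all the defaults, which is why the generated file ends up with default contents.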

This will create a package.json  with default contents. Update it as shown below, as applicable

Do note that I have added a script named  start, which launches the app that we are going to create in a moment.
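The relevant addition would look something like this (using nodemon as the runner is an assumption on my part, but it lines up with the live-reload behaviour later in the post):

    "scripts": {
      "start": "nodemon index.js"
    }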

Now, we are going to build a simple app, which prints the environment variables of the machine it is running on.

Inside docker-node-app folder, create a file named index.js  and update it as shown below
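A sketch of such a server, assuming a plain http server on port 3000 (matching the port used throughout the post):

    const http = require('http');

    const server = http.createServer((req, res) => {
      let output = '';
      // Build a string out of every environment variable key/value pair
      Object.keys(process.env).forEach((key) => {
        output += key + ' : ' + process.env[key] + '\n';
      });
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end(output);
    });

    server.listen(3000, () => {
      console.log('Server running at http://localhost:3000/');
    });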

All we are doing is iterating the  process.env object & building a string with the key and value.

Test drive the Node.js application

Let’s test the application we have built. You can do this only if you have Node.js installed on your machine.

You can simply run
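    npm start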

or
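    nodemon index.js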

If you do not have nodemon on your machine, you can install it as follows
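    npm install -g nodemon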

Once the server is up and running, navigate to http://localhost:3000/ and we should see

Isni’t very brave of me to publish my machine’s environment variables out in public :/ 

Now that we have validated that the app is working fine, let's "containerize" it.

Build a Docker image

Now that we have our sample app running, let’s create an image. Inside docker-node-app  folder, create a file named Dockerfile and update it as shown below
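Here is a reconstruction of that Dockerfile, laid out so that the line numbers referenced in the walkthrough below still line up; the comments and exact tags are my assumptions, while the instructions follow the description:

    # Dockerfile for docker-node-app (reconstructed sketch)
    FROM node:8

    # ------------------------------------------------------------------
    # The walkthrough below refers to the line numbers of this file.
    # ------------------------------------------------------------------
    # Create the directory where our source code will live in the container
    RUN mkdir -p /usr/src/app

    # Switch to the working directory
    WORKDIR /usr/src/app

    # Copy the current source code
    # into the image
    COPY . /usr/src/app

    # Install nodemon globally
    RUN npm install -g nodemon

    # Install the other dependencies defined in package.json
    RUN npm install

    # Expose port 3000 so the host machine can access the app
    EXPOSE 3000

    # Start the Node.js app on boot of the container
    CMD ["npm", "start"]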

On line 2: we set the base image in which our app is going to run, which is a Node image with version 8 installed.

On line 8: we create a directory where our source code would reside in the container

On line 11: we switch to the working directory

On line 15: we copy our current source code to the image

On line 18: we install nodemon globally.

On line 21: we install any other dependencies defined in  package.json

On line 24: we expose port 3000 so the host machine can access the app

On line 27: we start the Node.js app on boot of the container

Simple right?

Now to make sure we copy only what we need, we will create a file named  .dockerignore at the root of  docker-node-app and update it as shown below
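A typical .dockerignore for a Node.js project looks like this (the exact entries in the original may differ):

    node_modules
    npm-debug.log
    .git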

Now that we are done with setup, we will build the docker image. From inside   docker-node-app  folder, run

docker-node-app docker build -t arvindr21/docker-node-app:0.1.0 .

Do note that  arvindr21 in the above command is my Dockerhub username. If you are planning to push this image to Dockerhub, it needs to be your Dockerhub username.

Once the build kicks off, you should see logs as shown below

Now, if we run  docker images from inside the docker-node-app folder, we should see

Awesome! Now our image is ready. We need to create a container from this image and voila, our app will be running from inside a container.

Create a container & Run the app

This is the most important step in this process.

To run our Node.js app in a container, we are going to

  1. Download the code base on our local machine
  2. Point the volume of the container to the folder where the code is downloaded and run the image
  3. Open the local copy of code on your host machine text editor
  4. Start making changes

So, let’s move on. Since the codebase is already on our machine, we are not going to download it. If you project files are hosted on a remote server, download them to your local.

Next, we are going to create a container from the image and point the volume to docker-node-app  folder, as shown below

docker-node-app docker run -it -p 3000:3000 -v ${PWD}:/usr/src/app arvindr21/docker-node-app:0.1.0

/usr/src/app  : is the folder in the container

${PWD}  : is the folder on the host machine where the code is present. I am running the above command from the same folder, hence I am passing in the present working directory variable

3000 : is the port mapping between host machine and container.

Note: If you want to run this app as a background process, pass a  -d (daemon) flag as shown below

docker-node-app docker run -itd -p 3000:3000 -v ${PWD}:/usr/src/app arvindr21/docker-node-app:0.1.0

If everything goes well, we should see

Now, navigate to http://localhost:3000/ and we should see something like

Do notice that the values have now updated to the environment variables from the container. Notice the  HOSTNAME property: it is displaying a value of  cb0852847690. Now, from the host machine, run  docker ps and we should see something like

Do note that the  HOSTNAME matches the  CONTAINER ID.

So, ya, it works.

Now to the awesome part, live reloading.

Live reload the container app

Now that we have our app running in a container and the code base on our host machine, let's make a change to the code base on our local machine and see it reflected in the container app.

Open  index.js and update it as shown below
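Carrying on with the earlier sketch, the change could be as small as seeding the response with the current time inside the request handler:

    // New: show the current time at the top of the response
    let output = 'Time now : ' + new Date().toString() + '\n\n';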

I have added a new piece of code to display the time.

Save the file on your host machine and we should see the following in the terminal/prompt

Now, back to http://localhost:3000/ and we should see

Voila! The time appear-th! Keep refreshing to see the value change.

Now, when you are done with the development, you can push the code to the remote repository and shut down your Docker container. And when you are ready to code again, you can bring the container back up.

Adding code files

Okay, so we are able to make changes to an existing file; what about adding a new file? Let's try that out.

Inside the  docker-node-app folder, create a file called  date.js and update it as shown below
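Sticking with the same sketch, date.js simply exports the date logic:

    // date.js – externalized date helper
    module.exports = function getCurrentDate() {
      return new Date().toString();
    };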

All we have done is externalized the logic to get current date. Now, open  index.js and update it as shown below
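And index.js then pulls it in, replacing the inline Date call:

    const getCurrentDate = require('./date');
    // ...inside the request handler
    let output = 'Time now : ' + getCurrentDate() + '\n\n';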

And you should see a message in the prompt/terminal that the server restarted due to changes. Now if we go back to the browser and refresh, we should see that the output does not change, but we are now loading the date from an external file.

A simple and powerful development environment that is both development and deployment friendly.

You can find the code used in this post here: arvindr21/docker-node-app


Thanks for reading! Do comment.
@arvindr21

Electron, WordPress & Angular Material – An Offline Viewer

Imagine that you are at the airport, waiting for your flight, and you hear a couple of guys sitting behind you talking about Nodejs, Angularjs, MongoDB, and you, being a full stack developer and blogger, start eavesdropping. All of a sudden one guy mentions.. "Dude, you should totally check out this awesome blog.. It is titled 'The Jackal of Javascript' & that guy writes sweet stuff about node webkit, ionic framework and full stack application development with Javascript". And the other guy says "cool.." and with his airport wifi connection starts browsing the blog on his laptop.. Within a few moments he loses interest because the pages aren't loading.

What if the owner of the WordPress blog had an offline viewer and the first guy could show all the awesome stuff on the blog to the other guy on his laptop without any internet connection, wouldn't that be cool?

So that is what we are going to do in this post: build an offline viewer for a WordPress blog.

Below is a quick video introduction to the application we are going to build. 

 

Sweet right? 

So, let us stop imagining and let us build the offline viewer.

You can find the completed code here.

Architecture

As mentioned in the video, this is a POC for creating an offline viewer for a WordPress blog. We are using Electron (a.k.a Atom shell) to build the offline viewer. We will be using Angularjs, in the form of the Angular Material project, to build the user interface for the application.

Architecture of the offline viewer

As shown above, we will be using the request node module to make HTTP requests to the WordPress API, to download the blog posts in JSON format and persist them locally using DiskDB.

Once all the posts are downloaded, we will sandbox the content. The word sandbox here refers to containment of user interaction & experience inside the viewer. As in how we are capturing the content and replaying it back to the user. In our case, we are sandboxing the images and links, to control the user experience when the user is online or offline. This is better illustrated in the video above.

The JSON response from the WordPress API would look like : https://public-api.wordpress.com/rest/v1.1/sites/thejackalofjavascript.com/posts?page=1.

If you replace my blog domain name from the above URL with your WordPress blog, you should see the posts from your blog. We will be working with content property of the post object.

Note : We will be using recursion in a lot of places to process collections asynchronously. I am still not sure if this is an ideal design pattern for an application like this.

Prerequisites

The offline viewer is a bit complicated in terms of number of components used. Before you proceed, I recommend taking a look at the following posts

Getting started

We will be using a yeoman generator named generator-electron to scaffold our base project, and then we will add the remaining components as we go along.

Create a new folder named offline_viewer and open terminal/prompt there.

To setup the generator, run the following

npm install yo grunt-cli bower generator-electron

This may take a few minutes to download all the modules. Once this is done, we will scaffold a new app inside the offline_viewer folder. Run

yo electron

You can fill in the questions after executing the above command as applicable. Once the project is scaffolded and dependencies are installed, we will add a few more modules applicable to the offline viewer. Run,

npm install cheerio diskdb request socket.io --save

  • cheerio is to manage DOM manipulation in Nodejs. This module will help us sandbox the content.
  • diskdb for data persistence.
  • request module for contacting the wordpress blog and getting the posts, as well as for sandboxing images.
  • socket.io for realtime communication between the locally persisted data and the UI.

For generating an installer for Mac & some folder maintenance, we will add the below two modules.

npm install trash appdmg --save-dev

The final package.json should look like

Do notice that I have added a bunch of scripts and updated the meta information of the project. We will be using NPM itself as a task runner to run, build and release the app.

To make sure everything is working fine, run

npm run start

And you should see


Build the Socket Server

Now, we will create the server interface for our offline viewer, that talks to the WordPress JSON API.

Create a new folder named app at the root of the project. Inside the app folder, create two folders named server and client. These folders kind of visually demarcate the server code vs. the client code, the server being Nodejs & the client being the Angular application. Inside the Electron shell, any code can be accessed anywhere, but to keep the code base clean and manageable, we will maintain the above folder structure.

Inside the app/server create a file named server.js. This file is responsible for setting up the socket server. Open server.js and update it as below

Things to notice

Line 1: We require the getport module. This module takes care of looking for an available port on the user’s machine that we can use to start the socket server. We cannot pick 8080 or 3000 or any other static port and assume that the selected port would be available on the client. We will work on this file in a moment.

Line 2 : We require the fetcher module. This module is responsible for fetching the content from WordPress REST API, save the data to DiskDB and send the response back.

Line 3 : We require the searcher module. This module is responsible for searching the locally persisted JSON data, as part of the search feature.

Line 7 : As soon as we get a valid port from getport module, we will start a new socket server.

Line 11 : Once a client connects to the server, we will setup listeners for load event and search event.

Line 13 : This event will be fired when the client wants to get the posts from the server. Once the data arrives, we emit a loaded event with the posts

Line 19 : This event will be fired when the client sends a query to be searched. Once the results arrive, we emit the results event with the found posts.

Line 27 : Once the socket server is setup, we will execute the callback and send the used port number back.

Next, create a new file named getport.js inside the app/server folder. This file will have the code to return an unused port. Update getport.js as below

Things to notice

Line 2 : Require the net module, to start a new server

Line 3 : Starting value of the port, from which we need to start checking for an available port

Line 5 : A recursive function that keeps running till it finds a free port.

Line 10 : We attempt to start a server on the port specified. If we are successful, we call the callback function with the port that worked; else, if the server errors out, we call  getPort() again, this time incrementing the port by one before trying to start the server. This goes on till the server creation succeeds.
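Putting those points together, a rough approximation of getport.js (not necessarily the author's exact code) could be:

    // getport.js – find an available port by trying them one after another
    var net = require('net');
    var startPort = 8000; // assumed starting value

    function getPort(cb, port) {
      port = port || startPort;
      var server = net.createServer();

      // If the port is taken, try the next one
      server.on('error', function () {
        getPort(cb, port + 1);
      });

      server.listen(port, function () {
        // Port is free – release it and hand it back
        server.close(function () {
          cb(port);
        });
      });
    }

    module.exports = getPort;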

Next, create a file named fetcher.js inside app/server folder. This file will consist of all the business logic for our application. Update app/server/fetcher.js as below

Things to notice

* Most of the iterative logic implemented here is recursive.

Line 1 : We include the request module

Line 2 : We include the sandboxer module. This module will be used to sandbox the responses that we get from the WordPress API

Line 3 : We include the imageSandBoxer module. This module consists of logic to sandbox images. As in convert the http URL to a base64 one.

Line 4: We include the connection module. This module consists of the code to initialize the DiskDB and export the db object.

Line 6 : We set a few values to be used while processing.

Line 12 : The fetcher is invoked from the server.js passing in the online/offline status of the viewer and a callback. Since we are using recursions, we pass in a third argument to  fetcher()  named skip, which will decide if we need to skip/stop making REST calls to the WordPress REST API.

Line 13, 14 : We query the DiskDB for all the posts and meta data.

Line 18 : If meta data is not present, we create a new entry with the total value set to -1. Total stores the total number of posts that the blog has.

Line 28 : We check if at least one batch of posts has been downloaded and we can skip fetching from the REST API. If so, we check if all the posts have been downloaded. If yes, we call  sendByParts() and send 20 posts at once.

Line 104 :  sendByParts() is a recursive function that sends back 20 posts per second

Line 40 : If not all posts are downloaded, we send what we have downloaded so far and then call  fetcher() passing in the online/offline status, the callback and set skip to true. Now, when  fetcher() is invoked from here, the if condition on line 28 will be false. So, it will move on and download the remaining posts. Before we call  fetcher(), we will set the page value.

Line 52 : If we are loading the posts from page 1 again, we are removing all the posts and downloading again. This is as part of the POC. Ideally, we need to implement a replace logic while saving existing posts instead of removing the file and adding it again.

Line 58 : We check if the user is online and then only start making calls to the WordPress API.

Line 59 : We create a request to fetch the first page posts.

Line 67 : If it is a success, and we have more than one post in the response, we update the total post count that is sent by the API in our meta collection.

Line 76 : Once the meta collection is updated, we invoke the sandboxer(). The sandboxer() takes in a set of posts and sandboxes the URLs and content. And the sandboxed content is sent back.

Line 80 : We save the sandboxed posts to DiskDB and return the same to the UI. After that we increment the page number and call  fetcher(). As mentioned, most of the logic in the application runs recursively.

Line 85 : If we are done downloading all the posts, we reset the page number and invoke the  imageSandBoxer(), which reads all the saved posts from DiskDB and converts the images with http URLs to base64.

Next, we will implement the sandboxer. Create a file named sandboxer.js inside app/server folder. And update it as below

Things to notice

* Most of the iterative logic implemented here is recursive.

Line 1 : We include cheerio

Line 2 : We create a few global scoped variables for recursion.

Line 9 : When the sandboxer is called, we reset the global variables and then call the  process() recursively till all the posts in the current batch are done.

Line 15 : Inside the process(), we check if a post exists. If it does, we start the sandboxing process, if not we execute the callback on line 78

Line 16 : We will access the content property on post and run the content through  cheerio.load(). This provides us with a jQuery like wrapper to work with DOM inside Nodejs. This is quite essential for us to sandbox the content

Line 19 : We sandbox the links. We iterate through each of the links and add an ng-click attribute to it with a custom function. Since I know I am going to use Angularjs, I have attached an ng-click attribute. Apart from that, I remove unwanted attributes and reset the href, so that links do not fire by default.

Line 37 : If the anchor tag has children, I need to process them. In my blog, all the images are wrapped around with an anchor tag because of a plugin I use. I do not want that kind of markup here, where clicking on the image takes the user to the original images. So we clean that up and add custom classes to manage the cursor. All this is sandboxing links.

Line 52 :  For syntax highlighting, I am using an Angular directive named hljs. So, I iterate over all the pre tags in my content and set attributes on them that will help me work with the hljs directive. This is a typical example of integrating a third party Angular directive with the sandboxer.

Line 58 : We sandbox all the iframe URLs which have youtube as their src. By default, all the youtube embed URLs are protocol-relative. They would look like //youtube.com?watch=1234. This will not work properly in the viewer, hence we will convert them to an http URL. Also, I am adding an ng-class attribute on the iFrame whose src has youtube in it. This is to show or hide the iFrame depending on the network status, as demoed in the video.

Line 69 : We sandbox the image tag, replace any additional parameters in the URL. This is specific to my blog. I have a plugin which adds this.

Line 73 : Once the sandboxing is done, we need to update the original HTML with the sandboxed version.

Line 75 : Call  process() on the next post.

To complete the sandboxing, we will be adding a new file named imageSandBoxer.js inside app/server folder. Update imageSandBoxer.js as below

Things to notice

* Most of the iterative logic implemented here is recursive.

imageSandBoxer() will be called only after all the posts are downloaded.

Line 1 : We require the connection module. This module consists of the code to initialize the DiskDB and export the db object.

Line 2 : We require cheerio

Line 5 : We require the request module. Do notice that we are setting encoding to null. This is to make sure we download image response as binary.

Line 12 : If there is no network connection, do not make calls

Line 15 : We call  processPost() by passing in the post and a callback. This recursively runs till all the posts are done processing.

Line 21 : We get all the images from the post and call the  sandBoxImage() by passing in one image at a time. This recursively runs till all the images are done processing.

Line 29 : If there is a valid image, get the image data. Convert the response to a base 64 format on line 32 and update the src attribute.

Line 39 : Once all images in a given post are done processing, we update the data in DiskDB and go to the next post.

Now, we will create a file named connection.js inside app/server folder. Update it as below
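A DiskDB connection is a one-liner; assuming posts and meta are the two collections used by the fetcher, connection.js would be along these lines:

    // connection.js – initialize DiskDB and export the db object
    var db = require('diskdb');
    db = db.connect(__dirname + '/db', ['posts', 'meta']);

    module.exports = db;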

For DiskDB to work, create an empty folder named db inside the app/server folder.

To complete the so-called server, we will create a file named searcher.js inside the app/server folder. Update searcher.js  as below

Things to notice

Line 1 : We require the connection to DB

Line 6 : We fetch all the posts

Line 9 :  We run through each post's title and content and check if the keyword we are searching for exists. If yes, we push it to the results.

Line 19 : Finally we send back the results
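Putting the above together, an approximate sketch of searcher.js:

    // searcher.js – search the locally persisted posts for a keyword
    var db = require('./connection');

    module.exports = function search(query, cb) {
      var results = [];
      var posts = db.posts.find();
      var keyword = (query || '').toLowerCase();

      posts.forEach(function (post) {
        var title = (post.title || '').toLowerCase();
        var content = (post.content || '').toLowerCase();
        if (title.indexOf(keyword) > -1 || content.indexOf(keyword) > -1) {
          results.push(post);
        }
      });

      // Hand the found posts back so the server can emit the results event
      cb(results);
    };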

This completes our server.

Build the Angularjs Client

To work with client side dependencies, we will use the Bower package manager. From the root of the offline_viewer folder, run

bower init

And fill the fields as applicable. This will create a bower.json file at the root of the project. Next, create a file named .bowerrc at the same level as bower.json. Update it as below
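Given that the dependencies end up inside a lib folder, .bowerrc would be along the lines of:

    {
      "directory": "lib"
    }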

This will take care of downloading all the dependencies inside the lib folder.

Next, run

bower install angular-material angular-route roboto-fontface open-sans-fontface angular-highlightjs ng-mfb ionicons jquery --save

If you see a message like Unable to find a suitable version for angular, do as shown below


Quick explanation of the libraries

  • jquery : DOM manipulation.
  • angular-material : Material Design in Angular.js
  • angular-route : The ngRoute module provides routing and deeplinking services and directives for angular apps.
  • angular-highlightjs : AngularJS directive for syntax highlighting with highlight.js
  • ng-mfb : Material design floating menu with action buttons implemented as an Angularjs directive.
  • roboto-fontface : Bower package for the Roboto font-face
  • open-sans-fontface – Bower package for the open-sans font-face
  • ionicons : The premium icon font for Ionic Framework.

The final bower.json would look like

Next, we will update the index.html file present at the root of the offline_viewer folder.

Things to notice

Lines 7 to 13 : We require all the CSS files.

Line 16 : We require the server.js file.

Line 17 : We invoke app(), which starts the socket server on an available port. We get the used port as the first argument in the callback. We set that value on the window object.

Line 27 : We require jQuery using the module format

Lines 30 – 42 : We require all js files needed. We will create the missing files as we go along.

Line 43 : We set up the fab button. This is the search button on the bottom right hand corner of the viewer.

Note : We have not used ng-app on any DOM element. We are going to bootstrap this Angular app manually.

Next, open index.js and update as below

Do notice line nos. 15, 32 and 36.

If you want you can delete index.css present at the root. We will be creating one more inside the client folder later.

Next, we will add the missing scripts. Create a folder named js inside app/client. Inside the js folder, create a file named app.js. Update app.js as below.