Blog of Raivo Laanemets

Stories about web development, freelancing and personal computers.

Fullstack development tools


I sometimes write about specific tools I have found useful in my work, but I have now decided to make a more complete summary. Somewhere around 2006 I wrote a similar summary (now long lost). At that time the toolset was mostly about backend development. The toolset has changed a lot since then and has moved towards the frontend as well.

Most of my projects are 3-12 months long and we work in teams of 3-5 people. The projects are typically custom solutions and are often explorative and experimental in nature. They usually have relatively little specification and upfront planning and rely on the ease of making changes (simplicity!) in the codebase.

I have divided the list into the following sections.

Backend

I currently develop on two server-side platforms: Node.js and SWI-Prolog. I have used Node.js since 2013 and SWI-Prolog since 2008. Before these I used other languages and platforms.

Node.js

Node.js is a server-side JavaScript runtime. I started using it in 2013 after writing large amounts of front-end JavaScript. Node.js gained popularity during that time and my first attempts to use it as a server-side platform turned out rather well too. Since then I have built many web apps on the Express web framework.

Express web framework

The Express framework is a web framework built around "middlewares". A middleware is a piece of functionality (usually in the form of a function) inserted into the request processing stack. Examples include cookie parsing, static file serving, and authentication.

Express covers the following aspects of web programming:

  • Request URL path routing
  • Cookie handling (using a middleware)
  • JSON API responses
  • View (HTML) handling. It is template-agnostic but I prefer EJS
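
A middleware is just a function taking the request, the response, and a `next` callback that hands control to the next middleware in the stack. A minimal sketch (the middleware name and header are hypothetical, not from Express itself):

```javascript
// An Express-style middleware: tags every request with an
// incrementing id and exposes it as a response header.

let counter = 0;

function requestId(req, res, next) {
  counter += 1;
  req.id = counter;                        // available to later middleware
  res.setHeader('X-Request-Id', String(req.id));
  next();                                  // pass control down the stack
}

// In a real app it would be registered with: app.use(requestId);
```

Because a middleware is a plain function, it can be unit-tested with mocked `req` and `res` objects, without starting a server.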

SWI-Prolog

SWI-Prolog is a Prolog implementation. Unlike most other Prolog implementations, it comes with built-in modules for practical programming:

  • HTTP
  • ODBC (MySQL)
  • Unicode
  • Threads
  • JSON
  • HTML

Prolog is a niche language, mostly used in AI research projects, but it has given me some unique opportunities. Besides doing paid work, I maintain the open-source Blog-Core blogging framework for SWI-Prolog, which this blog also runs on.

MySQL

So far I have built most of my projects on MySQL. It is a simple and universal SQL database, both in usage and maintenance, scaling from simple homepages to Facebook-scale social networks.

Frontend

I usually work with the Bootstrap CSS framework and the KnockoutJS library with some jQuery added into the mix.

Browserify

Depending on the type and size of the project, I choose whether to build a JavaScript bundle or even multiple bundles. If a bundle is built, it is handled by:

  • browserify - a CommonJS bundle builder
  • uglifyify - a browserify transform that minifies the bundle output
  • exorcist - a command-line tool that extracts the source map from the bundle
  • brfs - a browserify transform that inlines file contents
  • Makefile - runs the browserify build commands

JSHint

A useful tool is a linter that checks JavaScript files for syntax errors before they are deployed to the server. I prefer JSHint for that. It also checks for missing semicolons, suspicious operators, and other constructs that are not strictly syntax errors but usually indicate some sort of mistake.
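
A hypothetical illustration of the kind of code a linter catches: both functions below are valid JavaScript and run without errors, but the first one behaves surprisingly.

```javascript
// Valid JavaScript that a linter such as JSHint would warn about.

function isTen(value) {
  // '==' coerces types, so '10' == 10 is true. JSHint (with the
  // eqeqeq option) flags this; '===' avoids the surprise.
  return value == 10;
}

function isTenStrict(value) {
  return value === 10;
}
```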

CSS Lint

CSS Lint is a linter for CSS code. It runs a syntax check and warns about common issues, helping to write more readable and maintainable CSS.

OptiPNG & jpegoptim

I optimize the frontend images using OptiPNG and jpegoptim. If the application itself features image uploads then these tools are inserted into the image-processing stack on the server. I have seen OptiPNG reduce image sizes by 50% and more (removing the alpha channel from non-transparent images alone saves 25%). This considerably speeds up page load times.

Less CSS compiler

If there is a considerable amount of CSS besides Bootstrap's own files, I prefer to use the Less compiler, which helps to modularize and minimize the final CSS.

Other platforms

Besides custom web applications, I sometimes work on homepages and e-commerce sites and do web scraping.

WordPress/Magento

If a project fits the scope of a simple blog then it makes sense to use WordPress; if the project is an e-commerce site then Magento works well. These solutions work best when all desired features can be covered by a small number of existing high-quality plugins.

PhantomJS

Some of my projects have required web scraping; last year I had three such projects. My choice for scraping is PhantomJS, a scriptable browser driven by JavaScript. Scraping is sometimes the only way to get a machine-readable dataset (JSON, CSV or an Excel file) out of a web page.
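
Once the in-page extraction (for example via PhantomJS's `page.evaluate`) has produced plain records, converting them to CSV is a small final step. A hypothetical converter, quoting only the fields that need it:

```javascript
// Convert an array of records into CSV, given the column order.
function toCsv(records, columns) {
  const quote = (v) => {
    const s = String(v == null ? '' : v);
    // Quote fields containing commas, quotes or newlines; escape quotes.
    return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
  };
  const header = columns.map(quote).join(',');
  const rows = records.map((r) => columns.map((c) => quote(r[c])).join(','));
  return [header].concat(rows).join('\n');
}
```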

Text editor

My favourite text editor is Kate. Close seconds are Caret (Chrome-based), Sublime Text and Notepad++ (unfortunately Windows-only). Kate is the main reason why I still want a KDE-based desktop.

In the terminal, when managing servers over SSH, I prefer nano.

Project tracking

I prefer project management to be done through project trackers. A project tracker is an environment that allows the team to:

  • Plan and record activities
  • Store documents and other files
  • Integrate with code repo
  • Cross-reference everything
  • Share everything

Redmine

My more experienced clients usually want the project to use their preferred solution, like GitHub, Bitbucket or Jira. If the client has no preferred solution then we use Redmine. Redmine is a self-hosted project tracker and is way more powerful and flexible than most commercial and Open Source solutions. I also use Redmine as my own personal data organizer and maintain an installation for myself and clients.

Build system

Most of my projects use Make (GNU Make) as the build system. I use Make mostly as a way to write down commands for individual tools, such as bundlers, test runners and linters. Languages without compilation do not require much from the build system.

I generally add commands into the Makefile for these:

  • Single-command deploy (into test and production)
  • Running required test infrastructure
  • Running tests
  • Building a docker image (calls docker which uses Dockerfile)
  • Linting JavaScript files (jshint)
  • Linting CSS files (csslint)
  • Building frontend JavaScript bundle(s)
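
A hypothetical Makefile along these lines (target and file names are illustrative, not taken from any of my projects):

```makefile
check: lint test

lint:
	jshint src/*.js
	csslint public/css

test:
	mocha tests

bundle:
	browserify src/app.js -o public/js/bundle.js

.PHONY: check lint test bundle
```

The targets act mostly as a memo of the project's commands: `make check` before a commit, `make bundle` before a deploy.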

Testing

I usually write two types of automated tests: unit tests and acceptance tests.

Mocha & plunit

Unit tests, being conceptually lower level, are good for libraries and algorithmic pieces. I avoid testing trivial properties, as the amount of unit-testing code can grow very large. For complete applications I rarely write unit tests; I use them mostly for libraries.

  • Mocha for unit-testing Node.js projects
  • plUnit for unit-testing SWI-Prolog projects

CasperJS

In web application projects I usually implement the main user stories as CasperJS test cases. CasperJS is a testing framework built on top of a scriptable web browser (PhantomJS). This is the highest level of testing: it involves all layers of the application, from the web server down to databases and third-party integrations, which makes it quite effective.

Mail testing

MailDev is a local SMTP server with a REST API, which makes it a perfect companion to CasperJS. A CasperJS test script executes some action that causes mail to be sent; whether the mail was actually sent is then trivial to check through the MailDev API.
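
MailDev exposes received mail as a JSON list over its REST API. A small helper for looking up a message in that list; the field names follow MailDev's JSON format as I understand it, so treat this as a sketch:

```javascript
// Find a received message by recipient address and a substring of
// the subject. Returns the message object or null.
function findMail(emails, recipient, subjectPart) {
  return emails.filter((mail) =>
    mail.to.some((to) => to.address === recipient) &&
    mail.subject.indexOf(subjectPart) !== -1
  )[0] || null;
}
```

A test script fetches the list from the API after triggering the action, then asserts that `findMail` returns a match.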

Manual testing

There are not many tools to mention for manual testing, but I sometimes click through applications in IE 11 running on Windows 7 in VirtualBox (my main desktop runs Slackware 14.1). I currently write ES5 JavaScript and browser discrepancies are not that bad, especially with most projects targeting IE 11+. Some years ago I developed a desktop application and VirtualBox plus manual testing played a much bigger role there.

Deployment and hosting

I deploy exclusively to Linux-based systems and strongly prefer Debian. The server machines come either from a PaaS (PHP, Node.js) or an IaaS (generic VPS) provider. In both cases I favor these remote access tools:

  • SSH - remote terminal/shell access
  • rsync - efficient difference-based file copying
  • git - version control

All of them are command-line based, which makes them easy to automate. Automation enforces a correct workflow, minimizes human error, and frees up time for the activities that cannot be automated.

Supervisor

I run my application processes under the Supervisor process manager. It is compatible with all widely deployed init systems, including SysV init and systemd, and it runs in Docker too.
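
A hypothetical Supervisor program section (paths and the program name are illustrative):

```ini
[program:myapp]
command=node /srv/myapp/server.js
directory=/srv/myapp
autostart=true
autorestart=true
stderr_logfile=/var/log/myapp.err.log
environment=NODE_ENV="production"
```

With `autorestart=true` the process is restarted automatically after a crash, which is the main thing I want from a process manager.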

Papertrail

I prefer the Papertrail logging service. The logs are sent over the syslog protocol (via rsyslog) and are monitored in real time for specific patterns. I usually monitor the application error log and the web server access log; Papertrail makes it possible to correlate the two, which makes debugging quite easy.

Server uptime

I also monitor web server responses for a pre-selected set of URLs. This is handled by a shell script that writes an OK/FAIL line for each check into a log file. The log file itself is monitored by Papertrail.
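
The idea behind the log format can be sketched as follows (hypothetical, and in Node for illustration; the real script is plain shell):

```javascript
// Format one check result as an OK/FAIL log line. Redirects (3xx)
// are treated as a passing response here; adjust to taste.
function checkLine(url, statusCode) {
  const ok = statusCode >= 200 && statusCode < 400;
  return (ok ? 'OK' : 'FAIL') + ' ' + statusCode + ' ' + url;
}

// Papertrail can then alert on any log line containing "FAIL".
```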

Docker

Docker is an application container system. It packages an application together with its underlying platform (such as a Node.js application installed inside Debian) in an efficient way. While I do not use Docker for production deployments yet, I have used it for all my test deployments in the last two years. I expect Docker to become the main deployment method for custom applications in the near future, replacing lower-level VPS setups in many use cases.

Offsite backup scripts

Most of the servers that I maintain have offsite backups on my central backup server. I have never had a hosting server crash, but there have been cases where a single overwritten file had to be restored. Using my own backups in such cases has been a lot faster than restoring a full backup or asking the hosting provider to do it.

I use a combination of shell scripts with remote rsync and SSH commands to incrementally back up the servers. The backups are stored on a LUKS-encrypted disk partition.

More stuff...

These are the main tools I use. There are many more, but I did not write about them, just to be able to finish the article. I did not document alternatives, although I have tried lots of them. I do not claim that these tools are the best, or that their combination is good or guarantees success. However, they have worked well for me.

