Early Return in Clojure
  •  03 January 2016
  •  misc 

Okay
Okay
Okay…
It’s better to break your function into smaller ones, each serving one simple purpose. Clojure is functional, isn’t it?

I’m just kidding. Sometimes it’s really hard to write code like that. Consider this example: I have a function for validating whether a string is a valid date-time string. If it’s nil or blank, just skip it; otherwise, try parsing it to see if it’s okay.

(defn validate-date-time [date-time]
  (if (nil? date-time) true
      (if (blank? date-time) true
          (try (f/parse formatter date-time)
               true
               (catch Exception e false)))))

Nested, nested and nested. If this still looks simple and easy to read to you, try this one, which also needs to check whether the date time falls between 1970 and 2030

(defn- validate-date-time [date-time]
  (if (nil? date-time) true
      (if (blank? date-time) true
          (let [date-time (f/parse formatter date-time)]
            (if (nil? date-time) false
                (if (before-1970? date-time) false
                    (if (after-2030? date-time) false
                        true)))))))

Ehhh…
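
The rest of the fix is behind the Read more link, but as a hedged aside (not the solution presented later in the post), one common way to flatten these checks without early return is cond, reusing the same helpers (blank?, f/parse, formatter, before-1970?, after-2030?) from the snippets above:

(defn- validate-date-time [date-time]
  (cond
    ;; nothing to validate
    (nil? date-time)   true
    (blank? date-time) true
    ;; otherwise parse it and check the range
    :else (let [parsed (f/parse formatter date-time)]
            (cond
              (nil? parsed)         false
              (before-1970? parsed) false
              (after-2030? parsed)  false
              :else                 true))))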

Read more

Recently, I have been working with projects that use Docker on Mac OS via docker-machine. However, docker-machine currently does not support a fixed IP address for the machine, so every time the virtual machine boots up, it is assigned a new IP address. That makes accessing the Docker containers running inside the machine a bit annoying, since I have to run the docker-machine ip command every time to retrieve the machine’s IP and connect using that IP, like http://192.168.1.100:8888.

One simple solution is to map a fixed host name to that IP in the hosts file. This little shell script uses sed and tee to update the host name entry with the docker machine’s IP every time you boot up that virtual machine.

#! /usr/bin/env sh

# remove the old hostname entry from the hosts file
# (BSD sed on Mac OS needs an explicit suffix after -i and uses [[:<:]] [[:>:]] as word boundaries)
sudo sed -i '' '/[[:<:]]hostname[[:>:]]/d' /etc/hosts

# insert the new ip
echo "$(docker-machine ip machine-name) hostname" | sudo tee -a /etc/hosts

# export the docker environment variables for the current shell
eval "$(docker-machine env machine-name)"

You will need to replace hostname with the server name you want to assign to that docker machine and replace machine-name with the name of the docker machine.

This script first finds and removes the old entry containing hostname from the hosts file. Next, it appends a new entry to the hosts file, evaluating the docker-machine ip command to get the new IP. Finally, it exports the environment variables for the current session so that docker and docker-compose work properly. Keep in mind that you need to run this script with source for the docker-machine env command to take effect in the current shell.
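
For example, assuming the script is saved as update-docker-host.sh (the file name here is just an illustration):

# run it with source (or the . builtin) so the exported
# variables stay in the current shell
source ./update-docker-host.sh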

Read more

In my previous post Using Gulp with Browserify and Watchify - Updated, I presented a solution for setting up Gulp with Browserify and Watchify using vinyl-source-stream. However, that method no longer works since Browserify was updated to version 8.0.2. This post demonstrates a new, updated solution that has been tested on Browserify 12.0.1 and Watchify 3.6.0.

Structure

In my project, I have a folder named js containing all the source .js files and another folder called dist for outputting the bundles after the build.

├─┬ js
│ ├─┬ page1
│ │ ├── display.js
│ │ └── model.js
│ ├─┬ page2
│ │ └── controller.js
│ ├─┬ util
│ │ ├── validation.js
│ │ └── notification.js
│ ├── page1.js
│ └── page2.js
├── dist
└── gulpfile.js
Read more

nvm is my favorite tool for installing and working with Nodejs. I can install several Nodejs versions on one machine for different projects without them affecting each other, because nvm installs Node locally (without root privileges) for each project user. However, since nvm is a collection of shell functions, it can cause problems in non-interactive environments (for example, in automation tools like Ansible).

I found some workarounds for it, which I will present in this post. Some of them are a bit ugly, but at least they solve the problem. I’m still trying to find the best solution and will post it here when available.

Install Node with nvm

As I mentioned before, nvm is a collection of shell functions, so if you call nvm directly, you will receive an error saying that the nvm executable cannot be found. I tried sourcing it in .profile and using Ansible’s shell module but still got the error. Finally, I came up with the solution of sourcing the nvm script directly every time I need to run nvm, using one specified shell (bash in this case). The Ansible tasks for installing Nodejs using nvm will look like this

# nvm_user: the user with .nvm installed
# node_version: placeholder variable for the Node version to install

- name: install nodejs using nvm
  sudo: yes
  sudo_user: "{{ nvm_user }}"
  command: bash -c '. ~/.nvm/nvm.sh; nvm install {{ node_version }}'

- name: set default node version
  sudo: yes
  sudo_user: "{{ nvm_user }}"
  command: bash -c '. ~/.nvm/nvm.sh; nvm alias default {{ node_version }}'
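
Any later task that needs node or npm in the same non-interactive environment has to go through the same sourcing trick. A hedged sketch, with the application path /srv/app being purely illustrative:

- name: install app dependencies with npm
  sudo: yes
  sudo_user: "{{ nvm_user }}"
  command: bash -c '. ~/.nvm/nvm.sh; cd /srv/app && npm install'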
Read more

Neo4j and Emacs

Neo4j is one of the most powerful graph databases. However, support for Neo4j in Emacs is still limited. Luckily, with the help of comint-mode, we can easily create a custom inferior shell for executing queries and retrieving the results from a running Neo4j instance. This article from Mastering Emacs illustrates how to write your own command interpreter in Emacs. Based on that guide, I have developed a new package for Emacs to simplify the steps of composing and testing Cypher queries. This post summarizes my experience and my setup for interacting with Neo4j from Emacs.

Cypher mode

First, of course, you need a major mode for displaying and editing Cypher queries. You can easily install cypher-mode using package.el. The mode automatically associates itself with files having the .cyp and .cypher extensions, so you don’t need to do anything after installing it. It also supports basic indentation besides the must-have syntax highlighting.

cypher-mode
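
For reference, a minimal init.el snippet for the installation, assuming a package archive that carries cypher-mode (such as MELPA) is already listed in package-archives:

;; install cypher-mode once if it is not there yet
(require 'package)
(package-initialize)
(unless (package-installed-p 'cypher-mode)
  (package-refresh-contents)
  (package-install 'cypher-mode))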

Read more

Firewall on Linux server

I’m working on a project that uses Clojure. Usually, for this kind of project, I open up an nREPL server for inspecting the web app while fixing bugs. The problem is that the nREPL server seems to accept all kinds of connections, local and external, without prompting for a password. I also have a Neo4j instance running the graph database used by the website on another port, and sometimes I need access to its web interface to look up the data inside the graph. That led me to set up a firewall on my VPS to block all untrusted connections.

iptables seems to be the most popular firewall tool on Linux servers out there. However, working with complex iptables rules through the command line can be a bit of a struggle. Fortunately, there is ferm, a utility that helps you maintain complex firewalls without the trouble of rewriting the complex rules over and over again. It allows the entire firewall rule set to be stored in a separate file and loaded with one command.
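
To give an idea of what that file looks like, here is a minimal ferm.conf sketch in the spirit of what this post describes (illustrative only, not the exact rule set used here): drop everything inbound by default and only accept loopback traffic, established connections, SSH and HTTP.

table filter {
    chain INPUT {
        policy DROP;

        # allow established/related connections and local traffic
        mod state state (ESTABLISHED RELATED) ACCEPT;
        interface lo ACCEPT;

        # only SSH and HTTP are reachable from outside;
        # nREPL and the Neo4j web interface stay blocked by the DROP policy
        proto tcp dport (ssh http) ACCEPT;
    }
    chain OUTPUT policy ACCEPT;
    chain FORWARD policy DROP;
}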

Organize ferm with Ansible

Usually, the Ansible config for each project consists of some roles that are reusable among projects and the custom tasks for that project (as I mentioned before in the post Vagrant and Ansible - Organize for reusability). In order to apply that structure, we need an Ansible role that installs ferm and generates a default config file for it. That config file should have a directive for including per-project config files. Each project then contains some Ansible tasks that define the particular firewall rules for that project.
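
As a hedged illustration of the per-project side, a task like the following could drop a project-specific rule file into a directory that the default config includes (the paths, file names and the restart ferm handler are all assumptions, not the exact setup from this post):

- name: install project firewall rules
  sudo: yes
  template: src=templates/myproject.ferm dest=/etc/ferm/ferm.d/myproject.conf
  notify: restart ferm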

Read more

Update: this method is outdated again. The new solution is presented in Using Gulp with Browserify and Watchify - Update Nov 2015.

Old method

In my previous post Using Watchify with Gulp for fast Browserify build, I demonstrated how to use Browserify and Watchify in Gulp to automate the build process. The steps are to create a Browserify bundle and return it through a chain of pipes inside a gulp task, like this

var b = browserify({
  cache: {},
  packageCache: {},
  fullPaths: true
});
b = watchify(b);
b.add('./main.js');
return b.bundle()
    .pipe(uglify())
    .pipe(gulp.dest(dest));

We have to manually add the main.js file to Browserify, so things become ugly and complex when you have multiple bundles to build, not just one main.js file. It would be much better if we could do something like this, passing the source files as a glob as we usually do with Gulp

return gulp.src(source)
    .pipe(buildBrowserify)
    .pipe(uglify())
    .pipe(gulp.dest(dest));

In this post, I will illustrate how to create that buildBrowserify function.
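
The actual implementation is after the jump, but to sketch the idea: such a function can return an object-mode transform stream (for example with through2) that runs Browserify on each incoming vinyl file and replaces its contents with the bundled output. A hedged, simplified version, used as .pipe(buildBrowserify()):

var through = require('through2');
var browserify = require('browserify');

function buildBrowserify() {
  return through.obj(function(file, enc, done) {
    var self = this;
    // bundle this entry file and swap the file contents with the result
    browserify(file.path).bundle(function(err, contents) {
      if (err) { return done(err); }
      file.contents = contents;
      self.push(file);
      done();
    });
  });
}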

Read more

Vagrant and Ansible

Recently, I started a personal project built with Clojure, a website for managing family records and visualizing a pedigree tree. The problem is that I need an automation tool for setting up the development environment and deploying my website to the server. And yes, Vagrant is one of the best solutions out there. I have used Vagrant with Chef before and found that Chef is a bit complex and requires installing Ruby and Chef on the server before you can deploy anything.

After looking at all of Vagrant’s provisioning solutions, I decided to give Ansible a try because it is simple and operates over SSH, which means I need to install almost nothing on the server to use it (of course, you can still install Ansible on the server and do a local provision there, but it’s only one more command to type).

This blog post is a summary of my experience with Vagrant and Ansible: how I set up my development environment and how I reuse the code for other types of projects. The sample project can be found at https://github.com/tmtxt/clojure-web-skeleton. Before you move on to the next part, take a look at the basic usage of Vagrant and Ansible.

Basic Structure

A project with Vagrant and Ansible will look like this

├── Vagrantfile
├── ansible
│   ├── group_vars
│   │   └── all
│   ├── main.yml
│   ├── roles
│   │   ├── apt
│   │   ├── emacs
│   │   ├── git
│   │   ├── nvm
│   │   ├── oraclejdk
│   │   ├── postgres
│   │   ├── virtualenv
│   │   └── zsh
│   └── templates
│       ├── db_config.clj
│       └── system_config.clj
├── project.clj
├── src
└── static
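
To connect the two, the Vagrantfile only needs to point the ansible provisioner at the playbook above. A minimal sketch (the box name here is just an example):

Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"

  # provision the machine with the playbook from the ansible folder
  config.vm.provision "ansible" do |ansible|
    ansible.playbook = "ansible/main.yml"
  end
end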
Read more

Cover

The official PostgreSQL documentation is a good resource for researching PostgreSQL features. However, the documentation on its home page is exported to PDF format only, which makes it hard to read on other devices since the text is not reflowable. Luckily, Peter Eisentraut made a small commit that adds an epub target to the documentation build for exporting an epub file.

Here are the build instructions and the download links to the epub files that I have built.
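
Roughly, the build boils down to the following, assuming the PostgreSQL source tree and the documentation toolchain are already in place (the exact steps are covered after the jump):

# from the root of the PostgreSQL source tree
./configure
cd doc/src/sgml
make epub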

Read more

Image Blend modes and HTML Canvas

Image blend modes are the methods used to determine how two image layers are blended (mixed) with each other. Since digital images are stored as pixels, which are represented by numerical values, there are a large number of possible blending methods based on mathematical functions. With the help of the Canvas API, we can now easily retrieve the images, extract all their pixels, apply a blending effect to calculate the new blended pixels, and export and display the new image on the web or save it to the server.

In this post, I will demonstrate the basic steps for applying image blend modes using the HTML canvas API.

The Logic

Wikipedia already lists some popular blending methods in its Blend modes article.

Assume that we have two image layers, a top layer and a bottom layer. Let a be the value of a color channel in the underlying layer and b the value of the corresponding channel in the upper layer; the new value of that channel is then given by a blending function ƒ(a, b).

The steps for generating the new blended image from the two images are

  • First, retrieve the two images.
  • Create two separate canvases of the same size and draw the two images onto the corresponding canvases.
  • Get all the pixel data from the two canvases.
  • Loop through each pixel, apply the blending function to compute the new pixel and store it in an array.
  • Create a new canvas of the same size and use the blended pixel data to draw the blended image on it.
  • Export the canvas to an image file to save to the server or to the local computer.

Note: I mentioned that we need to create two canvases of the same size. However, that is just to simplify the demonstration. You can still create two canvases of different sizes, but you will need to adjust the calculation a bit.
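
A condensed sketch of those steps in JavaScript, using multiply (ƒ(a, b) = a × b / 255) as the example blending function; the two input canvases are assumed to already contain the drawn images and to have the same size:

function blendCanvases(bottomCanvas, topCanvas) {
  var width = bottomCanvas.width, height = bottomCanvas.height;
  var bottom = bottomCanvas.getContext('2d').getImageData(0, 0, width, height);
  var top = topCanvas.getContext('2d').getImageData(0, 0, width, height);

  // the output canvas for the blended image
  var out = document.createElement('canvas');
  out.width = width;
  out.height = height;
  var ctx = out.getContext('2d');
  var result = ctx.createImageData(width, height);

  // pixel data is a flat array of r, g, b, a values
  for (var i = 0; i < bottom.data.length; i += 4) {
    for (var c = 0; c < 3; c++) {
      var a = bottom.data[i + c]; // channel of the underlying layer
      var b = top.data[i + c];    // channel of the upper layer
      result.data[i + c] = Math.round(a * b / 255);
    }
    result.data[i + 3] = bottom.data[i + 3]; // keep the bottom layer's alpha
  }

  ctx.putImageData(result, 0, 0);
  return out; // can be exported later with out.toDataURL()
}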

Read more