Managing Angular Dependencies with Webpack

AngularJS is a powerful library that allows you to create some amazing applications with relative ease. Unfortunately, as a project grows, organization and structure start to become an issue. I’ve played around with a few bundlers and decided that Webpack fit my needs on my large AngularJS projects. Below I’ll show you how to manage your JS code with Webpack while automatically including new files, with the ability to output both expanded code for development and minified code for production.

Take the structure of a typical AngularJS application.

Structure

Project_Folder
├── js
│   ├── controllers
│   │   ├── BaseController.js
│   │   └── IndexController.js
│   │
│   ├── directives
│   │   ├── ScrollTo.js
│   │   └── SlideOut.js
│   │
│   ├── filters
│   │   ├── Ellipsis.js
│   │   └── Ordinal.js
│   │
│   ├── services
│   │   ├── Session.js
│   │   └── Validate.js
│   │
│   ├── templates
│   └── vendor
│
├── partials
│   ├── one.html
│   └── two.html
│
├── css
├── img
│
└── index.html

Nothing out of the ordinary. Directives, controllers, services, and filters are located in the appropriate folder to help keep everything nice and organized. Now it’s time to include all of your JS in your HTML page.

<script src="/js/vendor/jquery.min.js"></script>
<script src="/js/vendor/angular.min.js"></script>
<script src="/js/vendor/angular-animate.min.js"></script>
<script src="/js/vendor/angular-cookies.min.js"></script>
<script src="/js/vendor/angular-resource.min.js"></script>
<script src="/js/vendor/angular-ui-router.min.js"></script>
<script src="/js/controllers/BaseController.js"></script>
<script src="/js/controllers/IndexController.js"></script>
<script src="/js/directives/ScrollTo.js"></script>
<script src="/js/directives/SlideOut.js"></script>
<script src="/js/services/Session.js"></script>
<script src="/js/services/Validate.js"></script>

Wow. That’s a lot of boilerplate to include all of your files, and this application is relatively small. Every time you add or remove a component you’ll need to remember to update your HTML. It’s not the worst problem in the world, but it adds yet another step to coding your application. For performance reasons, you also want to keep the number of requests in production to a minimum. This is the point where a lot of developers reach for a build system like Gulp, Grunt, or my favorite, Make (I’m a bit of a sadist), to concatenate and minify all of the development files so that production only needs a single request. Additionally, the build system would need to replace the above block of includes with a reference to the new, minified file.

I find this approach to be clunky and error prone. Here’s how you would do the same thing with Webpack.

entry.js

// /js/entry.js
// Expose any global variables you need for your application
window.$ = window.jQuery = require('./vendor/jquery.min');

// Include Angular and its modules
require('./vendor/angular.min');
require('./vendor/angular-animate.min');
require('./vendor/angular-cookies.min');
require('./vendor/angular-resource.min');
require('./vendor/angular-ui-router.min');

// Main Application
require('./app');

// Services
require('./services/Session');
require('./services/Validate');

// Directives
require('./directives/ScrollTo');
require('./directives/SlideOut');

// Controllers
require('./controllers/BaseController');
require('./controllers/IndexController');
Now you can start Webpack and it will regenerate bundle.js every time you save, add, or delete a JS file in your application. This is the only JS file you will need to include in your HTML. Additionally, Webpack can be configured to produce expanded code for development environments and minified code, with comments stripped, for production use.

<script src="/js/bundle.js"></script>
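A minimal webpack.config.js along these lines might look like the sketch below. The file paths and the use of `UglifyJsPlugin` are assumptions about your layout and Webpack version; adjust them to match your project.

```javascript
// webpack.config.js (sketch; paths and plugin usage are assumptions)
var webpack = require('webpack');

var production = process.env.NODE_ENV === 'production';

module.exports = {
  entry: './js/entry.js',
  output: {
    path: __dirname + '/js',
    filename: 'bundle.js'
  },
  // Minify and strip comments only for production builds
  plugins: production ? [
    new webpack.optimize.UglifyJsPlugin({ output: { comments: false } })
  ] : []
};
```

Run `webpack --watch` during development to rebuild bundle.js on every save, and `NODE_ENV=production webpack` to produce the minified build.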

You’ll still need to remember to update your entry.js file every time you add or remove a file. Luckily, there’s a simple trick to make this even easier.

// jQuery
window.$ = window.jQuery = require('./vendor/jquery.min');

// Angular and its modules
require('./vendor/angular.min');
require('./vendor/angular-animate.min');
require('./vendor/angular-cookies.min');
require('./vendor/angular-resource.min');
require('./vendor/angular-ui-router.min');

// Main Application
require('./app');

// Services
var services = require.context('./services', true, /\.js$/);
services.keys().forEach(services);

// Directives
var directives = require.context('./directives', true, /\.js$/);
directives.keys().forEach(directives);

// Filters
var filters = require.context('./filters', true, /\.js$/);
filters.keys().forEach(filters);

// Controllers
var controllers = require.context('./controllers', true, /\.js$/);
controllers.keys().forEach(controllers);

When you want to create a new controller, you just create the file and Webpack takes care of everything else.

Logentries Library for the Go Programming Language

After trying out a bunch of online logging services I’ve really come to enjoy and rely on Logentries, so I whipped up a library for Go to support the service. It uses token-based input and can optionally use SSL.

Take a look below.

Github

Installation

go get github.com/robottokauf3/go-logentries

Basic Usage

    package main

    import (
        "github.com/robottokauf3/go-logentries" // imported as package logentries
    )

    func main() {
        // Using a standard TCP connection (error handling omitted for brevity)
        logger, _ := logentries.New("1a2b3c4d-5e6f-7g8h-9i0j-1k2l3m4n", false)
        logger.Debug("Important Debugging Message")

        // Using an SSL connection
        sslLogger, _ := logentries.New("1a2b3c4d-5e6f-7g8h-9i0j-1k2l3m4n", true)
        sslLogger.Debug("Secret Debugging Message")
    }

Recursively Include Routes in Node.js and Express

Organizing Node.js applications can be a chore. They either have massive route / controller files which cover way too many concerns or they are broken into smaller files with 50 million require statements.

Consider the following folder structure:

Project_Folder
├── config
│   └── index.js
├── models
│   ├── Model1.js
│   ├── Model2.js
│   └── Model3.js
├── public
├── routes
│   ├── admin
│   │   └── index.js
│   ├── dash
│   └── index.js
├── views
│   ├── admin
│   │   ├── index.ect
│   │   └── accounts.ect
│   ├── dash
│   │   └── index.ect
│   └── layout.ect
├── package.json
├── README.md
└── server.js

This is great for separating individual components into easy-to-manage files, each focused on a single task. Unfortunately, on larger projects you might end up with something like the following block in your server.js file:

require('./routes/admin/index.js');
require('./routes/admin/accounts.js');
require('./routes/admin/campaigns.js');
require('./routes/admin/idontevenknow.js');
require('./routes/dash/index.js');
require('./routes/dash/widgets.js');
require('./routes/dash/williteverend.js');
// ...

That’s a lot of boilerplate to simply include all of your routes. For the last few months I’ve been using the following routes/index.js file to save me from require spam:

// routes/index.js
var fs = require('fs'),
    validFileTypes = ['js'];

var requireFiles = function (directory, app) {
  fs.readdirSync(directory).forEach(function (fileName) {
    // Recurse if directory
    if(fs.lstatSync(directory + '/' + fileName).isDirectory()) {
      requireFiles(directory + '/' + fileName, app);
    } else {

      // Skip this file
      if(fileName === 'index.js' && directory === __dirname) return;

      // Skip unknown filetypes
      if(validFileTypes.indexOf(fileName.split('.').pop()) === -1) return;

      // Require the file.
      require(directory + '/' + fileName)(app);
    }
  });
};

module.exports = function (app) {
  requireFiles(__dirname, app);
};

The script recursively searches your ‘routes’ directory and loads all of your controller files. To load all of your controllers you simply need to require the ‘routes’ directory:

require('./routes')(app);

Your route / controller files would look something like this:

// routes/admin/accounts.js
module.exports = function (app) {
  app.get('/admin/accounts', function (req, res) {
    // Controller logic
    res.render('admin/accounts.ect');
  });
}

Enjoy!

Mkdir, chown, chgrp, and set permissions in one command.

I’ll probably revisit this script at another time to make it more flexible but it’s late so screw it. Have fun!

Go from this:

mkdir -m 777 directory_name
chown -R owner directory_name
chgrp -R group directory_name

To:

mkdirplus directory_name owner group 777

Just save the following in an executable file in your $PATH.

#!/bin/bash

helptext()
{
cat << EOF
usage: $0 DIRECTORY OWNER GROUP PERMISSIONS

Create a directory, set owner, group, and permissions at once.  Awesome.
EOF
}

while getopts "h" OPTION
do
  case $OPTION in
    h)
      helptext
      exit 1
      ;;
  esac
done

if [ "$4" ]
then
  mkdir -m "$4" "$1"
else
  mkdir "$1"
fi

if [ "$3" ]
then
  chgrp -R "$3" "$1"
else
  # No group given; fall back to using the owner as the group
  chgrp -R "$2" "$1"
fi

chown -R "$2" "$1"
exit 0

Using Sass with Sails.js

If you’re not using Sass in your projects you should drop everything, read about it at http://sass-lang.com/, and start using it immediately. I’ve been using Sails.js a lot lately and enjoying it immensely. Unfortunately, while it supports Sass’s cousin, Less, you have to make some configuration changes to get Sass to work.

Luckily you can do it in 4 easy steps.

1) Install Grunt Sass plugin.

npm install grunt-contrib-sass --save

2) Load the Sass plugin in Gruntfile.js

  // Get path to core grunt dependencies from Sails
  var depsPath = grunt.option('gdsrc') || 'node_modules/sails/node_modules';
  grunt.loadTasks(depsPath + '/grunt-contrib-clean/tasks');
  grunt.loadTasks(depsPath + '/grunt-contrib-copy/tasks');
  grunt.loadTasks(depsPath + '/grunt-contrib-concat/tasks');
  grunt.loadTasks(depsPath + '/grunt-sails-linker/tasks');
  grunt.loadTasks(depsPath + '/grunt-contrib-jst/tasks');
  grunt.loadTasks(depsPath + '/grunt-contrib-watch/tasks');
  grunt.loadTasks(depsPath + '/grunt-contrib-uglify/tasks');
  grunt.loadTasks(depsPath + '/grunt-contrib-cssmin/tasks');
  grunt.loadTasks(depsPath + '/grunt-contrib-less/tasks');
  grunt.loadTasks(depsPath + '/grunt-contrib-coffee/tasks');
  grunt.loadTasks('node_modules/grunt-contrib-sass/tasks'); // Add this line
  // Change the path above to match the installation directory if not local

3) Add Sass:Dev task

    sass: {
      dev: {
        options: {
          style: 'expanded' // Set your preferred style for development here.
        },
        files: [{
          expand: true,
          cwd: 'assets/styles/',
          src: ['*.scss', '*.sass'], // Feel free to remove a format if you do not use it.
          dest: '.tmp/public/styles/',
          ext: '.css'
        }, {
          expand: true,
          cwd: 'assets/linker/styles/',
          src: ['*.scss', '*.sass'], // Feel free to remove a format if you do not use it.
          dest: '.tmp/public/linker/styles/',
          ext: '.css'
        }
        ]
      }
    },

4) Update the ‘compileAssets’ task.

  grunt.registerTask('compileAssets', [
    'clean:dev',
    'jst:dev',
    'less:dev',
    'sass:dev', //Add this line
    'copy:dev',    
    'coffee:dev'
  ]);

and the ‘prod’ task.

  grunt.registerTask('prod', [
    'clean:dev',
    'jst:dev',
    'less:dev',
    'sass:dev', //Add this line
    'copy:dev',
    'coffee:dev',
    'concat',
    'uglify',
    'cssmin',
    'sails-linker:prodJs',
    'sails-linker:prodStyles',
    'sails-linker:devTpl',
    'sails-linker:prodJsJADE',
    'sails-linker:prodStylesJADE',
    'sails-linker:devTplJADE'
  ]);

While running ‘sails lift‘, any changes made to your scss / sass files will be compiled and included in your application.

This tutorial was written using Sails 0.9.7. Your mileage may vary.

Configuring PostgreSQL with Sails.js

I’ve been playing with Sails.js a lot lately and was thrilled that it supported my favorite database, PostgreSQL. Unfortunately, the documentation on configuring the adapter left a lot to be desired.

Installing the PostgreSQL Adapter

Installation is extremely easy thanks to NPM.  Just enter the following command from the root directory of your project.


npm install sails-postgresql --save

Configuring the Adapter

1) Comment out or remove the code in the config/adapters.js file.

2) Update your local.js file to look like the following:

//config/local.js

module.exports = {

  port: process.env.PORT || 1337,
  
  environment: process.env.NODE_ENV || 'development',

  adapters: {

    'default': 'postgres',

    postgres: {
      module   : 'sails-postgresql',
      host     : 'localhost',
      port     : 5432,
      user     : 'USERNAME',
      password : 'SUPER_SECURE_PASSWORD',
      database : 'DATABASE_NAME',

      schema: true //This makes sure that sails matches 
                   //the database schema to your models.
    }

  }

};

Technically, you can place the configuration in the adapters.js file, but I do not like this approach.

The local.js file is included in the .gitignore file by default so there’s less chance of accidentally publishing your username and password. Also, if you are working in a team or deploying to multiple machines you don’t need to worry about loading the wrong settings from the adapters.js file.

Compass – Set Cache Folder Location

By default Compass will place the .sass-cache folder in the project root directory. I prefer to keep all caches and temporary file builds for my projects in the .tmp folder.

Luckily, Compass has the ability to set the cache folder location but it appears to be undocumented.

Simply add the following to the config.rb for the project:

cache_path = 'PREFERRED_PATH/.sass-cache'

# Eg.

cache_path = '.tmp/.sass-cache'

Note: The cache_path is relative to the config.rb file, not the project_path.
Tested with Compass v0.12.2.

Vagrant Error: undefined method forward_port

I ran into an interesting issue while using VirtualBox 4.2.10 with Vagrant 1.1.0 on a Windows 7 machine.

All of the Vagrant documentation I found was telling me to use the following code in my VagrantFile to forward port 80 on the guest to port 8080 on the host machine.

config.vm.forward_port 80, 8080

When trying to bring the box up the following error was being thrown:

Vagrantfile:11:in 'block in <top (required)>': 
undefined method 'forward_port' for 
#<VagrantPlugins::Kernel_V2::VMConfig:0x28b2040> (NoMethodError)

I found tons of forum and blog posts which were stating that this was a problem with VirtualBox 4.2.10 and the fix was to roll back to the 4.1 series.

If you are using Vagrant 1.1, try the following syntax before downgrading. It turns out that the configuration format changed in that version of Vagrant.

Here’s the new, working code:

config.vm.network :forwarded_port, guest: 80, host: 8080

Make sure to check out the rest of the official Vagrant documentation; there are too many tweaks and updates to list here.

Guard LiveReload and CSS @import

Let me preface this post with the following statement: Don’t use @import for CSS files! It blocks parallel downloading so your browser has to wait for the CSS files to load before moving on.

Unfortunately, sometimes you are stuck using @import and if you are also using Guard::LiveReload you will notice that your browser is not using the updated stylesheets when you save them.

To fix this, simply set the “apply_css_live” option to false in your Guardfile:

guard 'livereload', :apply_css_live => false do
  watch(%r{.+\.php$})
  watch(%r{sites/all/themes/rok3/css/*})
  watch(%r{sites/all/themes/rok3/images/*})
end

Guard::LiveReload will now do a full refresh of the browser instead of applying the CSS live. It’s a little clunkier than when you are using CSS from link tags but at least you don’t have to hit F5.

Analytics Interceptor – Chrome Extension

Analytics Interceptor By Robert Kaufmann

What is it?

Analytics Interceptor is an extension for Google Chrome which allows you to block and monitor the tracking beacons being sent by various scripts on the site you are visiting.

AI is actually a solution to two separate problems I encountered on the same day. Firstly, I needed to move beyond unit testing on a Google Analytics wrapper I’m developing and I was sick of reading debug statements in the console. Secondly, I was involved in a discussion with a friend who was worried about polluting the GA data on a site which ‘couldn’t’ filter his IP from their profile.

Why use it?

It turns you into a freaking ninja!!! Is there a better reason for anything? You can lurk in the shadows testing out all sorts of analytics functionality on a website without skewing the information.

Currently, Analytics Interceptor only grabs the beacons for Google Analytics, Hubspot Page Tracking, and Google’s ad networks. Future releases should also track Piwik, Open Web Analytics, Omniture, and others.

Getting Started

To get started, just open the popup and click the red / green button to turn AI on for the current tab. The number of captured beacons is displayed on the menu bar icon, and the beacon URLs, along with an easy-to-read breakdown of their parameters, are shown in the popup window.

NOTE: Interceptor stopped working with newer versions of Chrome so I’ve temporarily removed it until I can release an updated version.

Download from GitHub