Create a cron job inside a Docker container to execute a function in a run.js file – Docker

by Ali Hasan
Tags: boot2docker, cron, docker-compose, node.js, ubuntu

The Problem:

A Dockerfile creates an Ubuntu container with Node.js installed and clones a project into it. The project contains a run.js file that calls a function in user.js, which logs a message to the console. A cron job is set up to execute run.js every minute, yet the console log message never appears on the terminal.

The goal is to ensure that the cron job executes the run.js file and prints the console log message on the terminal every minute within the Docker container.

The Solutions:

Solution 1:

Create a cron job by adding the following line to the entrypoint.sh file:

echo "*/2 * * * * cd /app/api && /usr/bin/node -e \"require('/app/api/run.js').users()\" >> /var/logfile_new.log 2>&1" | tee /tmp/cron_job.txt

This line creates a cron job that runs every 2 minutes (change */2 to */1 to match the every-minute goal) and executes the users() function from the run.js file. The output of the cron job is redirected to the /var/logfile_new.log file.

To view the cron job logs, you can use the tail command:

tail -f /var/logfile_new.log
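Note that writing the line to /tmp/cron_job.txt does not by itself schedule anything. A minimal entrypoint.sh sketch (assuming the cron package is installed in the image) would also register the file with crontab and keep the cron daemon in the foreground:

```shell
#!/bin/sh
# entrypoint.sh -- illustrative sketch; assumes cron and node exist in the image
set -e

# tee writes the schedule line to the file AND echoes it to stdout,
# so it also shows up in `docker logs` at startup
echo "*/2 * * * * cd /app/api && /usr/bin/node -e \"require('/app/api/run.js').users()\" >> /var/logfile_new.log 2>&1" \
  | tee /tmp/cron_job.txt

# Register the schedule with cron, then run the daemon in the
# foreground so it remains the container's main process
crontab /tmp/cron_job.txt
exec cron -f
```

Running cron with -f (foreground) matters here: a daemonizing cron would fork and let the entrypoint exit, stopping the container.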

Solution 2: Use a separate container for the cron job

Normally, a Docker container runs a single process. If you need multiple processes, you should create multiple containers. Since the cron daemon is a separate process, it requires its own container.

To set this up, add the following to your docker-compose.yml file:

version: '3.8'
services:
  node-backend:
    build: .
    ports:
      - 3002:3002
  cron-run:
    build: .
    command: cron -f

This will create two containers: one for your Node application and one for the cron daemon.
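With the service names above, one way to start both containers and watch the cron container's output is the following sketch (assuming the Docker Compose v2 CLI):

```
docker compose up --build -d      # build the image and start both services
docker compose logs -f cron-run   # follow stdout/stderr of the cron container
```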

To improve efficiency, build your application only once in the Dockerfile. Move most of the code from the entrypoint script into the Dockerfile. Avoid running git inside Docker; add the docker-compose.yml and Dockerfile to your application’s source repository and build whatever is currently checked out.

Here’s an example Dockerfile:

# node:16-slim is based on Debian
FROM node:16-slim

# Get the cron daemon
RUN apt-get update \
 && DEBIAN_FRONTEND=noninteractive \
    apt-get install --no-install-recommends --assume-yes \
      cron

# Install the application (boilerplate Node app setup)
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci
COPY ./ ./
# RUN npm build

# Set up the cron job
RUN echo '*/1 * * * * node /app/api/run.js' | crontab -

# By default, run the application
CMD npm start

Now, there is only one process per container. We don’t launch background processes or try to "start services." We also don’t need to "keep the container alive." The two containers will remain active as long as their processes are running.
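One related caveat, which goes beyond the original answer but is a common Docker pattern: cron mails or discards job output by default, so even a correctly installed job will not show up in docker logs. A frequent workaround is a crontab entry that redirects the job's output to the stdout of PID 1, the container's main process:

```
# Hypothetical crontab entry: send the job's output to the main
# process's stdout so it appears in `docker logs`
*/1 * * * * node /app/api/run.js > /proc/1/fd/1 2>&1
```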

Solution 3: Using tee Command to Print Output in Console and File

In the entrypoint.sh file, replace this line:

echo "*/1 * * * * node /app/api/run.js" > /tmp/cron_job.txt

with this one:

echo "*/1 * * * * node /app/api/run.js" | tee /tmp/cron_job.txt

The tee command writes its input to a file and to standard output at the same time. Here, the cron line is saved to /tmp/cron_job.txt and also echoed to the console, and therefore to the container log.

Keep in mind that tee only echoes the schedule line when the entrypoint runs: it confirms in the container log that the job was written, but the job's own output still goes wherever the cron line itself redirects it.
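The difference between the two redirections can be checked directly in a shell (paths here are illustrative):

```shell
# '>' sends the text only to the file; nothing is printed to the console
echo "*/1 * * * * node /app/api/run.js" > /tmp/cron_job_redirect.txt

# 'tee' writes the same text to the file AND to standard output
echo "*/1 * * * * node /app/api/run.js" | tee /tmp/cron_job.txt
```

Only the second command produces console output, which is why tee makes the cron line visible in the container log.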

Q&A

How can I debug a cron job that silently fails to print logs to the console?

Use the tee command to send output to both the console and a log file.

What is the correct way to set up a cron job in a Docker container?

Use a separate container for the cron daemon and build the application only once in the Dockerfile.

Why does the cron job's output not appear on the console?

The output is redirected to a file by the ">" operator, so nothing reaches standard output.

Video Explanation:

The following video, titled "Django-Crontab - Collecting Data with Scheduled Functions in ...", provides additional insights and in-depth exploration related to the topics discussed in this post.


In this video, we will use django-crontab package to collect cryptocurrency data from an API every minute, and store that data in the ...