How to sync Netpicker’s local Git repository for config backups with GitHub

Overview

This article explains how to sync data from a local Git repository inside a Docker container to a remote Git repository (e.g., GitHub), using a script executed from the host system. It also covers how to handle Git identity configuration issues and automate the sync process with a cron job.

Caution: Handling Sensitive Network Configuration Files

Important: Network configuration files often contain sensitive information, such as passwords, IP addresses, and network topology details. Before transferring these files to a remote Git repository, ensure the following:

  • Repository Access: Verify that the repository is private and that only authorized personnel have access to it.
  • Use of .gitignore: Exclude files containing highly sensitive information by using a .gitignore file (a minimal example is shown below).
  • Repository Location: Double-check that the files are being transferred to the correct repository to avoid accidental exposure.
  • Data Encryption: Consider encrypting sensitive parts of the configuration files if possible before committing them to the repository.

Always review your repository contents carefully before pushing them to ensure sensitive data is handled appropriately.
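
For example, a minimal .gitignore in the root of the repository could exclude files that should never leave the local system. The patterns below are purely illustrative and are not Netpicker defaults; adjust them to the naming of your own configuration backups:

# Example .gitignore entries (illustrative patterns only)
*.key
*.pem
secrets/
*credentials*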

Prerequisites

  • Access to the Netpicker api container.
  • Access to a GitHub account.
  • A personal access token (PAT) from GitHub with the necessary permissions to push to the target repository (a quick way to test the token is shown after this list).
  • Root access on the host system to run the script.
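
As an optional sanity check before wiring the token into the script, you can confirm that GitHub accepts it for the target repository. The command below only lists remote references and changes nothing; the username, token, and repository name are placeholders you must replace:

# List remote branches; this succeeds only if the token can at least read the repository
git ls-remote https://<YOUR_USERNAME>:<YOUR_ACCESS_TOKEN>@github.com/<YOUR_USERNAME>/<YOUR_REPOSITORY>.git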

Step 1: Create and Run the Sync Script

1.1 Create the Sync Script

The following script will allow you to sync a local Git repository inside the Docker container to a remote Git repository, ensuring Git user identity is properly configured for committing changes.

Shell Script for Syncing Data:

#!/bin/bash

# Re-run this script as root if it was not started as root
if [ "$(id -u)" -ne 0 ]; then
    echo "Re-running as root..."
    exec sudo "$0" "$@"
fi

# Define the container name
CONTAINER_NAME="api"

# Define the Git repository URL for the remote
REMOTE_GIT_URL="https://<YOUR_USERNAME>:<YOUR_ACCESS_TOKEN>@github.com/<YOUR_USERNAME>/<YOUR_REPOSITORY>.git"

# Get the path to the mounted Docker volume
VOLUME_MOUNT_PATH=$(docker volume inspect -f '{{.Mountpoint}}' $(docker inspect "$CONTAINER_NAME" | jq -r .[].HostConfig.Mounts[].Source | grep policy-data))/policy-workdir/configs/default/

# Set the Git user name and email locally for this repository
git -C "$VOLUME_MOUNT_PATH" config user.name "Your Name"
git -C "$VOLUME_MOUNT_PATH" config user.email "[email protected]"

# Add the remote Git repository if it's not already added
git -C "$VOLUME_MOUNT_PATH" remote add github $REMOTE_GIT_URL 2>/dev/null || echo "Remote already added."

# Add any new or changed files to the staging area
git -C "$VOLUME_MOUNT_PATH" add .

# Check the Git status
git -C "$VOLUME_MOUNT_PATH" status

# Commit the changes with a message (skipped if there is nothing new to commit)
git -C "$VOLUME_MOUNT_PATH" commit -m "Syncing configurations to remote Git repository." || echo "Nothing to commit."

# Push the changes to the remote repository
git -C "$VOLUME_MOUNT_PATH" push github master

1.2 Explanation of the Script:

  • Root Access Check: The script checks whether the current user is root. If not, it re-executes itself with sudo so the remaining commands run with root privileges.
  • VOLUME_MOUNT_PATH: The script identifies the mount point of the Docker volume that contains the Git repository.
  • Git Config: The script configures the Git user name and email for this specific repository inside the Docker container. This prevents identity-related errors when committing changes (you can verify the result with the quick check after this list).
  • Remote Git URL: Replace the placeholder with your GitHub repository URL, including your GitHub username and personal access token.
  • Git Commands: The script adds any new or modified files to the repository, commits the changes, and pushes them to the remote repository.
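
If you want to confirm the path resolution and identity configuration after the first run, a couple of read-only commands can be executed on the host (VOLUME_MOUNT_PATH stands for the path the script computed):

# Confirm the resolved path is a Git working tree
git -C "$VOLUME_MOUNT_PATH" rev-parse --is-inside-work-tree

# Show the identity that will be used for commits in this repository
git -C "$VOLUME_MOUNT_PATH" config user.name
git -C "$VOLUME_MOUNT_PATH" config user.email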

1.3 Running the Script Manually:

  1. Save the script to a file on your host system, e.g., sync_git.sh.

  2. Make the script executable:

    chmod +x sync_git.sh

  3. Run the script manually:

    ./sync_git.sh

  4. The script will sync the local Git repository inside the Docker container with the remote repository while ensuring the Git identity is correctly configured for commits. A quick way to confirm the run succeeded is shown after this list.
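
As a rough post-run check (again using the VOLUME_MOUNT_PATH the script computed), you can confirm that the github remote is configured and see the most recent local commit before verifying it on GitHub:

# List the configured remotes (should include "github")
git -C "$VOLUME_MOUNT_PATH" remote -v

# Show the most recent local commit
git -C "$VOLUME_MOUNT_PATH" log -1 --oneline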

Step 2: Automating the Sync Process

To ensure that the sync process occurs periodically, you can schedule the script to run automatically using cron.

2.1 Create a Cron Job

  1. Open the crontab editor on your host system:

    crontab -e

  2. Add a line to run the script at a regular interval. For example, to run the script every day at midnight, add the following line:

    0 0 * * * /path/to/your/sync_git.sh

  3. Save and exit the crontab editor.

This will automatically run the script every day at midnight to sync the data from the local Git repository inside the Docker container to the remote repository.
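
Because the script requires root privileges, it is usually simplest to place the entry in root's crontab (sudo crontab -e) so it does not depend on an interactive sudo prompt. It can also help to capture the script's output for troubleshooting; the entry below is a sketch that appends all output to a log file (the log path is only an example):

# Run the sync daily at midnight and append stdout/stderr to a log file (example path)
0 0 * * * /path/to/your/sync_git.sh >> /var/log/netpicker_git_sync.log 2>&1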

2.2 Verify the Cron Job:

To verify that the cron job has been set up correctly, list your cron jobs:

crontab -l

You should see the cron job listed, confirming that the script will run as scheduled.

Conclusion

By following this guide, you can sync data from a local Git repository inside a Docker container to a remote Git repository, ensuring that the Git user identity is properly configured to avoid commit errors. Additionally, you can automate this sync process using cron to ensure that data is regularly backed up to the remote repository.

Important: Always secure your personal access tokens (PATs) by using environment variables or a secrets management tool to prevent accidental exposure of credentials.
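
As one possible approach, the script could read the token from a file that only root can access instead of hard-coding it in REMOTE_GIT_URL. The file path below is an assumption, not a Netpicker default:

# Read the GitHub token from a root-only file (example path)
# e.g. create it once with: touch /root/.netpicker_github_token && chmod 600 /root/.netpicker_github_token
GITHUB_TOKEN=$(cat /root/.netpicker_github_token)
REMOTE_GIT_URL="https://<YOUR_USERNAME>:${GITHUB_TOKEN}@github.com/<YOUR_USERNAME>/<YOUR_REPOSITORY>.git"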

Additional Considerations:

  • Error Logging: Consider adding error handling and logging to the script to track any issues with the sync process (a minimal sketch is shown after this list).
  • Security: Ensure that only authorized users have access to the Git repository and the scripts, especially when working with sensitive configuration files.
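
A minimal way to add logging, assuming the script layout shown in Step 1, is to make the script abort on the first failed command and write timestamped messages to a log file. The log path is only an example:

#!/bin/bash
# Abort on errors, unset variables, and failed pipeline stages
set -euo pipefail

# Example log location; adjust as needed
LOG_FILE=/var/log/netpicker_git_sync.log

log() {
    echo "$(date '+%Y-%m-%d %H:%M:%S') $*" >> "$LOG_FILE"
}

log "Starting config sync"
# ... existing sync commands from Step 1 go here ...
log "Sync finished"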

By automating this sync process, you ensure that your local Git repository inside the Docker container remains up to date with the remote repository without requiring manual intervention.