
```yaml
environment:
  name: production

artifacts:
  paths:
    - ftp_transfer.log
    - ftp_error.log
    - ftp_detailed_transfer.log

only:
  - master
```


## Deployment Strategy Explained

1. Installing Dependencies: We update the system and install lftp, a command-line file transfer program that supports FTP, FTPS, and other protocols. This tool is essential for our file transfer operations.
2. Executing the FTP Transfer: Using lftp, we set up a mirror command that syncs our local ./php/ directory to the remote FTP destination. We included several options like --reverse for uploading, --delete to remove obsolete files, and --parallel=10 to accelerate the process by running multiple transfers simultaneously. Crucially, we exclude Git-related files and logs from the transfer to keep the remote site clean.
 * We use `lftp`, an enhanced FTP client that, unlike plain `ftp`, can delete files on the destination that do not exist on the source.
 * We use the YAML block scalar `|` (pipe character) to mark the start of a multi-line script, and line continuation with `\` to split the ftp command across several lines, in order to improve readability.
 * We set `xfer:log` and `xfer:log-file` so that lftp produces a detailed log file, which we later print to the console.
 * `mirror -v ./php/ $FTP_DESTINATION --reverse`: mirrors from local to remote (without `--reverse`, mirror copies from remote to local).
 * We use `--exclude-glob` to keep unwanted files, such as Git metadata and logs, out of the transfer.
 * `> ftp_transfer.log 2> ftp_error.log` redirects standard output to ftp_transfer.log and standard error to ftp_error.log.
 * `|| LFTP_EXIT_CODE=$?` captures the exit status; the special variable `$?` holds the exit code of the last executed command in Unix-like operating systems, which in this case is `lftp`. We prevent `lftp` from propagating a non-zero exit code directly, because that would stop the pipeline script before we could display and record the result. So, after printing everything, we check the captured exit code and, if `lftp` exited with an error, we exit the script with an error.
3. Logging the Process: We redirect all output to ftp_transfer.log and configure a separate log, ftp_detailed_transfer.log, for in-depth transfer details. The detailed log is only created when files are actually transferred or deleted, which is why we check that it exists before printing its content to the console: running `cat` on a non-existent file would throw an error.
4. Artifact Handling: We declare ftp_transfer.log, ftp_error.log, and ftp_detailed_transfer.log as artifacts, which means GitLab saves these logs for later viewing, aiding in debugging if needed. The sketch below puts all of these pieces together.
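For reference, here is a minimal sketch of what the complete deploy job could look like. The job name, the base image, and the FTP_HOST, FTP_USERNAME, and FTP_PASSWORD variable names are assumptions for illustration; the mirror options, log files, and exit-code handling follow the steps described above.

```yaml
deploy_production:            # hypothetical job name
  image: ubuntu:latest        # assumed image; any apt-based image works
  stage: deploy
  only:
    - master
  script:
    - apt-get update -qq && apt-get install -y -qq lftp
    - |
      # FTP_HOST, FTP_USERNAME and FTP_PASSWORD are assumed CI/CD variable names;
      # FTP_DESTINATION is the remote path referenced above.
      lftp -c "set xfer:log yes; \
        set xfer:log-file ftp_detailed_transfer.log; \
        open -u $FTP_USERNAME,$FTP_PASSWORD $FTP_HOST; \
        mirror -v ./php/ $FTP_DESTINATION --reverse --delete --parallel=10 \
          --exclude-glob .git* --exclude-glob *.log" \
        > ftp_transfer.log 2> ftp_error.log || LFTP_EXIT_CODE=$?
      # Print the logs whether the transfer succeeded or failed.
      cat ftp_transfer.log
      cat ftp_error.log
      # The detailed log only exists when files were transferred or deleted.
      if [ -f ftp_detailed_transfer.log ]; then cat ftp_detailed_transfer.log; fi
      # Fail the job now if lftp reported an error.
      exit "${LFTP_EXIT_CODE:-0}"
  environment:
    name: production
  artifacts:
    paths:
      - ftp_transfer.log
      - ftp_error.log
      - ftp_detailed_transfer.log
```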

## The Outcome

This pipeline significantly improved our deployment process, making it faster and more error-resistant. Automating the upload via FTP ensured that human errors were minimized, and using lftp provided robustness against connectivity issues. Most importantly, configuring the pipeline to run only on the master branch meant that our production environment was always in sync with our most stable codebase.

## Lessons Learned

 * Automation and Visibility: Automating repetitive tasks like deployments not only saves time but also reduces the chances of errors. Additionally, detailed logs provide visibility, which is key in maintaining operational stability.
 * Security Practices: Handling sensitive information securely in CI/CD pipelines is vital. We used environment variables for FTP credentials and ensured they were never exposed.
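As a concrete illustration, the credentials never live in the repository: they can be defined as masked CI/CD variables in the project settings (Settings > CI/CD > Variables) and only referenced by name in the pipeline. The variable names below are, again, assumptions.

```yaml
deploy_production:
  script:
    # FTP_USERNAME, FTP_PASSWORD and FTP_HOST are defined as masked variables in
    # GitLab, so their values appear neither in the repository nor in the job log.
    - lftp -c "open -u $FTP_USERNAME,$FTP_PASSWORD $FTP_HOST; mirror -v ./php/ $FTP_DESTINATION --reverse"
```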