
How FreeNAS and WP-CLI Grew My Interest in Linux and Automation

April 6, 2020 by Aaron Weiss

Last year, I built a FreeNAS server. Initially, it was only meant to store my computer backups and house my music and videos.

However, doing it right meant running commands in the shell, mostly to test the hard drives before I began storing files on them. I found an excellent resource, but I didn’t know what any of the commands meant. I executed them and waited until they were done.

The same was true for the Bash scripts that automate system configuration backups, reports, and notifications.

It wasn’t until I stumbled across some YouTube videos on running an Ubuntu server to host your own websites that I finally tested the virtual machine waters FreeNAS offered. I installed Ubuntu 18.04 Server LTS on a VM and learned a little at a time. The idea that I could learn a new operating system without buying another computer floored me.

Setting Goals

With VMs, CLI, and some basic web server understanding under my belt, I was ready to take a leap and move aaronweiss.me to a Digital Ocean server, but with the following goals:

  1. Separate WordPress Environments:
    • Development (DEV): Any new plugins, theme enhancements, or other changes that would affect the WordPress installation or how the software worked would be developed and tested on this installation. Plugin, theme, and core updates would also be completed and tested on this server.
    • Quality Assurance (QA): This environment was meant to test any changes made in the DEV environment as if it were a functional website. No changes would be made to this environment except common WordPress functions such as adding and managing posts and pages.
    • Production (PROD): This would be the live website visible to the public. Like QA, major changes would not be made in this environment.
  2. Automated Deployment Scripts: Deploy changes from DEV to QA, and then from QA to PROD.
  3. Maintenance Scripts: Create a script to check for security vulnerabilities, clean up temporary files, back up the site, optimize the database, and compress images in all three environments.

Meeting these goals meant I could host, develop, and maintain my website using a secure approach, with plenty of ways to get back up to speed quickly if something were to happen.

Additional Achievements Unlocked

Once I achieved these goals, I was hooked on what else I could do. My next set of goals was:

  • Create an automated DigitalOcean snapshot script. DigitalOcean has a backup option, but it only runs once per week. That didn’t fly with me, so I wrote DOCTL Remote Snapshots as a way to have some control over how often and how many snapshots would be created.
  • Learn Git – I’ve had some Git exposure through Microsoft Team Foundation Server at work. However, it was time to really learn Git. I combined this with my DOCTL Remote Snapshot script and now have a published repository.

Next Up:

  • Create a website monitoring script. I don’t need server uptime, I need to know website uptime. I want to know that my website can fully load and perform its basic tasks throughout the day.
  • Build a Raspberry Pi and install:
    • Pi-hole. Pi-hole is a free, open-source ad blocker.
    • NUT (Network UPS Tools). The goal is a script that monitors two computers from the Raspberry Pi and shuts them down gracefully using one uninterruptible power supply (UPS). I currently have two UPSs, one for my primary computer and one for my FreeNAS. The primary one can handle up to 850 watts, which is enough to cover all my devices, but it only has one port for monitoring the primary device. Ideally, NUT will allow monitoring over Ethernet and can handle the shutdown of both machines.
    • Additionally, these two projects feed my yearning to build with and learn the Raspberry Pi.

These are some short-term goals that I think are attainable in the near future.

Filed Under: Website Administration Tagged With: Digital Ocean, DOCTL, linux, ubuntu, virtual machine, wordpress

Automated DigitalOcean Snapshots with DOCTL

December 22, 2019 by Aaron Weiss

DigitalOcean snapshots are a blessing if you’re clumsy like me. They’ve allowed me to recover from my mistakes, and even from a hacking incident.

However, I’ve been disappointed with one aspect of DigitalOcean: its Droplet backup plans only create one backup per week, and you cannot schedule them yourself. There are also snapshots, which can be created ad hoc through the dashboard, but that’s no way to live.

I discovered DigitalOcean has its own command-line interface (CLI) called DOCTL, which lets you access your DigitalOcean account and Droplets remotely from a Linux machine.

After learning about this, I immediately wanted to leverage it with the following goals:

  1. Shut down the server
  2. Take a snapshot, since snapshotting a powered-off Droplet is safer and reduces the chance of corrupted files
  3. Reboot the server
  4. Once the server is back on and live, delete the oldest snapshot if there are more than a certain number

This keeps my server lean and, counting DigitalOcean’s weekly backup plan, gives me two backups a week, retained for a maximum of four weeks.
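Step 4, the pruning step, boils down to “sort by date, keep the newest N, delete the rest.” Here is a self-contained sketch of that logic using made-up snapshot IDs and dates; in the real script, the list would come from `doctl compute snapshot list` instead.

```shell
# Keep the newest $numretain snapshots; print the IDs to delete.
# The snapshot list below is fabricated for illustration; the real
# script would feed in `doctl compute snapshot list` output instead.
numretain=3
snapshots="111 2020-03-01
222 2020-03-04
333 2020-03-08
444 2020-03-11
555 2020-03-15"

# Sort oldest-first by the date column, drop the newest $numretain,
# and print the IDs of what's left (the deletion candidates).
echo "$snapshots" | sort -k2 | head -n -"$numretain" | awk '{print $1}'
```

With the sample data above, only the two oldest IDs (111 and 222) are printed; each would then be passed to `doctl compute snapshot delete`.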

Table of Contents

  • Introducing DOCTL Remote Snapshot
  • Installation and Authentication
    • Install DOCTL on a separate Linux installation
    • Obtain the DigitalOcean API Key
    • Authenticate Your Account
  • The DOCTL Remote Snapshot Script
    • Configure your Snapshot
    • How to Execute the Script
    • Cronjob
  • Conclusion

Introducing DOCTL Remote Snapshot

The DOCTL Remote Snapshot script I’ve created is among several firsts for me:

  1. Learning Git and using GitHub
  2. Using DOCTL
  3. Publishing and maintaining a public repository

I’m proud of this script, and I’ll continue to improve upon it. Without further ado:

Installation and Authentication

Install DOCTL on a separate Linux installation

Since the script shuts the Droplet down to prevent any corruption in our DigitalOcean snapshots, we’ll need a separate machine to make the remote calls and schedule the script via cron. If you ran it on the Droplet itself, the server would shut itself off and stop there: with the server off, the rest of the script cannot be executed.

As an example, I have a separate Ubuntu virtual machine (VM) running on my FreeNAS server that I set up specifically for cronjobs that execute scripts for remote services such as this one.

If you’re performing a fresh Ubuntu 18.04 LTS Server install, you can opt to have DOCTL installed alongside the server from the get-go. Otherwise, you’ll need to follow the GitHub documentation to install it. There is also this super awesome community-written guide.

Obtain the DigitalOcean API Key

This step comes first because the key is required to connect your script to your DigitalOcean account.

  1. In your DigitalOcean dashboard, visit the API page: https://cloud.digitalocean.com/settings/api/tokens?i=74b08a
  2. Generate a new token
  3. Enter a name for the token
  4. Copy the new token

Authenticate Your Account

On your remote server, run the following command:
sudo doctl auth init

Then you’ll be prompted to enter your key from the first step. Once that is complete, you’re ready to use DOCTL on your server.

The DOCTL Remote Snapshot Script

Next, you’ll want to run the following command in the directory where you’d like the script to live.
git clone https://github.com/aaronmweiss/DOCTL-Remote-Snapshot.git

This will clone the GitHub repository into a directory named DOCTL-Remote-Snapshot.

Configure your Snapshot

You’ll then want to edit the dodroplet.config file and supply the following variables:

  • dropletid= – Your Droplet’s ID. If you do not know it, log into your DigitalOcean account, click on the Droplet, and the URL will contain the ID after the /droplets/ path, like so: https://cloud.digitalocean.com/droplets/XXXXXXXXX/graphs?i=78109b&period=hour. The “XXXXXXXXX” in the URL string is your Droplet’s ID.
  • numretain= – The number of snapshots you’d like to keep, as a positive integer.
  • recipient_email= – The email address that should receive completion notifications.
  • snap_name_append= – Optional text appended to the end of the snapshot’s name.

With the configuration file ready to go, you’re now ready to remotely execute DigitalOcean snapshots.

How to Execute the Script

Now it’s time to run the script. To be on the safe side, let’s first make sure it can be executed:

sudo chmod +x auto_snapshot.bash

Then run the script:

sudo bash auto_snapshot.bash

Give this some time to run. Once the Droplet is powered off, the script creates the snapshot, which is the longest part of the process. According to DigitalOcean, “Snapshots can take up to 1 minute per GB of data used by your Droplet,” although I’ve found it can take much longer.

Once the snapshot is complete, the Droplet is powered on again. Then any snapshots beyond the value in numretain are deleted. You can use the -r flag to bypass snapshot deletions.

After this is complete, a notification is sent to the email address supplied in the dodroplet.config file.

Cronjob

This works best as a cronjob. You can set one up by running sudo crontab -e and entering something similar to:

0 1 * * 3 /bin/bash /home/$user/autoscripts/auto_snapshot.bash

Where $user is your username on your Linux machine; adjust the path if you keep the script elsewhere. This example runs the script at 1:00 AM every Wednesday. Note that a * in the minute field would run the script every minute of that hour, which is not what you want.

You can use Crontab Generator to generate the cronjob command for you.
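If you want the two-snapshots-a-week cadence mentioned earlier, a schedule like the following would do it (the path is just an example):

```
# Run the snapshot script at 1:00 AM every Sunday (0) and Wednesday (3),
# i.e. twice a week. Adjust the path to where you cloned the repository.
0 1 * * 0,3 /bin/bash /home/$user/autoscripts/auto_snapshot.bash
```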

Conclusion

That’s it. This is my first published Bash script and GitHub repository. I’m extremely proud of it, even though it’s rather simple: it fills a need of mine that wasn’t readily met elsewhere.

It is my hope that you’re able to use this to automate your DigitalOcean snapshots and ensure your Droplets are safe so you can keep building on your projects. Feel free to fork it, contribute, and comment on GitHub.

This article was updated on March 10, 2020.

Filed Under: Projects, Website Administration Tagged With: bash, Digital Ocean, digitalocean, DOCTL, git, github, linux, snapshots, web server

Plugin Update Failed: How To Fix WordPress Plugin and Theme Permission Errors

December 6, 2019 by Aaron Weiss

Recently, when attempting to update some plugins and themes on my WordPress installation, I came across two errors:

Failure Updating Plugins

An error occurred while updating <plugin name>

Update Failed: The update cannot be installed because we will be unable to copy some files. This is usually due to inconsistent file permissions.

And…

Failure Deleting Plugins

Plugin could not be deleted due to an error: Could not fully remove the plugin(s) my-plugin/my-plugin.php


The root of the problem is the same: permissions. I’ve found two ways to correct this if your WordPress website is hosted on a Linux server and you have shell access.

The error and solution are the same for WordPress themes, too.

  1. Update the plugins or themes using WP-CLI
  2. Run a Linux command

Update Plugins or Themes with WP-CLI:

WP-CLI is a command-line interface that lets you complete WordPress tasks and functions from the Linux command line. This can be faster than performing the same tasks within the WordPress Dashboard.

Built into WP-CLI is the ability to update plugins, themes, and even the core WordPress installation from the command line. This circumvents the problem because there is no interaction through a browser, and therefore no need for the web server (usually Apache) to get involved.

  1. Install WP-CLI or use a host that has it installed by default like a2 Hosting.
  2. Run wp plugin update or wp theme update.

However, this is not a permanent solution. Still, learning WP-CLI and writing scripts around it can speed up routine WordPress maintenance.
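As a sketch of what such a maintenance script might run, here are the bulk-update commands. The `wp` function below is a stand-in stub so the example runs anywhere; on a real server, the genuine WP-CLI binary takes its place, invoked from the WordPress root.

```shell
# Sketch of a WP-CLI bulk-update routine. `wp` is stubbed with a shell
# function so the sketch is self-contained; on a real server the actual
# wp-cli binary replaces it and performs the updates for real.
wp() { echo "would run: wp $*"; }   # stand-in for the real WP-CLI

wp plugin update --all   # update every installed plugin
wp theme update --all    # update every installed theme
wp core update           # update WordPress core itself
```

The `--all` flag is WP-CLI’s way of updating every installed plugin or theme at once, rather than naming each one.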

Run a Linux Command

This is the best option because we’re actually fixing the problem.

sudo chown -R www-data /path/to/plugin

Replace “/path/to/plugin” with the plugin’s directory, usually something along the lines of /var/www/html/wp-content/plugins/ewww-image-optimizer, which was the directory in the problem I faced.

This gives ownership of the directory back to the web server user (www-data, which Apache runs as on Debian and Ubuntu), so that when you update the plugin again through WordPress’s update features, it should work permanently.
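To confirm the chown took effect, you can check a directory’s owner with stat. The scratch directory below exists only so the example runs anywhere; on a real server, you would point stat at the plugin path instead.

```shell
# Print the owning user of a directory. On the server you'd run
# `stat -c '%U' /var/www/html/wp-content/plugins/<plugin>` and expect
# to see www-data after the chown; a scratch dir is used here.
d="$(mktemp -d)"
stat -c '%U' "$d"   # prints the user who created the directory
```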

Conclusion

These are two great ways to correct the ownership issues behind WordPress plugin and theme update failures. While option 2 is clearly the permanent solution, automating updates with WP-CLI as in option 1 can still save time maintaining your WordPress website.

Filed Under: WordPress Tagged With: cli, command line interface, linux, linux shell, wordpress, wordpress errors, wp-cli
