Tag Archives: backups

What’s Up, Home? – How to secure your (home) monitoring

Post Syndicated from Janne Pikkarainen original https://blog.zabbix.com/whats-up-home-how-to-secure-your-home-monitoring/26241/

As my home monitoring experiment has become such a celebrity and as it has so much going on, I’m trying to make sure I won’t ever lose its configuration and data, no matter what happens. Here are some tips and reminders for you, too.

In the IT world, so much can go wrong. Hardware can die, files can become corrupt, malware can hit you, hackers can breach your systems, buggy software updates can cause havoc, you can fat-finger some commands or click in the wrong place and remove data… the list is endless. Here are some ways I’m attempting to protect my home monitoring environment.

Keep software updated

First things first. In today’s malicious world, it’s mandatory that you keep your software updated. With Linux and Zabbix, you don’t have an excuse to skip the updates: updating your systems is fast and trouble-free. In a small environment like a home lab, even major Zabbix upgrades are quick, because the database is small and the schema migrations that can take hours in a corporate environment go through in minutes, if not faster.

Remember to backup

Keep your backups in good shape. In my case, I take backups with BackupPC and monitor them with Zabbix.

But I don’t trust one environment. What if my BackupPC says Kaboom? A nightly cron job also copies backup archives from my Raspberry Pi to my Mac, which in turn mirrors the backups to my iCloud. And, there’s one more cloud service I’m using for all my backups but not mentioning it here by name.
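As a small sketch of that kind of nightly copy job (the host name and paths below are made up for illustration), the crontab entry on the Raspberry Pi could look something like this:

# Copy the BackupPC archives to the Mac every night at 03:00
0 3 * * * rsync -a /var/lib/backuppc/archives/ janne@mac-mini.local:/Users/janne/Backups/raspberrypi/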

Test your backups

Until you have verified that your backups actually restore, you do not have a working backup. Keep a virtual machine into which you can try to restore your backups and see if they work. Test them periodically, either manually or through an automated process.

In the case of Zabbix, you can make your primary Zabbix instance monitor your test environment and alert if the restored Zabbix frontend suddenly starts responding with something other than the regular login page, or if the restored database doesn’t come back with a query response you would expect.
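As a rough sketch of that idea (the host and item names are invented for illustration), a Zabbix agent item using the web.page.get key can fetch the restored frontend, and a trigger can fire when the word "Zabbix" no longer appears in the response:

Item key: web.page.get[restore-zabbix.lan,/index.php,80]
Trigger expression: find(/Restore Zabbix/web.page.get[restore-zabbix.lan,/index.php,80],,"like","Zabbix")=0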

Set up an HA cluster

As any hardware can die, it’s not a bad idea to set up an HA cluster. Last winter I was preparing for potential electricity blackouts here in Finland and set up my laptop as a secondary node for my Zabbix server. This setup has been working very well.
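For reference, the native Zabbix server high availability (available since Zabbix 6.0) only needs a couple of parameters in zabbix_server.conf on each node; the node names and addresses below are just placeholders:

# /etc/zabbix/zabbix_server.conf on the primary node
HANodeName=zabbix-primary
NodeAddress=192.168.1.10:10051

# /etc/zabbix/zabbix_server.conf on the standby laptop
HANodeName=zabbix-standby
NodeAddress=192.168.1.11:10051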

Use strong passwords

Even if it’s only your sandbox environment where you test new stuff, please remember to use strong passwords. An evil actor only needs one weakly protected entry point to attempt to breach more targets in your network.

Use ssh keys

Instead of a username + password combination, use ssh keys for ssh authentication. Keys are practically immune to brute-force attempts, and with some tuning a key can also be restricted to connect only from specific IPs and to run only specific commands. You know how in your ~/.ssh/authorized_keys the lines start with something like

ssh-rsa aZfgT12b(....

but if you modify it to be 

command="/usr/bin/rsync" ssh-rsa aZfgT12b(....

well, then only rsync would be allowed.

Or, for IP address limitation

from="123.123.123.123" ssh-rsa aZfgT12b(....

Of course, these can be combined:

from="123.123.123.123" command="/usr/bin/rsync" ssh-rsa aZfgT12b(....

Obviously, this grants you much more security than traditional logins.
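Once your keys are in place, you can go one step further and disable password logins completely in /etc/ssh/sshd_config (this is a generic OpenSSH setting, not anything Zabbix-specific; keep one working session open while testing it):

PasswordAuthentication no
PubkeyAuthentication yes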

Use HashiCorp Vault

OK, I admit I’m not doing this at home as it would be overkill for my few logins. But in a larger environment with strict security requirements, use HashiCorp Vault for protecting your credentials. Zabbix has native support for it.
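For the record, pointing the Zabbix server at Vault is mostly a matter of a few zabbix_server.conf parameters (supported in recent Zabbix versions; the URL, secret path and token below are placeholders):

# /etc/zabbix/zabbix_server.conf
VaultURL=https://vault.example.com:8200
VaultDBPath=secret/zabbix/database
VaultToken=<your-vault-token>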

Monitor your logs

Set up a centralized log server — it can be your Zabbix server environment, too — and make sure you monitor the logs. My Zabbix receives all my logs, but it wouldn’t be a bad idea to use more advanced log monitoring tools as well. Since I already have ElastiFlow running at home, at some point I might start utilizing Elasticsearch for the logs, but I’m not doing that much yet.
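If you go the Zabbix route for log monitoring, a minimal sketch is a Zabbix agent (active) item that watches the authentication log for failed logins; the log path varies per distribution:

log[/var/log/auth.log,"Failed password",,,skip]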

Use chkrootkit, AIDE, others

Tools like AIDE or chkrootkit can help you detect an intrusion. Set them to run in your cron and get alerted in case of any anomalies. Maybe I’ll one day integrate Zabbix with these tools.
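A minimal sketch of such a cron setup could be the following (the schedule is arbitrary and the paths may differ per distribution):

# /etc/cron.d/integrity-checks
0 4 * * * root /usr/sbin/chkrootkit -q 2>&1 | logger -t chkrootkit
30 4 * * * root /usr/bin/aide --check 2>&1 | logger -t aide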

Firewall your environment, use VPN

Don’t allow direct access from the Internet to your Zabbix, or your database, or anything really. In my case, my Asus router allows setting up OpenVPN connections, so that’s what I use. Whilst on the go, I just connect to OpenVPN on my phone and do whatever I need remotely through that.

Anything else?

Did I miss something? Let me know in the comments.

This post was originally published on the author’s page.

The post What’s Up, Home? – How to secure your (home) monitoring appeared first on Zabbix Blog.

Apple Is Finally Encrypting iCloud Backups

Post Syndicated from Bruce Schneier original https://www.schneier.com/blog/archives/2022/12/apple-is-finally-encrypting-icloud-backups.html

After way too many years, Apple is finally encrypting iCloud backups:

Based on a screenshot from Apple, these categories are covered when you flip on Advanced Data Protection: device backups, messages backups, iCloud Drive, Notes, Photos, Reminders, Safari bookmarks, Siri Shortcuts, Voice Memos, and Wallet Passes. Apple says the only “major” categories not covered by Advanced Data Protection are iCloud Mail, Contacts, and Calendar because “of the need to interoperate with the global email, contacts, and calendar systems,” according to its press release.

You can see the full list of data categories and what is protected under standard data protection, which is the default for your account, and Advanced Data Protection on Apple’s website.

With standard data protection, Apple holds the encryption keys for things that aren’t end-to-end encrypted, which means the company can help you recover that data if needed. Data that’s end-to-end encrypted can only be decrypted on “your trusted devices where you’re signed in with your Apple ID,” according to Apple, meaning that the company—or law enforcement or hackers—cannot access your data from Apple’s databases.

Note that this system doesn’t have the backdoor that was in Apple’s previous proposal, the one put there under the guise of detecting CSAM.

Apple says that it will roll the feature out worldwide by the end of next year. I wonder how China will react to this.

Backups to the rescue!

Post Syndicated from Nathan Liefting original https://blog.zabbix.com/backups-to-the-rescue/23442/

In this blog post, you will learn how to set up backups for your Zabbix environment. There’s a wide variety of different options when it comes to taking backups of our Zabbix environment, for us, it will just be a matter of choosing the right fit.

 

Introduction

Monitoring is an important part of our IT infrastructure, and when our monitoring isn’t working, even for a short period, we feel blind to what is going on with our different IT components. As such, taking backups of our Zabbix environment is an important part of running it in production: an issue that corrupts or even loses our data is always a possibility, and we should be prepared for it.

For Zabbix, there are a few different methods on how to take backups and it all starts at the database level. Both the Zabbix frontend as well as the Zabbix server write their data into the Zabbix database as we can see in the illustration below:

This means that both our configuration and all of our collected values are present in the same Zabbix database, and if we take a database backup, we back up (almost) everything we need. So, let’s start there and have a look at how we can make a database backup.

How to

MySQL backups

Let’s start with the most used variant of Zabbix databases: MySQL and its forks like MariaDB and Percona. All of them can easily be backed up using built-in functionality like the mysqldump command, and we can then use other industry standards to get things going. First, we have to understand the tables in our database, though. Most of the tables in your Zabbix environment contain configuration data and as such, they are all important to back up. There are a few tables that we need to consider, however, as they can contain gigabytes or even terabytes of data. These are the history, trends and events tables.

It is possible to omit these tables from your backup and make smaller, more manageable backups. To make the backup we can then start using tools like MySQL Dump:
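As a sketch (the database name, user and target paths are examples), a full dump and a configuration-only dump that skips the big data tables could look like this:

# Full backup, everything included
mysqldump -uzabbix -p zabbix | gzip > /mnt/backup/zabbix_full_$(date +%F).sql.gz

# Configuration-only backup, skipping the large history, trends and events tables
mysqldump -uzabbix -p zabbix \
  --ignore-table=zabbix.history --ignore-table=zabbix.history_uint \
  --ignore-table=zabbix.history_str --ignore-table=zabbix.history_text \
  --ignore-table=zabbix.history_log --ignore-table=zabbix.trends \
  --ignore-table=zabbix.trends_uint --ignore-table=zabbix.events \
  | gzip > /mnt/backup/zabbix_config_$(date +%F).sql.gz

Keep in mind that --ignore-table skips both the structure and the data of those tables, so on a restore they would have to be recreated from the Zabbix schema files.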

Once we have taken a backup, we can easily import that back into our environment using the MySQL Import command or simply using the cat command:
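Restoring is the same thing in reverse; a sketch with the example file from above:

# Import a plain SQL dump
mysql -uzabbix -p zabbix < zabbix_full_2023-01-01.sql

# Or pipe it straight from the gzipped file
zcat zabbix_full_2023-01-01.sql.gz | mysql -uzabbix -p zabbix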

Do not forget, taking and importing large backups can take a long time. This completely depends on your MySQL performance tuning settings as well as the underlying resources like CPU, memory and disk I/O. Also, make sure to check out the MySQL documentation:

MySQL Dump:  https://dev.mysql.com/doc/refman/8.0/en/mysqldump.html / https://mariadb.com/kb/en/making-backups-with-mysqldump/

MySQL Import: https://dev.mysql.com/doc/refman/8.0/en/mysqlimport.html / https://mariadb.com/kb/en/mysqlimport/

 

Alternatively, it’s also possible to create backups using tools like xtrabackup and mariabackup.

PostgreSQL backups

We can actually use the same kinds of methods for the PostgreSQL backups. Keep the required tables in mind and fire away with the built-in tools:
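A sketch with pg_dump, again with example names, where the configuration-only variant keeps the table structures but skips the data of the heavy tables:

# Full backup
pg_dump -U zabbix zabbix | gzip > /mnt/backup/zabbix_full_$(date +%F).sql.gz

# Configuration-only backup
pg_dump -U zabbix zabbix \
  --exclude-table-data=history --exclude-table-data=history_uint \
  --exclude-table-data=history_str --exclude-table-data=history_text \
  --exclude-table-data=history_log --exclude-table-data=trends \
  --exclude-table-data=trends_uint --exclude-table-data=events \
  | gzip > /mnt/backup/zabbix_config_$(date +%F).sql.gz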

 

Then we can restore it by loading the file into postgres:
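For example, restoring the dump from above into a freshly created, empty database:

zcat zabbix_full_2023-01-01.sql.gz | psql -U zabbix zabbix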

What about the configuration files?

Once we have a database backup, everything is backed up, right? Well, almost everything. With just a database backup we are quite safe, but (and this is oftentimes overlooked) there are a lot of configuration files and perhaps even custom scripts we need to take into account! There are three parts to this story – the Zabbix server, the Zabbix frontend, and the additional Zabbix components. All of them have their own set of configuration files and locations that are used for storing custom scripts.

The Zabbix frontend location and configuration files can be different, depending on the environment, as we have a few choices to make. Are we running Apache or Nginx? On what Linux distribution? All of these have to be considered when making configuration backups. In general, the locations for the configuration would be:

/etc/nginx/
/etc/httpd/
/etc/apache2/

There’s also a symlink to the Zabbix frontend configuration file located in /etc/zabbix/ but we will get to that one in a bit.

Then we have the Zabbix server itself, which keeps its configuration in /etc/zabbix/ and if we’re following best practices any script should be placed in /usr/lib/zabbix. So we need:

/etc/zabbix/
/usr/lib/zabbix

Let’s add them to the list and find a method to back up these files. Crontab is a built-in tool that we can use, but there are definitely other (perhaps better) solutions out there. Let’s add the following to cron:
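A sketch of such crontab entries (the destination folder /mnt/backup/config_files/ is just an example, and the find command is explained right below):

# Nightly archive of the Zabbix and web server configuration plus custom scripts
0 2 * * * tar -czf /mnt/backup/config_files/zabbix_config_$(date +\%F).tar.gz /etc/zabbix/ /usr/lib/zabbix/ /etc/nginx/
# Rotation: remove configuration archives older than 180 days
30 2 * * * find /mnt/backup/config_files/ -type f -mtime +180 -delete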

I also added a find command here, which will serve as our roll-over or rotation tool. It will find files older than 180 days and delete them from /mnt/backup/config_files/. Make sure to pick a good (network) folder to store these files, as it’s important to keep them safe. Feel free to change the number of days you’d like to store the files for.

What about the additional components like Zabbix proxy, Zabbix Java gateway and Zabbix web service (used for PDF reporting)? Well, these have configuration files as well. Make sure to run a backup on the devices running these additional components. As for Zabbix proxies – they use the same file locations as the Zabbix server (/etc/zabbix/ and /usr/lib/zabbix/).

For Zabbix Java gateway and Zabbix web service, we can omit the /usr/lib/zabbix/ folder.

Don’t forget the import/export files!

In general, database backups are slow to make, but also slow to import back, unless we exclude the history/trends from the backup. But even then, restoring an entire database simply because someone made an error on a single template is a hassle. Zabbix ships with built-in frontend export functionality, allowing us to export (and then import) entire parts of the configuration instantly! We can use this for a number of different parts of the configuration:

  • Hosts
  • Templates
  • Media types
  • Maps
  • Images
  • Host groups (API ONLY)
  • Template groups (API ONLY)

All of these are also available through the Zabbix API, so we can choose between manual configuration backups from the frontend and automated ones through the API. You could even manage and update your Zabbix configuration entirely from Git if you write the right scripts for it.

Frontend backups

To run an export from the frontend simply go to one of the supported sections like Configuration | Templates and select the export data format. When selecting multiple entities, keep in mind that they will all be exported to a single file.

We can then make our edits and import the files from the frontend as well.

For templates this will even result in a nice diff pop-up window, detailing all the changes, deletions and additions to the templates.

 

API backups

For the API, things get a little more complicated, as we need to select a mode of execution. Of course, it’s possible to do a curl command from the CLI or even use something like Postman:

Request body
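As a sketch (the template ID and the auth token are placeholders), a configuration.export request body posted to api_jsonrpc.php could look like this:

{
    "jsonrpc": "2.0",
    "method": "configuration.export",
    "params": {
        "options": {
            "templates": ["10050"]
        },
        "format": "yaml"
    },
    "auth": "<api-token>",
    "id": 1
}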

The response will then look something like this:
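It is a standard JSON-RPC reply where the result field holds the entire export as one string; a shortened sketch:

{
    "jsonrpc": "2.0",
    "result": "zabbix_export:\n  version: '6.0'\n  templates:\n  ...",
    "id": 1
}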

But this feature really starts to shine once we combine it with our own automation scripts. Use it wisely!

High availability

So, what about high availability? Isn’t that some form of a backup?

Well, yes and no. High availability is not an “IT backup” in the sense of making sure we can recover something that is broken. But it is a backup in the sense that if a Zabbix server instance fails, another one takes over for it. HA is somewhat out of scope for this blog post, but it’s still worth mentioning. There are several solutions to set up Zabbix as a full high availability cluster: for MySQL we can use a primary/primary setup, for the frontend we can use load balancing techniques like HAProxy, and for the Zabbix server we can use the built-in high availability method. Combine all of these and you’ll definitely be able to serve your every (production ready!) need.

Conclusion

To conclude, there are many options for taking backups of our Zabbix environment. It all starts at the database, and these backups are definitely vital to keep things safe in case of disaster. When making the backups, do not forget about the configuration files and custom scripts, as well as the frontend export option. Combining all of these solutions will safeguard our environment, and if that isn’t enough, do not forget about industry standards like snapshots, which safeguard our environment even further on multiple levels.

I hope you enjoyed reading this blog post. If you have any questions or need help configuring anything on your Zabbix setup, feel free to contact me and the team at Opensource ICT Solutions. We build a ton of cool integrations like this and much more!

Nathan Liefting

https://oicts.com


The post Backups to the rescue! appeared first on Zabbix Blog.