How do I Encrypt my Email?

A more astute question is: why encrypt email at all? We live in an age of the FBI cracking iPhones and other government agencies bullying big-data and global/mobile communication providers such as Apple, Google, and Microsoft, to name just a few. These cloud companies, and thousands like them, rolled over for years, handing over metadata from the phone records of private citizens and businesses, until Edward Snowden broke the news to the world in 2013 that the NSA had been collecting, storing, and disseminating these personal phone records pretty much at will. No one except librarians, to whom we owe a debt of gratitude, seemed to have any understanding of data privacy until the Snowden scandal broke.


The reason to keep encrypting your email and data is that security is a process, not an end-game destination, though having no conventional security in place can mean the end-game for your business’s reputation. It means securing your data privately, knowing where every copy lives, and not settling for the ambiguity of “the cloud” when you ask your provider where your data is really kept. Email encryption alone is not all it takes to ensure basic web and email security.

Staying out of any service whose storage solution is simply described as “in the cloud” is difficult now. Public and private data clouds such as AWS EC2, Google Drive, iCloud, and the like are fast becoming the only alternative. Just as Edward Snowden was one of tens of thousands of contractors with access to national security data, what’s to ensure that one of the thousands of system administrators at any of these cloud providers isn’t doing the same thing with your customer lists and competitive intelligence? How much can you really afford to trust “the cloud” anymore? Most of these huge providers have very little control or oversight of their system administrators, who are usually instructed to do “whatever it takes” to keep the boss upstairs and clueless where he belongs, and the government snoops fat and happy, with access to all your data, and out of his thinning hair.


So here’s the good news: the options for securing your email are greater than ever. I still like this thread from the venerable “Ask Leo” website, which provides one of the clearest overviews of your options for securing traditional email properly:

http://ask-leo.com/how_do_i_send_encrypted_email.html

As the article indicates, setting up proper security on the client side, often on multiple computers, is necessary to ensure everything remains encrypted between sender and recipient. This can be a lot to expect, especially if the person you are trying to exchange messages and documents with is not computer savvy. Social workers and therapists, for example, can keep communications with family members and clients private using a web-based, point-to-point encryption solution. While many good secure messaging solutions exist, only WordSecure Messaging offers a server-based solution that is not in the cloud. It can also be fitted with your logo and a personal login for you and your customers.
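For readers who prefer to do it themselves at the command line, the traditional approach is OpenPGP via GnuPG. Here is a minimal sketch of the round trip, using a throwaway key and hypothetical filenames; in real use you would generate a passphrase-protected key and import your correspondent’s public key instead:

```shell
# Use a temporary keyring so this demo doesn't touch ~/.gnupg
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Create a demo key with no passphrase (fine for a sketch, never for real mail)
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-gen-key demo@example.com default default never

# Encrypt a message so only the holder of the matching private key can read it
echo "Meet me at noon." > message.txt
gpg --batch --yes --encrypt --armor \
    --recipient demo@example.com --output message.asc message.txt

# The recipient decrypts with their private key
gpg --batch --decrypt message.asc
```

The ASCII-armored message.asc is what actually travels in the email body or as an attachment; sender and recipient still have to verify each other’s key fingerprints out of band for any of this to mean anything.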

Mobile-based secure texting is always hard to trust, since we all know what happens on mobile devices: someone takes a screen grab of whatever is of interest, and back onto the internet it goes! The FBI, hardly a leader in computer science, was still able to crack the iPhone within a few months of the San Bernardino shootings in late 2015. If the FBI can crack the most popular encrypted phone on the planet that quickly, it is clear all mobile devices are a liability to security. Yet you still have to communicate, which is why server-based technologies are what we have focused on in our business since 2002. With a server-based approach, you can update most of the vital email security in one place, rather than waiting for users to update their own systems to take advantage of security patches and enhancements.

Remember, whatever device or program you use to communicate with someone, ensure that it is appropriate for the kind of information being sent. Protected Health Information (PHI), for example, is any kind of electronic or written information pertaining to a person’s health, conditions, diseases, or medications. Even communicating that someone has an appointment with an oncologist could be considered a breach of PHI, which can expose both companies and individuals to civil and criminal prosecution! This means employees could get sued personally for knowingly sharing PHI with unauthorized entities. Health information, as well as financial data, passwords, and other types of Personally Identifiable Information (PII), is best kept protected, encrypted, and out of the cloud whenever possible.

Jon Lybrook

“Keep your data, wherever you keep it, protected and encrypted!” — Jon Lybrook, Tera Bear Consulting


Posted in Application Hosting, Secure Server Email | Comments Off on How do I Encrypt my Email?

FreeBSD vs. Linux


On the surface, the GNU/Linux project and FreeBSD are both considered “open source” software, which implies “free” to use. In fact, there is a lesser-known difference between the GNU GPL and the BSD license that might be of interest to entrepreneurs and developers. As the FreeBSD project’s own comparison puts it:

“Linux is available under the GNU General Public License (GPL), which is designed to eliminate closed source software. In particular, any derivative work of a product released under the GPL must also be supplied with source code if requested. By contrast, the BSD license is less restrictive: binary-only distributions are allowed. This is particularly attractive for embedded applications.”

Technically, this means that if you distribute a product that incorporates any GPL code, you may be required to hand over the source code of your own derivative work, if requested. That is a clear legal and business reason for choosing FreeBSD over a generic flavor of Linux.

For more specifics on the differences between GNU/Linux and FreeBSD, check out the following links:

https://www.freebsd.org/doc/en/articles/explaining-bsd/comparing-bsd-and-linux.html

https://www.reddit.com/r/sysadmin/comments/3odphp/why_do_you_love_freebsd/

Posted in PHP Web Development | Comments Off on FreeBSD vs. Linux

Updating pkg FreeBSD 10.1

So, if you’re installing or upgrading to the current FreeBSD OS 10.1-RELEASE branch, you’ve probably heard…

The preferred way to install open source packages is now with the new package management tool, simply named ‘pkg’.  From the man page:

pkg provides an interface for manipulating packages: registering, adding, removing and upgrading packages.

Once you’ve got an account on a FreeBSD system version 10.0 or later, you’ll be able to ‘man pkg’ and see all the nifty options.  pkg was available before, but not necessarily installed by default as it is with version 10.0 and later (depending on which FreeBSD branch you’re following: CURRENT (bleeding-edge development), STABLE (kind of tested), or RELEASE (mostly tested)).  To me, the name STABLE sounds like it’s production-worthy.  Not the case.  For the best-tested and most reliable version of the OS to date, conventional wisdom advises installing the RELEASE branch of FreeBSD, the one produced by the release engineering team.

To see which branch you’re running issue the command:

 uname -a

and you should see output equivalent to:

10.1-RELEASE-p9 FreeBSD 10.1-RELEASE-p9 #0: Tue Apr  7 01:07:33 UTC 2015     root@amd64-builder.daemonology.net:/usr/obj/usr/src/sys/GENERIC  i386

One of the key differences in moving from FreeBSD 9 to FreeBSD 10 is the new command and syntax to install and deinstall programs.  It is simple from a user’s perspective.  For example, from your shell of choice:

pkg install perl5-5.18

…will install Perl version 5.18.  Similarly:

pkg delete perl5-5.18

…will delete it!  To see all versions of Perl you have available to install do:

pkg search perl

Upgrading a package is just as easy:

pkg upgrade perl

should do the trick.

To update ALL packages that can be updated:

pkg upgrade

If installing via ‘pkg’ doesn’t work for some reason and it bombs, you can fall back to installing from the ports tree using ‘make’, e.g.:

cd /usr/ports/lang/perl5.18
make config
make clean install

Don’t worry about pkg losing track of whether you used it or the ports method above.  pkg tracks your installations, deinstallations, and dependencies faithfully, whether you employ the pkg command or the more traditional ‘make’ method shown above.

Finally, there’s a recommended standard everyone at the FreeBSD project seems to agree on.  Before pkg, FreeBSD was something of a quagmire of mixed messages as far as ports and package management were concerned.

The addition of increasingly reliable package management systems like pkg is one of the many reasons Tera Bear Consulting relies on the FreeBSD UNIX operating system with confidence.  It has stood the test of time for us in the critical areas of reliability, scalability, and rock-solid online security.

Powered by FreeBSD

Upgrading to FreeBSD 10.1 is a snap with the pkg package manager!

Posted in Application Hosting, FreeBSD, FreeBSD System Administration, Managed Dedicated Server Hosting, open source, Perl Development | Tagged , , , | Comments Off on Updating pkg FreeBSD 10.1

Package Update Purgatory on FreeBSD

Typically, upgrades to a FreeBSD UNIX system are a breeze.  Installed packages are automatically checked for security updates, and advisories are emailed to let you know when updates are available.  Occasionally things get messy, however, and work-arounds must be employed.

There are a number of different package managers, and once you find one you like, you really want to stick with it.  At Tera Bear Consulting we use portmaster.

Recently, however, I had an issue where subversion wouldn’t get upgraded using our friend portmaster.  I tried, but it wouldn’t budge, giving me the error:

>make: cannot open Makefile.

Yuck.  I tried uninstalling and reinstalling the latest version, still with no luck.  I tried what was suggested in /usr/ports/UPDATING, which is sometimes the key, but not in this case.  The package upgrade went smoothly on other systems, so I knew there was nothing wrong with the update itself; it had to be this system.  Normally I cd to the ports directory where the package lives and do make deinstall.  However, since the port had been deprecated, it wouldn’t remove itself.  What finally worked in this case was:

 pkg_info | grep subversion

>subversion-x.x.x    Version control system

where x.x.x is the version currently installed.  pkg_info confirmed it was still installed, and which version.  To remove it, I did:

 pkg_delete subversion-x.x.x

With that gone, I was still unable to install from the ports tree, getting the same Makefile error as above.  So I used the package installer:

 pkg_add -r subversion

This installed an earlier version of subversion than desired, but from here I was able to do:

portmaster subversion

which upgraded the system without a hitch.  No version number is required when using portmaster; it knows.  Now to verify:

pkg_info | grep subversion
>subversion-y.y.y    Version control system

Voilà.  We’re up to date, and consistent.


Posted in FreeBSD System Administration, portmaster, UNIX | 638 Comments

Fixing Problem with Postgrey 1.34_7 under FreeBSD 9.2

Most of the time, FreeBSD documentation is complete and more than anyone could ask for.  With the release of Postgrey 1.34_7, this was not the case.  I’ve been running some form of greylisting on my mail servers for almost a decade now, and while greylisting itself is fast and simple, there is virtually no error handling built into this package.  While it is commendable that the maintainers have opted to move the startup script into a more conventional framework, the manual changes required to get the program running were mostly undocumented and led to hours of trial and error and searching the internet for a solution.  I’m hoping this post will help some sysadmin somewhere until the problem is addressed by the FreeBSD package maintainers.

20130525:
AFFECTS: users of mail/postgrey

The RC script for postgrey has been modified. If you use the  default value for postgrey_flags this does not affect you.

If you have postgrey listening on a Unix socket or set any optional  values, please read the comments in the RC scripts and check your settings in rc.conf prior to restarting postgrey.


I’ve been using the same procedure for several years, and I use the default values, or so I thought.  No matter what I tried, postgrey would fail to start, with various warnings but no direct clue as to what was wrong:

> service postgrey start
> Starting postgrey.
> Pid_file “/var/run/postgrey.pid” already exists.  Overwriting!

Checking the processes with ps -waux | grep postgrey made it clear nothing was running, in spite of the message above.

Reading the comments in the new startup script in /usr/local/etc/rc.d/postgrey made it pretty clear what needed to happen:

# Add the following lines to /etc/rc.conf to enable postgrey:

# postgrey_enable (bool)        Set to ‘YES’ to enable
#                              Default: NO
# postgrey_dbdir (path)         Location of postgrey database files.
#                               Default: /var/db/postgrey
# postgrey_flags (extra args)   Additional command-line parameters.
#                               Default: --inet=10023

The command_args value in the postgrey startup script contained parameters that seemed redundant with what was already in postgrey_flags in /etc/rc.conf.  Indeed, adding these values to postgrey_flags in /etc/rc.conf caused duplicate parameters to appear in the process.  The startup script would fail for want of parameters in rc.conf, but adding them to rc.conf caused redundancies.

>ps -waux |grep postgrey

>postgrey  9278  0.5  0.3  22088  11204 ??  SsJ  12:05PM 0:00.02 /usr/local/sbin/postgrey --pidfile=/var/run/postgrey.pid --inet=127.0.0.1:10023 -d --x-greylist-header=X-Greylist: delayed %t seconds by postgrey-%v at %h; %d -d --pidfile= --dbdir=/var/db/postgrey.pid

The solution was to add all the parameters to rc.conf and comment them out of the startup script like so:

Add to /etc/rc.conf so postgrey will run:

postgrey_enable="YES"
postgrey_greylist_header=${postgrey_greylist_header:-"X-Greylist: delayed %t seconds by postgrey-%v at %h\; %d"}
postgrey_pidfile="/var/run/postgrey.pid"
postgrey_flags="--pidfile=${postgrey_pidfile} --inet=127.0.0.1:10023 -d --x-greylist-header=${postgrey_greylist_header}"

Comment out the following lines in /usr/local/etc/rc.d/postgrey to prevent redundancies:

#pidfile=/var/run/postgrey.pid
required_dirs=${postgrey_dbdir}
#command_args="-d --pidfile=${pidfile} --dbdir=${postgrey_dbdir}"
stop_postcmd="rm -f ${pidfile}"
run_rc_command "$1"

Bingo.  Finally a running postgrey process!

postgrey  9278  0.5  0.3  22088  11204 ??  SsJ  12:05PM 0:00.02 /usr/local/sbin/postgrey --pidfile=/var/run/postgrey.pid --inet=127.0.0.1:10023 -d --x-greylist-header=X-Greylist: delayed %t seconds by postgrey-%v at %h; %d -d --pidfile= --dbdir=/var/db/po

Posted in FreeBSD, Greylisting, Postfix, Postgrey, UNIX | 664 Comments

Backups, Backups, Backups

One of my first jobs in the computer industry in the mid-90s was that of a Computer Operator. Now it sounds like a funny job title since, after all, who doesn’t operate a computer nowadays? It was, and still is, a specific role in a large production environment where mainframe computer systems and industrial-grade computers are still in use.

As a Computer Operator for the Geological Society of America, I had to know how to process and retrieve data from files and databases on a variety of media, including 9-track tape, the kind you see in 1950s science fiction movies and old IBM documentaries depicting computers.

There I learned the three most important jobs of computer operations which are, in order of importance:

Backups, backups, and backups.

Like most things boring and mundane (exercise, yard work, and taking vitamins, to name a few), performing regular backups at least weekly, and, just as importantly, verifying those backups, can keep you out of trouble.  You don’t need to be a company as big as Google to see the necessity of this task.  Human error and data corruption happen all the time, and while the time spent rebuilding a computer system is not trivial, it is fixable.  Your precious data, such as photos, documents, and scans, on the other hand, may not be recoverable without good backups.  The best way to ensure that data is available in the event of a catastrophe is to have a whole copy of it.

One theory behind “Cloud Computing” is that multiple copies of your data exist in various locations at once and are continuously being updated.  While this sounds cool in theory, the reality is that copying data like that is not only less secure, since more copies are vulnerable to hackers, but there is a strong likelihood your data will fall out of sync at some point.  I wouldn’t rely on any single technology any more than I’d rely on any single individual or entity when it comes to backups.

There is the old chestnut about the backup operator who diligently made the same backup on the same backup tape week after week.  The data had been deleted by someone two weeks prior, so of course having a backup from one week ago didn’t help.

Ideally you will do a complete backup of your critical files at least once a week and rotate them out to one or more offsite locations every week.  Having four different drives is not uncommon, as well as keeping an annual or semi-annual backup just to be safe.  External USB drives are ideal for this backup method.  Once you establish a system, test it periodically and always check the log files to ensure what you think happened actually did happen.
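As a concrete illustration, here is a minimal POSIX shell sketch of that weekly routine: archive, verify, rotate. The paths and the four-archive retention policy are illustrative assumptions, not a prescription:

```shell
#!/bin/sh
# Sketch of a weekly backup: archive a directory, verify it, rotate old copies.
backup() {
    src=$1                # directory to protect, e.g. "$HOME/important"
    dest=$2               # e.g. a mounted external USB drive
    archive="$dest/backup-$(date +%Y-%m-%d).tar.gz"

    mkdir -p "$dest"
    tar -czf "$archive" -C "$(dirname "$src")" "$(basename "$src")"

    # Verify: listing a corrupt archive makes tar exit non-zero
    tar -tzf "$archive" > /dev/null || { echo "backup FAILED" >&2; return 1; }

    # Rotation: keep only the four most recent archives
    ls -1t "$dest"/backup-*.tar.gz | tail -n +5 | while read -r old; do
        rm -f "$old"
    done
    echo "$archive"
}

# Example weekly run (e.g. from cron):
# backup "$HOME/important" /mnt/usb-backup
```

The verification step is the part people skip; checking the archive listing, or better, actually restoring a file from it, is what turns “I think I have backups” into “I know I do.”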

“Mission critical” computer systems are often built on a RAID system.  RAID stands for “Redundant Array of Inexpensive Disks” (or “Independent Disks,” depending on who you talk to).  This means your data and/or operating system is spanned across two or more disks, so that if one hard disk fails, the others keep working and the data stays accessible.  This is a type of fault tolerance to ensure systems stay running and data remains available, but it is no substitute for having data backups.

We had at one point a web system running a RAID that supported both the OS and the data, but when one of the hard disks became corrupted, it simply propagated that corruption to the rest of the array.  Luckily the system was able to limp along while we rebuilt a new system from scratch and transferred the good data over to it from backups.  The system was up and running at full capacity again in a matter of hours.

Just as most teenagers supposedly don’t learn respect for the destructive power of an automobile until they hit something, most computer users don’t respect the importance of having backups on hand until there’s a problem and their data is lost.  They are too busy and important to be bothered.  Magical thinking.  Don’t be that guy.

Also, don’t rely on anyone else to have backups of your data.  Your IT person or webhost may promise they are backing up your data, but if something goes wrong and they didn’t do it, excuses will matter little.  Having your website and computer data at the ready yourself is really the only solution.  As the aphorism goes, “If you want something done right, you’ve got to do it yourself!”

And always remember the three most important things in computer operations:  Backups, backups, and backups.

Posted in backups, Google, Uncategorized | 34,598 Comments

Google / Verizon Creates Email Problems on Android OS 4.3

A client visited me the other day at my request.  He had gotten a brand new Android phone from Verizon, running the latest OS 4.3, and could not connect to one of my company’s secure email servers with it.  I thought that was strange, considering my wife had only recently gotten rid of her Android but had been able to connect for years.  I wanted to see for myself what was going on.

The client had called Verizon tech support three times earlier in the week.  One person told him that only Gmail and AOL accounts could connect through the Android, yet they banged their heads against the wall with him nonetheless, trying in vain to get it to work as it should.  Surely Gmail and AOL weren’t the only email services available on an Android?  I mean, it’s open-source software run by the “don’t be evil” guys… Right?

The Secure POP3 settings, ports, server name, username, and password were all correct.  Yet the error message persisted:  “Error connecting”.  We tried manual and auto-detect setup.  Invariably the Android software tried to make up mail server names that didn’t exist, such as pop.myserver.com and the like.  I’d change the settings, and Android would put them back to the incorrect values again while I chased my tail, just as my client had been doing for weeks.

I double-checked all the settings and even looked at the mail logs to confirm the device, in fact, wasn’t even connecting to the server.  We rebooted the phone.  The email problems on Android 4.3 persisted.  We Googled it.  Nothing was documented that I could find about Androids not being able to connect to standard POP3 mail servers.

Finally I decided to jump ship and have him install a 3rd Party mail client for Android called K-9.  In less than a minute he was connecting to the server, checking and sending mail.

What gives?  Could it be that Google and/or Verizon have made it so no servers other than the ones they decree can use the default mail client on Android?  Sure seems that way.  It’s not enough that both have the lion’s share of their respective businesses worldwide.  Apparently they also need to sweeten their share even more by frustrating end-users, ultimately causing the majority of them, who have no interest in troubleshooting these kinds of email problems, to throw their hands up in the air and resort to using only Gmail.

More and more, it appears Google and/or Verizon are blatantly turning to the sleazy side of the force, protecting their market share at the expense of end-users, who acquire brain damage trying to make things work as they have in the past.  This is almost identical to how Microsoft behaved back in the 90s out of fear of the competing Netscape browser: breaking things so users relent and use only their product.

One shouldn’t have to bail on the default mail client of any OS just to get it to work.  Yet that was the solution in the case of Google Android.  Hoping this saves someone time.


Posted in Google, Uncategorized | 905 Comments

Why does a Web Developer Stoop to Using WordPress?

As a web developer, whenever I’m saddled with a project, whether it’s for my own company or someone else’s, the first thing I do before writing a line of code is write down all the requirements and search the web to see if a premade, open source solution is available.

Open source is a catch-all phrase referring to any software whose underlying code is explicitly given to the world at large, with the stipulation that while it can be modified and used in any way (commercial or not), it may not be modified and then claimed as someone else’s own.  While specific open source licenses often carry their own limitations, that’s the general idea.

So why would a web developer stoop to using WordPress? The reason I use WordPress, in addition to customized content management systems we’ve written at Tera Bear Consulting, is the same reason I don’t write my own database software and rely on MySQL — why spend time reinventing the wheel?  Granted, there are sometimes very good reasons to write a custom application such as when the requirements are very specific, or when there is proprietary information at stake that needs to be protected.  Oftentimes the security of open source software is also superior to that produced by freelance developers, many of whom are not security-focused in their approach.

I once had an acquaintance talk smugly about people who use the Apache webserver, and how he didn’t need to since Python allowed him to write his own so quickly and easily.  I didn’t care to get into a discussion with him about the fact that the Apache project has been in development for nearly 20 years and has more useful add-on modules than Microsoft has parking spaces in Redmond, Washington.  I don’t get into debates about tools; everything was written with a purpose, whether to do something new or to do something better than the other guy.

We released Pro Artist Websites around the same time WordPress was first released in 2003, with the aim of it being one of the first truly flexible, easy-to-use website management tools.  While it is not as functional now as WordPress is when all its extra add-ons are properly installed, it also doesn’t break every time the main application is updated, unlike a lot of the 3rd-party tools many WordPress users have come to rely on.

Pro Artist Websites also doesn’t stall out while waiting for an AJAX component to complete a connection, since we use mostly server-side dependencies rather than relying on the client’s browser to do the heavy lifting.  Writing code to be as server-side dependent as possible is the old-school approach we’ve tried to adhere to when the requirements of a project allow for it, because it is generally a more stable approach.

The more flexibility and features a given web application provides, the easier it is for one developer’s carelessness to break the site for thousands of users.  It’s the same reason we choose FreeBSD Unix over Linux, and why I shun the Google Android in favor of the Apple iPhone, but I’ll save those topics for another blog post.

Until then, aloha and happy surfing.

Posted in CMS Software, Custom Content Management Systems, Managed WordPress Hosting, open source, Software as a Service (SaaS), Uncategorized, WordPress | Tagged , , | 22,009 Comments