
Updating a GoDaddy Website with rsync

Though GoDaddy.com hosting supports command-line access via SSH, including use of the scp and sftp commands, it does not support rsync.

How do you update a website without rsync?

After much scripting in an attempt to duplicate a small subset of rsync's functionality, I recently discovered a way to rsync files to a GoDaddy.com website.

With ssh, scp and sftp, GoDaddy.com has all that's necessary on the receiving end to support SSHFS (Secure SHell File System) mounts.

SSHFS is built on the FUSE userspace filesystem, so if it's installed on your system, any user should be able to mount remote filesystems with it. This lets us transfer files over a secure, encrypted link using rsync, since rsync thinks it's doing a local copy from one directory to another. The cp command, as well as all of the other Linux utilities, is likewise at our disposal.
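
If you'd like to test the idea before committing to a permanent configuration, SSHFS can also be invoked directly. A minimal sketch, using the same illustrative user, host and paths as the steps below:

    # One-off mount, no /etc/fstab entry needed (values are placeholders):
    sshfs user@abcwebsite.com:/home/content/a/b/c/abcwebsite.com /ABCwebsite

    # ...work with the remote files as if they were local, then unmount:
    fusermount -u /ABCwebsite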

Here's how I used it on my system...
  1. I created a mountpoint directory
    mkdir /ABCwebsite
  2. Then added an entry to /etc/fstab
    sshfs#user@abcwebsite.com:/home/content/a/b/c/abcwebsite.com /ABCwebsite fuse defaults,_netdev 0 0
  3. Then mounted the remote GoDaddy.com filesystem
    mount /ABCwebsite
  4. Then rsync'ed the desired directory
    rsync -lDzO /root-of-dir-tree-to-upload /ABCwebsite/dir-to-upload-to

The rsync switches used are...
  • l - Copy symlinks as links instead of copying the files they point to
  • D - Recreate devices
  • z - Zip (compress) files for transfer
  • O - (capital O) Omit setting timestamps
    This prevents the "failed to set times" error that occurs when the file owner on the remote website does not have the same user ID as the file owner on the local host the files are being sent from, which will always be the case when uploading to GoDaddy.com.
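
Putting it all together, the whole update reduces to a few lines of shell. This is just a sketch under the assumptions above; the mountpoint and paths are the illustrative ones from the steps:

    #!/bin/sh
    # Sketch of a one-command website update (paths are placeholders).
    set -e                   # stop immediately if any step fails
    mount /ABCwebsite        # uses the /etc/fstab entry from step 2
    rsync -lDzO /root-of-dir-tree-to-upload /ABCwebsite/dir-to-upload-to
    umount /ABCwebsite       # unmount once the sync completes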



Software Building Codes?

My experience has been that good developers haven't given up the fight and are struggling mightily every day to be allowed to use standards and techniques known to improve software quality.

Today's obsession with rapid development, however, often earns those developers admonishments for wasting time and labels of "unproductive". The eventual result may be their replacement with more eager-to-please folks who will slap something together to meet arbitrary deadlines set without any real-world assessment of the time required to do the job right.

The same would be true in the building industry if there were no building codes, which attempt to enforce a minimal level of safety and reliability on structures. Yes, there is corruption of that effort, both in the inclusion of codes to shut out competitors and in the ignoring of codes to increase profits. Nothing's perfect.

Robert Reich, economist and former U.S. Secretary of Labor, recently (March 2011) commented on government regulations in general: though many say regulations may kill jobs, he points out that a lack of regulations kills people.

When government does something that increases costs, the main problem for a business is usually that those costs were not factored into its budget. They couldn't be, since they were unknown when the budget was being developed. That's why many laws with obvious economic impact are set to take effect at some point in the future, giving businesses a chance to plan for them.

Software errors regularly cause tremendous economic losses to businesses and individuals. According to the National Institute of Standards and Technology, flawed software cost the U.S. economy $60 billion in 2002. What no one wants to admit, however, is that software errors kill people.

NASA's tragedies are just the most visible examples. Government and medicine are two fields with lots of blood on their hands from software errors though few realize it and even fewer will admit it.

Too much is demanded of most workers today for them to be able to make decisions or do their jobs unaided by computers. We must acknowledge that computers have become as integral a part of the team as any human members. The nature of the computer's role is such that there is rarely any practical way to manually verify that it is correct or override its actions.

If we are forced to rely upon computer software for life-saving care and life-threatening decisions, doesn't it need to be at least as minimally reliable as our homes and offices?

Perhaps it is time for some kind of software building (development) codes.


First published by Robert C. Watson on PragProg as "Software Development Codes?", 08/06/2004. Updated and revised here 08/01/2005 and 03/25/2011.


Proprietary File Formats

When it's named like a Docx, looks like a Docx and is used like a Docx... it's a Zip?

I was nearly finished with a post about the dangers of storing files in proprietary binary formats vs. universally readable ASCII text, prompted by my frustrations trying to read a .DOCX file someone sent me. That's when Google finally brought me to an article that reminded me that Microsoft Word 2007 .DOCX files are really .ZIP files consisting of several .XML files.

Since I'm still running MS Word 2003 (which hasn't a clue about Word 2007's DOCX format), I was initially unable to read the file, and my journey of discovery had, until then, begun to resemble a trail of tears.

At one point I loaded the file into a text editor to get an idea of its true contents. A DOCX file is completely unreadable this way. DUH! It's a compressed ZIP file.

Instead of moving all of the code into a single .HTML file, Microsoft chose to break the file up into even more .XML files, put them in a 3-level subdirectory structure and ZIP the whole thing. While ZIP has long been an open and widely used packaging and compression format, which makes it tremendously better than Microsoft's proprietary DOC files, it's still a single point of failure in the technology required to read the information.

By contrast, in an HTML file everything is in printable ASCII text, even the formatting codes. Thus, even without a clue about the meaning of those codes, the actual information can still be easily extracted.

Once you change the file's extension from .docx to .zip and unzip it, the various .xml files, manifest, etc. are in plain ASCII text, so this is an improvement. Of course, the vast majority of Microsoft users do not have the time, interest or knowledge to do this, so it's just one more headache to be endured as they try to share files in their quest to get their jobs done.
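
For the curious, here's roughly what that looks like from a Linux command line. The filename is illustrative, and most unzip tools will accept the .docx directly, making the rename optional:

    # Unpack a .docx to see the XML inside (filename is a placeholder):
    cp report.docx report.zip             # optional -- unzip can read the .docx as-is
    unzip report.zip -d report_contents   # yields [Content_Types].xml, word/document.xml, etc.

    # The actual body text lives in word/document.xml, readable in any text editor:
    less report_contents/word/document.xml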

Microsoft's solution to this dilemma is their Compatibility Pack, which updates Word, Excel, etc., but it requires you to have applied all critical updates to Office before installing it. After being burned a few times by Microsoft updates that crashed critical programs I was using, lost data or introduced bugs worse than the ones they fixed, I turned off automatic updating. The backlog is quite large by now. Judging by all the virus scanners and other tools I run, though, I don't have the security problems Microsoft tries to scare everyone with either. Gee... I must just be incredibly lucky, I guess.


By making DOCX the default save format, Microsoft dramatically increases the market penetration of the new format, as only techies are likely to change the default back to the old format to prevent these problems. And of course it generates lots of support revenue from people trying to read the file they were just sent and must act on yesterday.

Thus, the new format provides no advantage, improvement or "openness" to the public at all. It is simply another way for Microsoft to lock customers into their expensive products. They called it "open" to appease the EU, assuming that no decision-maker there or business person here had the technical knowledge to see through the charade.



Reliable Software

We've known for decades how to prevent software bugs...

  1. Design from the top down and build from the bottom up.
    Only reuse modules that are completely debugged and reliable. Code built on unreliable code will be unreliable no matter how perfect it is.
    (Here I use "module" generically to mean any identifiable block of code... variously called "subroutine", "method", "procedure", "function", "macro", etc.)

  2. Don't allow any module to have side effects.
    Side effects cannot be documented sufficiently to make them fully known for future work. They are thus inherently unreliable. (A small sketch contrasting a side-effecting module with a side-effect-free one follows this list.)

  3. Software development is non-linear. Plan For It!
    An "80/20" rule is far closer to reality than the linear projections forced upon most software projects. To produce reasonably reliable software (we're not even going for "bug-free" nirvana here), about 80% of the development time will be spent on the 20% of the code at the bottom -- the lowest level -- the first modules upon which everything else is built.

  4. Large groups cannot produce good software.
    Practically perfect software can only be produced by a team of... one. No one can wait long enough for one person to build the huge systems used today, though, so we have to sacrifice some perfection for the timeliness that teams can achieve. Teams of up to around 10, where each member excels in a different discipline and is responsible for a well-defined component of the project (e.g. GUI, business logic, database, testing, cat herding), can work well. Cat herders (leaders) unthreatened by more technically skilled team members can be hard to find, though. We need to cultivate more of them.
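
Here is the sketch promised in point 2: a small shell example (the function names are mine, purely for illustration) contrasting a module with a hidden side effect against one whose entire effect is visible at the call site:

    # A module with a side effect: it silently mutates the global 'total'.
    total=0
    add_with_side_effect() {
        total=$(( total + $1 ))  # hidden state change, invisible at the call site
    }

    # A side-effect-free version: the result comes back only through stdout.
    add_pure() {
        echo $(( $1 + $2 ))
    }

    add_with_side_effect 5       # 'total' is now 5, but nothing here says so
    sum=$(add_pure 2 3)          # the effect is exactly what the call site shows
    echo "$sum"                  # prints 5
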
Software development as it exists today, with its total obsession with speed of development, is untenable.

In business, software is the embodiment of a company's competitive strengths. How can a company that does everything just like its competitors hope to best them?

Government and non-profits are highly dynamic and diverse in their missions. Every software project is thus unique. How can they afford to pay the premium of profits on inferior products with their comparatively modest budgets?

Commercial software products today are much like America's luxurious but highly unreliable cars of the mid-twentieth century... no longer affordable.

Open source projects and in-house development build on a much greater depth of knowledge of the processes being automated and thus produce more reliable and more productive systems.

Can we afford not to change?
