2012-08-09

The magic button — Make Everything OK

The sad thing is that there are way too many people who think that such magic actually exists and therefore they don't have to do the hard work needed to reach their goals.  Sigh.  But still fun to push.

2012-07-04

Modern Togetherness

A couple, each working on their own things in the same small room, with music on headphones to tune out the distracting environmental sounds, staying in touch with instant messaging, even though they could just turn to touch the other (they do some of that too :). 

2012-06-29

Link rot

Sigh. I just went through a round of updating one of my websites, and part of that was dealing with link rot ( http://en.wikipedia.org/wiki/Link_rot ). I can understand whole websites going away when their sponsor ceases to exist, but what is really frustrating are the sites that shuffle everything around every few years for no apparent reason; "to improve the user experience" is just too lame an excuse for most of these changes.  Worse yet is the retirement of data just because it is 'old' (aka age discrimination), as if nobody ever keeps products running past their design life, or as if newer versions actually migrated the excellent first-generation documentation into the current sets.  I have made use of 20 year old manuals because they were the most comprehensive for what I was doing, even if they missed some recently added features.  I make a point of trying to keep all my links consistent so that even when the root domain has been switched out from under me, the rest of each link can still be used at the new location.

I try to follow Tim Berners-Lee's advice in http://www.w3.org/Provider/Style/URI as much as possible on this front.
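When content really does have to move, server-side redirects can keep the old URLs alive. As a concrete illustration (a hypothetical Apache httpd snippet, not necessarily what my sites use), the mod_alias directives handle both single pages and whole relocated trees:

```apache
# Hypothetical example: keep old URLs working after a site reorganization.
# A permanent (301) redirect tells browsers and search engines the new home.
Redirect permanent /oldsection/page.htm /newsection/page.htm

# Or map a whole relocated directory tree with one pattern rule:
RedirectMatch permanent ^/docs2005/(.*)$ /docs/archive/2005/$1
```

A few lines like these cost the site maintainer almost nothing, and they are the difference between a working bookmark and a 404.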

2012-06-19

Assumptions: All computer operators are programmers

As a technical person who came up through the DOS/OS2/NetWare/Windows path, it has been an interesting set of challenges learning to install and support Linux systems.  One of the biggest has been the assumption, common in the open source world that Linux is a part of, that everyone who isn't a total end-user must be a full programmer.  This is as clear in much of the documentation (man/info pages) as it is in many of the forums and chat groups.
An example is the documentation for the Linux 'free' command, which tells you all the switches for the command but gives no indication of what its output really means, because that is supposedly obvious.  I think I have it mostly figured out at http://www.konecnyad.ca/andyk/freemem.htm, but only through lots of Googling and finding blogs that actually go into the different points and explain them.
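For anyone else puzzling over it, here is a minimal sketch (in Python, with made-up sample numbers, using the classic 'free' output layout) of the interpretation I eventually pieced together: the number that matters for "am I running out of memory?" is not the 'free' column, but free plus buffers plus cache, since the kernel gives buffer and cache memory back to applications on demand.

```python
# Parse sample output of the `free` command (values in kB).
# The numbers below are made-up for illustration, not from a real system.
sample = """\
             total       used       free     shared    buffers     cached
Mem:       2048000    1900000     148000          0     200000     900000
-/+ buffers/cache:     800000    1248000
Swap:      1048572          0    1048572
"""

def parse_free(text):
    """Return the figures that actually matter from `free` output."""
    for line in text.splitlines():
        if line.startswith("Mem:"):
            total, used, free, shared, buffers, cached = \
                (int(field) for field in line.split()[1:7])
            # Buffers and cache are reclaimed by the kernel on demand,
            # so the memory truly available to applications is:
            available = free + buffers + cached
            # And the memory applications are really holding is:
            used_by_apps = used - buffers - cached
            return {"total": total,
                    "used_by_apps": used_by_apps,
                    "available": available}

info = parse_free(sample)
print(info)
```

Note how the computed figures match the "-/+ buffers/cache" line that `free` prints: that line is doing exactly this arithmetic for you, which is the part the man page never spells out.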
In too many of the forums, I've seen people trying to learn being told that the answer was simply obvious and beneath requiring an explanation, or that it was a tiny bit of programming and therefore 'simple'.  This sounds like the arrogance you see in academia, like the physicist who claimed that the physics of a fusion reactor was proven and it 'only' needed some engineering to finish (that was over 5 years ago, and we still don't have that reactor design adding power to the grid that I can tell).
I would say this is one of the biggest obstacles to Linux getting anywhere near mainstream, because so many supporters keep treating mainstream users as not worthy of the support they need.  I can only think that many of those crying for mainstream acceptance are really just elitists who don't actually want to share.  But I will keep pushing to break those barriers, as I believe that open source is still the most efficient way forward for the bulk of IT.

2012-01-30

Gamma testing

While software is being developed, it goes through the well known steps of alpha testing (all in-house) and beta testing (invited customers through to anyone).  What is less well known is the next logical step: the period after the software is released to customers (i.e. when the developers start charging for it), which can be referred to as gamma testing.
When a piece of software has a major release, there are all kinds of pressures to "get it out the door" so that the vendor can start making a return on their investment.  From the customer perspective, they can now start piloting this new (version of) software in their own environment, during which they sometimes find major problems that should have been caught in beta testing.  This can make using brand new software a bit risky, which is why I often refer to that first while as gamma testing.  Gamma testing can be just the first month after release for products that turn out to run well out of the gate, whereas for more troublesome products it might not end until the first service patch has been released.
Gamma testing is certainly the time when we find out whether the product is the greatest yet of its kind, or whether it is likely to be a dud (i.e. Windows Vista, Microsoft Bob, even a few service packs come to mind).  It is also proof that the phrase "the latest and greatest" is often used incorrectly, as if the latest is always the greatest, and even greater than the versions that are yet to be created.
And why has this come to mind now, you might ask?  Well, I have to decide for a client whether some new servers being built next week will run the old, stable, but with-known-problems version of the OS, or the shiny new one that was released a week ago.  Now back to watching those OES forums.