Thursday, January 24, 2008

The Jibber-Jabber of the Spambots

One of the greatest things about the Internet has to be spambots. Oh, don't get me wrong, I hate spam as much as everybody else, but the folks who program these spambots have to be commended for their level of ingenuity.
To avoid detection, these spam generators produce nonsensical paragraphs of random words that slip past your filters, or at least confuse the bejeezus out of them. Every once in a while, though, you get one that produces almost avant-garde material. Witness, if you will, this little ditty received in my ancient Yahoo account...


Notice how it's arranged like a poem. The real message was an image that Yahoo's filters stripped out, leaving only this bizarre text. Here is the section of the email that contained the "poetry"...

Just high a today.
Line day us knew good means our.
Large their kind came school answer.
Thing back since air.
School go often since.

At until often across hard.
Point any life name parts.
Should were another called near means top a.
Large something important ways up he going far of.
Next which himself had since did through.
While way might can her.

Important both many soon show top air.
Took being all like.
Until also last since thing enough three father.
There before sea i get sentence right.
Line today enough means against way.

Far like hard whole last may see why.
After well know then a near live picture why.
With sea small should show get before life she.
I through far since.
First below want line.

Men best miles make men many each.
Again may a find which was white.
Did change i off these picture years like.

Reads like James Joyce. On acid. I'd love to take credit for this, but somewhere out there is a spambot that is churning out prose of a surreal variety. Impressive.
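Out of curiosity, here's a rough sketch of how I imagine such a generator works. This is purely my own guess, written in C, and not the spambot's actual code: keep a pool of common words, grab a handful at random for each line, capitalize the first one, and tack a period on the end.

/* A toy sketch of a spambot "poetry" generator: pick common words at
   random and string them into short, sentence-shaped lines. */
#include <stdio.h>
#include <stdlib.h>
#include <ctype.h>
#include <time.h>

/* A small pool of everyday words, much like the ones in the message. */
static const char *words[] = {
    "just", "high", "today", "line", "day", "school", "means", "large",
    "thing", "back", "since", "air", "picture", "sea", "before", "life",
    "again", "white", "years", "far", "near", "show", "top", "might"
};

int main(void)
{
    const int nwords = sizeof(words) / sizeof(words[0]);

    srand((unsigned)time(NULL));

    /* One "stanza": five lines of four to eight random words each. */
    for (int line = 0; line < 5; line++) {
        int len = 4 + rand() % 5;
        for (int w = 0; w < len; w++) {
            const char *word = words[rand() % nwords];
            if (w == 0) {
                putchar(toupper((unsigned char)word[0])); /* capitalize */
                fputs(word + 1, stdout);
            } else {
                printf(" %s", word);
            }
        }
        puts(".");
    }
    return 0;
}

Run it a few times and you get stanza after stanza of the same sort of jibber-jabber. The real bots presumably draw on bigger word lists and cleverer grammar, but the principle can't be far off.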

Sunday, March 25, 2007

Gadgets Versus Computers

Not too long ago, Craig Barrett of Intel derided MIT's OLPC, the "$100 Laptop," as a "gadget" and went on to say that "gadgets" weren't overly successful. That certainly gives one pause, and it's completely inaccurate. Gadgets, in actuality, are wildly successful. In fact, many are basically handheld computers with more power than systems that were in vogue just a few years ago (and, in fact, are still in use in many areas). But let's look at definitions.

WordNet (http://wordnet.princeton.edu) defines "gadget" as...
"appliance: a device or control that is very useful for a particular job."

Whereas Wikipedia defines computer as...
"...a device or machine for making calculations or controlling operations that are expressible in numerical or logical terms. Computers are made from components that perform simple well-defined functions. The complex interactions of these components endow computers with the ability to process information. If correctly configured (usually by programming) a computer can be made to represent some aspect of a problem or part of a system."

By these definitions, computers are, in essence, gadgets. So, Intel makes CPUs for most "gadgets" that we call "computers". Really, this is a case of semantics, and in actuality Mr. Barrett was drawing attention to the fact that the OLPC is not a computer in the early 21st century Western definition of the word. When we think of computers today, most of us think of the Windows XP (or Vista, or any number of alternative operating systems) machine that we have sitting on our desk or that we lug from place to place (perhaps "lug" isn't a good term, either; early "luggable" computers were beasts that frequently topped 20 kg). In addition to doing complex calculations (which is what computers were truly designed to do), we use them for word processing, graphics, entertainment, communication and data storage. These functions were once handled by a variety of items but can now be handled by a single device.
Believe it or not, home computers have been capable of these functions pretty much since the beginning. Even machines from the mid-1980s are capable of them. Let's look at one such computer, the Apple Macintosh Plus from 1987 -

Processor - Motorola 68000 8 MHz
RAM - 1 to 4 MB
Storage - 800 KB Floppy (Options - Internal and external), external SCSI
Display - 512 x 342, 9" black & white, 2-bit
Ports - 2 RS-422 serial mini DIN-8 for printer and modem

Admittedly, by today's standards, not impressive. However, it can still perform many of the functions that modern computers are capable of (including accessing the Internet, albeit in text-only form). It just isn't as capable as its modern descendants. The games will be simplistic, you may be able to play CDs with the right software and an external CD drive, and, if using only floppies, you'll have limited storage. Still, it is, in a sense, modern, and is definitely a computer. Now, compare this to a "gadget" I have hanging on my belt right now, my Palm IIIxe -

Processor - Motorola Dragonball EZ 16 MHz
RAM - 8 MB
Storage - 8 MB (same as above)
Display - 160 x 160, 4" black & white, 2-bit (actually, 4-bit grayscale)
Ports - RS-232 compatible serial (proprietary configuration) & IrDA

In many ways, it is the equal of the Mac Plus, though it's only considered a "gadget". When hooked up to my GoType keyboard, it comes even closer to being a "computer" rather than a "gadget".

Again, admittedly, this is all semantics. But, just as a 1918 Ford Model T is still as much a car as a 2006 Chevy Corvette Z06, my Palm IIIxe is every bit a computer, just not a very sophisticated one.

Friday, June 17, 2005

Sublime Tech

I'm sitting here listening to the Alan Parsons Project's 1977 classic "I, Robot". Only recently did I obtain another copy of this album, my first copy on CD, in fact. The last time I had listened to it was sometime in 1998. I always associated this album with not just its science fiction theme, but with the future in general. It's funny that for seven years it was no longer in my life.
As I listen to it, I find myself thinking about how pervasive technology has become in our lives. After all, here I am typing this in WordPad on a Windows XP machine (I prefer Macintosh, but that's another story), something that most of us see as commonplace. Not everybody has a computer, though, and some think they have somehow managed to avoid the "blight" of technology.
Think again.
Since the digital genie was released from its bottle, it has entered almost every aspect of our lives in one form or another. To think that we can do without it now is to fool ourselves. Even if we don't own any new technology and choose to live as the Amish do, the effects of technology are always there. Try to imagine where we would be in a world without technology... it's actually difficult. Whatever we do, newer technology is there.
Computers making calculations regarding weather patterns for months in advance, helping farmers know what to expect. Wristwatches with alarms and multiple settings. Televisions that are capable of disallowing certain programs from being viewed. Even if you completely shun computers and choose to live a life that resembles the 1970's, you will not escape tech. For whenever you listen to the radio, hoping to catch that ever elusive Pink Floyd song, you can bet that the station is not using an old analog tape copy, but a CD.
Ultimately, the best kind of technology is the type that sits in the background, or better, the underground, not making us aware of its presence. In-your-face technology isn't really the way to go for most of the masses. Like digital Morlocks, these technologies hide from plain sight.
But let's not allow those same digital Morlocks to devour us. Technology is best when it is sublime.

Wednesday, June 08, 2005

An Odd Victory - The Triumph of X86 Architecture

It was the news that everyone dreaded. Apple announced on 6 June 2005 that, by the end of 2006, the company would have transitioned to Intel x86 chips. What happened?
Steve Jobs is known to be a bit moody. His company has been held hostage by the makers of its CPUs, the very heart and brain of the computer. First Motorola and then IBM failed to deliver the promised improvements in chip design. Speed was never really gained, and it appeared as if all PowerPC development had stalled. Meanwhile, the folks at Intel have been steadily moving along with the x86 series of chips, most recently the Pentium 4, with speeds increasing to the point where the 3 gigahertz barrier was passed with ease. This from what is still essentially a 32-bit chip.
The biggest advantage to be gained in CPU development is market share. The more market share you have, the higher your profits and the more money you can drop into R&D. That is exactly what Intel did. They weren't very interested in radical improvements to the architecture; they were more interested in speed and capacity. That philosophy is exactly the reason the 3 GHz barrier was broken: don't do anything radical, just keep finding ways to milk out more speed. In the end, this approach has been wildly successful.
IBM, on the other hand, simply didn't sell enough PPC processors to Apple. That has to be the bottom line. The money wasn't there for R&D. Forget all of the promises; they failed to deliver. Besides, IBM will not miss the departure of Apple. They have more than enough business from the growing game console market, which will far outsell anything Apple can sell.
The move to Intel will, hopefully, be a smooth one. Jobs & company plan on a full year to implement the changes, and during the initial run-up there will be a new form of "fat" binaries: software that will work on both the PPC and x86 architectures. There are other advantages to be had by the move to Intel, such as the much-rumored Mac Tablet; surely, such a beastie would use something along the lines of a Centrino processor.
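For those wondering what a "fat" binary amounts to in practice, here is a minimal sketch of the idea (my own illustration, assuming GCC's predefined architecture macros, not Apple's actual tooling): the very same source is compiled once per architecture, and the resulting slices are bundled into a single file that the system picks from when the program launches.

/* A minimal sketch of the "fat" binary idea: one source file, built once
   for PowerPC and once for x86, with the slices merged into a single
   executable. The preprocessor macros show which slice ends up running. */
#include <stdio.h>

int main(void)
{
#if defined(__ppc__)
    printf("Running the PowerPC slice of this fat binary.\n");
#elif defined(__i386__)
    printf("Running the x86 slice of this fat binary.\n");
#else
    printf("Running on some other architecture entirely.\n");
#endif
    return 0;
}

The merging itself is handled by the developer tools (the lipo utility has long done this sort of gluing for Mach-O binaries), so one application carries native code for both processors and runs at full speed on either.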
Still, the transition is a sad one. PowerPC held so much potential. Instead, the spoils of the CPU war go to the old soldier, the one that can trace its lineage back almost directly to the first 8-bit processors. Sometimes, old soldiers don't die. Sometimes, they win.

Monday, May 16, 2005

In All Things (Tech), Moderation...

A good friend of mine is more convinced than ever that our modern world is leading humanity down a primrose path to failure. His biggest concern is that technology is making us lazier. In that respect, I think he's right. In the past couple of years, I've found myself swinging between luddite and technologist. The best of both worlds is the middle path: use technology, but do not let it use you.
The world we live in is going to move forward whether or not we ourselves are active participants. In my lifetime alone, we've gone from few households having televisions to most American households owning computers (no doubt that has to stick in the craw of former DEC president Ken Olsen, who said way back in 1977 that "there is no reason for any individual to have a computer in their home". DEC has been gone for a while, incidentally). The march of technology is in one direction, and that is more of it.
One can choose not to participate, of course. This approach works for many individuals and groups, and it is surely their right, one that I will gladly defend. However, if one wishes to live amongst the great mass of Americans and modern civilization, it is important to have at least a basic appreciation of the fact that technology is here, like it or not.
What I'd like to suggest is moderation. I'm a firm believer in useful personal technology, and by useful, I mean useful to the individual. It is not my place to tell individuals what they need. When considering useful tech, you must think about what you need. It's like groceries: while that 14-ounce porterhouse may look good, just how much of it can you eat? Is it worth the price? Wouldn't it be better to settle for sirloin if that is all your budget will allow?
Certainly, the technology companies would prefer that you always buy the latest and greatest; that's how capitalism works. But is it worth it, and do you really need it? In my case, I certainly attract a lot of attention when I write on my old Palm IIIxe and GoType keyboard (both five years old), or more so when I use my beloved Tandy Model 102 (where this is being written, 17 years old!). Could I use a newer computer? Yes. But why? These machines are more than capable of the tasks I use them for. If I wanted to play the latest generation of computer games, then yes, I suppose a newer laptop would be a good idea. But I rarely play computer games of any sort (guess that makes me a fuddy-duddy). The newest computer I own is already five years old, and I got it because it was being tossed. For my needs, the equipment I use is good enough. Why use a tank to crack a nut?
Again, though, it comes down to the individual. You have to decide what works for you. If you feel that there is too much technology in your life, try living without it for a while to see if you're any happier. If you are, then you've found your answer. If not, then you know how it affects you. As I've written in another column, and as is oft quoted, all things in moderation. You do not have to let technology govern you. In our world, it exists for us.

Tuesday, May 03, 2005

It Seems Like A Good Idea, But...: Municipal vs. Business Supplied WiFi

There is a very strong socialist bent in some of my ideas from time to time; not that I'm a socialist or anything. One idea that has always appealed to me is free or cheap Internet access for lower income people. Before the Internet boom of the mid-1990s, that very idea was met with boos and hisses from the online providers of the time. Keep in mind that back then, most services sold time by the hour; it wasn't unusual for me to get AOL bills of US$70 or more. The market drove prices down, and eventually the idea that municipalities should provide service seemed passé, especially when you have ISPs such as United Online with monthly fees of $10.
Now, we've entered the latest phase of digitally connecting the world: wireless, or WiFi. And, not surprisingly, the whole municipal-versus-business-supplied-service debate has reared its head again. But this time, it's gotten nasty. For instance, the news item from Yahoo News, "Cities Face Backlash As They Plan Municipal Wireless Services", gives you the essential lowdown on this fight.
This is not the first time that this issue has come to my attention, but whenever I read news like this, the first thing that crosses my mind is "here we go again"; big business stands to lose money in this, and is taking a legal approach to stop the competition. The flip side is, of course, how will these cities finance these projects? The reality is that it costs less to set up wireless networks than to lay DSL cable; the money in the wireless business is almost entirely concentrated in the service end, just as it was in the early days of digital communication when you used a modem and Ma Bell's pre-existing lines for connecting. That the providers would feel threatened is only natural, since they are essentially selling air. There is money to be made in this, and they know it.
There is room for both publicly and privately supplied WiFi. This wrangling smacks of dirty deals in backrooms. If a city decides to offer this service to its citizens, it should put it to a referendum, and if the voices rise up and say they're for it, that should be the end of it.

Monday, May 02, 2005

If You Can't Lead...

Bill Gates has decided that the next-generation Xbox is going to be a media hub...

Next-Generation Xbox to Be Media Hub

There are many out there who feel that the Mac mini comes very close to this concept already. In fact, the platform doesn't matter; any well-implemented home computer can be tasked with being a media hub. That was the basic idea behind the 1990s "Multi-Media Computer" concept, I believe. However, the greatest things the Xbox has going for it are its price and its legions of diehard followers. No doubt this will be a success, but only because the Xbox has already been around.
Also, Mr. Gates made what some feel were disparaging remarks about Apple's Tiger OS. As I read them, they just seemed to fall flat. If you want charisma, go to Mr. Jobs... of Apple Computer.


Because I'm Such A Technophile

In searching for reasons to start a second blog, it seemed obvious to me that the best thing to do was to just be honest: I love good technology; I am a technophile. This sounds like the sort of thing that a "Computer Junkies Anonymous" member might confess to, but here there is no twelve-step program. Here, I want to talk about what I consider good, usable technology. I doubt that I'll talk about gaming systems, though they are a major moving force in technology. Instead, it's usable tech, as previously stated: little gadgets that make our lives easier... or more complicated, depending on how well they're implemented.
From my own experience as an end user, I think it's very easy for serious tech types to forget that there are people out there who only long for things that are simple, easy to use and durable. In that area, I'm a bit luckier than most in that I've worked in a variety of tech environments and around a number of operating systems, even serving as a tech when needed. Ultimately, it came down to this: does it work, or doesn't it?
Thanks to those things that did, I've become a techno-junkie... err, technophile.

Hi, my name is Rob, and I'm a technophile.