February 29, 2004
RFID Tracking Concerns Lawmakers
If you're not yet concerned about RFID (Radio Frequency Identification) tags, you should be. These tiny devices can be embedded in many consumer goods, and there is growing concern about how they can be associated with individual consumer information and even tracked in public places by strategically placed readers. Wired News has a good summary in "Lawmakers Alarmed by RFID Spying", which reports on several states' attempts to enact legislation. Not so coincidentally, Wired also reports that German protesters have similar concerns.
Personally, I probably wouldn't care too much if someone knows I bought a pair of Levis, with the possible exception that I wouldn't want to get spammed by people trying to sell me more jeans (at least now I'm mostly protected by my state's Do Not Call List, which has been a huge blessing against relentless and rather pesky telemarketers). Regardless, I'd categorize that as more of an annoyance. Now let's take it one step further: As one person commented here previously, it could be used to present personalized on-demand advertising, a la Minority Report (and this is also mentioned in the first Wired News story above).
However, after that unique identifier gets associated with me, readers in public places could track my whereabouts. My concern is once the genie is out of the bottle, where will it end? Several years ago, I posted to one of the legal tech listservs that online data collection (e.g., cookies, spyware, etc.) could eventually be tied into the brick and mortar companies' databases and the crossover effects would be chilling. Not too long after that, DoubleClick tried to do exactly that. Fortunately there was much public outcry and DoubleClick adolescently stated they were very sorry and wouldn't do it again. Suffice it to say, there are still serious public trust issues.
This quote pretty much sums it up: "Some lawmakers now say that RFID tags in retail items may further erode consumers' privacy. 'There is clearly an upside for the industry,' said Massachusetts state Sen. Jarrett Barrios, 'but underlying that is a burden borne by the consumers. It's unnerving to me that the companies have no incentive to protect consumer privacy.'" Sure, consumers can vote with their wallets and try to boycott merchandise with embedded RFID tags. That may work in the beginning, as a few select companies get scorned by consumers. But what happens if the manufacturers and retailers decide to tough it out until most items on retail shelves and in online stores have them? In my humble opinion, under that scenario consumers would have little choice but to succumb to the situation and buy them under protest if there are no other reasonable alternatives.
Thus unless sellers bow to public outcry, the free market model may not work in this case. As the same article notes: "RFID technology is a surveillance tool that clearly can be misused, said Barry Steinhardt, director of the Technology and Liberty Program at the American Civil Liberties Union. 'To protect consumers, we need laws, not unenforceable policies,' he said."
But what laws should we enact? Should RFID be banned outright? Should it stop merely at "truth in labeling" so consumers can make informed choices? Do we borrow a page from the online privacy debates to implement "opt in" vs. "opt out" strategies, and thus attempt to allocate who should bear the burden that way? Or something different altogether? Certainly RFID has legitimate uses for inventory control. Somehow my gut tells me that none of the above will be the best solution, or worse, that there may not even be one due to the polarization that has already occurred. Only time and a lot of public debate will tell.
[Update 3/1/04: Techdirt has an interesting post on the potential for an RFID blocker tag. Apparently, researchers at RSA have begun demonstrating how the blocker tag works. As I mentioned above, I doubt a purely legal approach will adequately resolve the many RFID issues. As RFID is partially a technological problem, some creative technological approaches may help.]
February 27, 2004
Getting More Out of RSS Feeds
Whether you're new to RSS feeds (RSS = Really Simple Syndication or Rich Site Summary) or have been using them for a while, there is a huge universe of information out there. Whether you consume feeds or actively produce content, there are many useful tools of which you're probably unaware.
On the consuming end, both AbbeNormal and Weblogs Compendium have compiled large lists of news aggregators. These include online web services and a host of programs listed for Windows, Linux, PDAs (including iPods), Mac, and even Tivo platforms. The AbbeNormal list even includes many feed converters (e.g., eBay-to-RSS, RSS-to-E-mail, etc.).
For RSS publishing (e.g., we bloggers), Robin Good has published an extensive list of Best Blog Directory And RSS Submission Sites. While entitled the "RSSTop55", it currently lists over 60 sites for aspiring syndicators to use. Just as one traditionally submits a new web site or blog to the popular search engines, these are generally sites at which you submit your RSS news feeds in the hopes of expanding your reach.
Now if you really want to dive deep into the potential and "next gen" uses for RSS feeds, then I recommend "The Birth Of The NewsMaster: The Network Starts To Organize Itself", which discusses how RSS can be utilized to create a NewsMaster: someone able to find, aggregate, and manage desired information more completely. Put in his words, it is "the ability to concert, orchestrate, edit, and refine quality search formulas that tap into the whole RSS universe and beyond, and that filter out relevant content based on selected keywords, sources, type of content, ranking and many other possible criteria."
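The core of that keyword-filtering idea is simple enough to sketch in a few lines of Python. This is purely my illustration of the concept -- the sample feed and keywords below are invented, and a real newsmaster setup would pull live feeds from many sources:

```python
import xml.etree.ElementTree as ET

# An invented two-item RSS 2.0 feed for demonstration purposes.
SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>Sample Feed</title>
  <item><title>New RFID privacy bill introduced</title>
    <link>http://example.com/rfid</link></item>
  <item><title>Celebrity gossip roundup</title>
    <link>http://example.com/gossip</link></item>
</channel></rss>"""

def filter_items(feed_xml, keywords):
    """Return (title, link) pairs whose title matches any keyword."""
    root = ET.fromstring(feed_xml)
    matches = []
    for item in root.iter("item"):
        title = item.findtext("title", "")
        if any(k.lower() in title.lower() for k in keywords):
            matches.append((title, item.findtext("link", "")))
    return matches

print(filter_items(SAMPLE_FEED, ["RFID", "privacy"]))
# prints only the single matching RFID item
```

A real implementation would of course add the ranking, source, and content-type criteria he mentions, but the keyword pass above is the heart of it.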
Now I'd say that's something of value in this infoglut world we've created.
February 26, 2004
Last Day for ABA TECHSHOW Early Bird Savings
Today is the last day to qualify for up to $200 off the ABA TECHSHOW 2004 registration price. Register today and receive a $100 early bird discount. If you're also an ABA Law Practice Management (LPM) member, you'll get another $100 off the conference price. I'm an LPM member, and the $40 section membership pays for itself several times over with the TECHSHOW discount alone. Plus, LPM Magazine is chock full of great articles and ideas. You may have noticed I feature a permanent link to the section's e-zine, Law Practice Today, on my home page as another great resource from the LPM section. In my humble opinion, that's getting a lot more than $40 worth.
TECHSHOW will be held March 25-27, 2004 at the Sheraton Chicago Hotel & Towers, and is one of the premier national legal technology conferences. As always, I'm really looking forward to it. It's my distinct pleasure and privilege to have been invited back to speak again this year. In addition to the high-caliber presentations and vendor exhibitions, it's a great place to get up close and personal with some of the world's brightest legal technology talent. Beyond the CLE, I've often received some of the best advice at the many social functions and vendor hall encounters -- and it's a lot of fun to boot. As a practical matter, if you can take home even a few new ideas that positively impact your practice, then it's well worth the time.
It's also not surprising that a number of speakers are also well known blawgers, and I'm looking forward to catching up with them along with the many other great speakers: Bob Ambrogi, Mike Arkfeld, Larry Bodine, Erik Heels, Dennis Kennedy, Rick Klau, Tom Mighell, and Ernest Svenson. Their blogs are well worth visiting and have a permanent home in my news aggregator.
Hopefully I'll see you there!
February 25, 2004
Bluesnarfing: Serious Bluetooth Security Flaw
First there was Bluejacking, which was more or less harmless pranking via Bluetooth-enabled cell phones.
Bluesnarfing, on the other hand, is much more serious. (Don't look at me, I didn't make up these names -- ironically Bluesnarfing is closer to real Bluetooth hijacking.) CNet News reports in this article how a number of Nokia cell phones are the most susceptible.
Bluesnarfing exploits a security flaw in some Bluetooth implementations that allows an attacker "to read, modify and copy a phone's address book and calendar without leaving any trace of the intrusion." According to Nokia, if an attacker had physical access to a 7650 model, a bluesnarf attack would not only be possible, but would also allow the attacker's Bluetooth device to "read the data on the attacked device and also send SMS messages and browse the Web via it." Furthermore, Nokia stated that its 6310i handset is vulnerable to a denial-of-service attack when it receives a "corrupted" Bluetooth message.
As Dana Carvey would probably say, "Well now, isn't that special?" Wireless convenience just inherently introduces more security issues.
Per AL Digital, the security company that discovered the flaw, it affects some Sony Ericsson, Ericsson, and Nokia handsets. However, the Nokia 6310, 6310i, 8910 and 8910i phones are at greater risk because they invite attack even when in "invisible mode". FYI, in invisible mode, "the handset is not supposed to broadcast its identity and should refuse connections from other Bluetooth devices." Whoops.
I've been a big fan of Nokia phones. Compared to others I've had, their business class phones have been generally more rugged and have better sound quality. I've even read posts from Nokia owners who've run them over with their car, put them through the washer and dryer, and they still worked. However, until Nokia provides a fix for this, I'm going to stay away from their Bluetooth phones as a precaution. That's the real shame, as Bluetooth was just finally beginning to deliver on much of the hype we've heard over the past several years.
Just What Did We Learn from the "New Economy"?
Fast Company's March 2004 issue features an interesting look back on the dot.com New Economy, and the lessons learned for moving forward.
For instance, this was perhaps the hardest lesson for most new online businesses to learn:
"Boom-Time Buzz: Move first -- or die." I liked this quote most of all: "There have always been advantages to being the first in a market, New Economy or no New Economy. The key is knowing there's a true need for a product, and being able to respond when a competitor jumps in after you. 'If you do it first and you do it right, you can win pretty big,' says Kevin O'Connor, the cofounder of DoubleClick. 'But it's much better to do it right than first.'"
February 23, 2004
Easy SpamAssassin Tips That Work
A LawTech Guru feature article by Jeffrey Beard
If you're using the popular SpamAssassin software to deal with spam, or perhaps considering its use, here are some firsthand tips written in plain English to improve its effectiveness:
SpamAssassin was included in the base monthly price of my web host provider -- one of the deciding factors for choosing them. Between June and November, SpamAssassin did an incredibly accurate job of flagging spam with virtually no false positives (fewer than a dozen misflagged legit e-mails in six months). SpamAssassin works by analyzing each e-mail for certain traits and assigning a weighted value to each trait found. It then adds up these values, and if the total exceeds your chosen threshold, it flags the message as spam.
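That scoring model is easy to picture in code. Here is a toy sketch of the idea only -- the rule names and weights below are invented for illustration, not SpamAssassin's actual rules (real SpamAssassin ships hundreds of them):

```python
# Invented example weights; real SpamAssassin has hundreds of tuned rules.
RULE_WEIGHTS = {
    "SUBJECT_ALL_CAPS": 1.6,
    "HTML_ONLY_BODY": 0.8,
    "MENTIONS_FREE_OFFER": 2.1,
}

THRESHOLD = 5.0  # SpamAssassin's default required score

def score_message(traits_found):
    """Sum the weights of every rule that fired on a message."""
    return sum(RULE_WEIGHTS[t] for t in traits_found)

def is_spam(traits_found, threshold=THRESHOLD):
    """Flag as spam when the total meets or exceeds the threshold."""
    return score_message(traits_found) >= threshold

total = score_message(["SUBJECT_ALL_CAPS", "MENTIONS_FREE_OFFER"])
print(round(total, 1))  # prints 3.7 -- under 5.0, so not flagged
```

This also shows why spam scoring just under 5.0 slips through: two or three mild traits alone don't add up to the threshold, which is exactly the problem described below.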
Since SpamAssassin had done a great job, I left the original default settings alone. In December, my experience changed dramatically: suddenly, roughly half of the incoming spam messages were scoring below SpamAssassin's default threshold of 5.0. Luckily I wasn't seeing any false positives (legit e-mail being moved into my Spam folder), but I had to wade through a lot of spam left in my regular Inbox. It appeared spammers had crafted messages that flew under the radar of SpamAssassin's default settings. I didn't want to reduce the threshold score, because some valid e-mail was scoring in the 4.x range, and I'd rather err on the side of having some spam in my Inbox than filter legitimate e-mails into my Spam folder. However, I missed reading several important messages in my Inbox because they were buried in the surrounding spam.
At first I chalked it up to the holidays -- spammers were going all out during the big spending season. But it didn't relent in January or February. That's when I decided to take things into my own hands. I called my host provider's tech support, which has been exceptional on technical matters. Surprisingly, both the first level rep and supervisor were pretty clueless on SpamAssassin, and suggested I head on over to SpamAssassin's web site for better documentation. I was disappointed there as well. Armed with the suspicion there had to be more people using SpamAssassin with similar problems, I went a-Googling.
I quickly located information on enabling SpamAssassin's RBL checks (RBL = Realtime Blackhole List, a blacklist of servers used by spammers), as well as its Bayesian features for better spam identification and classification. I found it easy to do, and it took only 20 minutes. The immediate results over the past several days are very encouraging, although quite preliminary: out of more than 100 total spam messages received, all but five were properly identified as spam, and I had no false positives. That's a far cry from the 10-25 spams previously left in my Inbox each day.
Enabling SpamAssassin's RBL checks resulted in spam originating from known open relays (i.e., mail servers that allow spammers to send mail through them) receiving a substantially higher total score -- for example, 8.7 instead of 2.7. As mentioned above, anything scoring 5.0 and higher gets filtered into my Spam folder via a simple rule in my e-mail program. [Please Note: The corresponding risk with using RBL checks is that legitimate e-mail coming from blacklisted servers may be improperly flagged as spam because of this trait.]
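The "simple rule" side of this is equally simple in code. SpamAssassin stamps each scored message with headers such as X-Spam-Flag and X-Spam-Status, so a mail filter only has to read them. A rough sketch of that logic -- the header text below is a fabricated example in the general 2.x style, not one of my actual messages:

```python
import re

def spam_folder(headers):
    """Route a message based on SpamAssassin's added headers."""
    status = headers.get("X-Spam-Status", "")
    # SpamAssassin 2.x writes "hits="; later versions write "score=".
    m = re.search(r"(?:hits|score)=([\d.]+)", status)
    hits = float(m.group(1)) if m else 0.0
    req = re.search(r"required=([\d.]+)", status)
    required = float(req.group(1)) if req else 5.0
    return "Spam" if hits >= required else "Inbox"

example = {"X-Spam-Status": "Yes, hits=8.7 required=5.0 tests=RCVD_IN_RBL"}
print(spam_folder(example))  # prints Spam
```

In practice my e-mail program does the equivalent with a plain filter rule on the header, no scripting required.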
So now you know the "Why" and my preliminary results. Here is the "How" for making desired changes, and it's not difficult:
At Lunarpages.com, I have two easy ways of changing my SpamAssassin user settings. The first is using their web-based Control Panel, under Mail, then SpamAssassin. The other is adding the desired changes to the text-based "user_prefs" file via an FTP upload to my server. The catch: either method requires one to understand the settings, syntax, and the best way to select them.
That's where the SpamAssassin Configuration Generator site came in most handily. My web server is running SpamAssassin version 2.63, and the SA Config Generator site works with versions 2.5x and above. As the site states, "This tool is designed to make it easier to customize an installation of SpamAssassin with some common options. After you answer the questions below, a SpamAssassin configuration file matching your choices will be displayed, and you can download it and use it with your SpamAssassin installation." The best part is that it not only lists some of the most useful SA features and their options, but actually explains what each setting does.
I entered my choices into the web form, and it generated the following SpamAssassin setting file for me:
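The file itself is short. To give a flavor, here is a representative fragment in SpamAssassin 2.x user_prefs syntax -- my own reconstruction of the settings discussed, not the generator's verbatim output:

```
# Reconstructed illustration, not the original generated file
required_hits 5.0
skip_rbl_checks 0    # 0 means do NOT skip, i.e. RBL checks are enabled
use_bayes 1          # turn on the Bayesian classifier
bayes_auto_learn 1   # let it learn from strongly-scored messages
```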
The big changes above were "skip_rbl_checks 0" to enable RBL checking (don't you just love double-negative syntax?), and the two Bayes settings.
After that, I downloaded the original default "user_prefs" file from my web server via FTP so I could edit it. Windows Notepad, while primitive, is more than sufficient for the quick copy/paste task. If you want a more full-featured text editor, then I strongly recommend TextPad. I retained all the original text for future reference (commented out by preceding "#" characters), pasted the above text into the bottom of the file, and saved it. It was then uploaded via FTP to replace the original.
To double-check the settings actually changed, I went into the web-based SpamAssassin Control Panel, and sure enough, all of the new settings were displayed. Alternatively, I could have manually entered the above settings into LunarPages' web-based Control Panel and skipped the FTP file transfer. If you are running some type of SpamAssassin plugin program locally on your PC instead of a web server, odds are that the text-based settings file is stored on your local hard drive.
Lastly, I expect everyone's mileage will vary, as we all have a different mix of e-mail messages. I also plan to monitor the true effectiveness of these setting changes over a longer period. However, it was quite empowering to be able to combat spam on my own terms and see immediate results. While somewhat cryptic at first, the SpamAssassin software was fairly easy to tweak with a little self-help. Perhaps best of all, I didn't have to go purchase one of the many commercial anti-spam packages or services, as it was already included in my low monthly web host fee.
I prefer using SpamAssassin because frankly, I've never liked the various "whitelist" spam services. Why should I make friends and business colleagues jump through confirmation hoops when the problem is on my end? Not exactly my idea of customer service. Likewise, there will always be some people who won't perform the confirmation process, so their e-mail would otherwise be blocked from me. So I prefer to let spam through as long as it's flagged and managed appropriately. I'm also dramatically increasing the odds that I will see the important messages that were previously buried amongst the flotsam.
As a parting tip, if you're looking for a good free FTP program without included adware, then I heartily recommend LeechFTP, which has many features and has worked extremely well for me.
February 20, 2004
Dissatisfied Employees are Job Surfing at Work
"Tell off an employee before noon, and there's a good chance that he or she will be back at their desk after lunch searching for a new boss on-line," says Monster.com's founder, Jeffrey Taylor. Yesterday he spoke at a conference in Toronto of the Human Resources Professionals Association of Ontario, as reported by GlobeTechnology.com.
Apparently the most popular time is between 2 and 3 p.m. on a rolling time zone, according to Taylor. Workopolis.com's president, Patrick Sullivan, cited 10:30 a.m. Monday mornings as the busiest time for his site. He added, "People come back to the office on Monday, after a nice weekend, and say 'I think I'll look for a job.' "
Mr. Taylor told his audience of human resources managers what I've been reading all over the Internet and in print: that employers should not assume they have a captive work force just because the unemployment rate is high. With the Internet, workers have easy access to opportunities and know whether they are marketable.
"You should treat [employees] like gold," he told the audience. Employers who do not treat their employees well, he said, risk losing their best, not just "the C-players . . . they might want to lose."
While virtually no one is expecting a rapid turnaround of the employment market, many experts are predicting that with the retirement of the baby boomers, there could be a serious shortage of American employees by 2010. (Of course, that's if we don't see a corresponding shift overseas due to global outsourcing.)
So in a nutshell, employers who think they have a captive audience are most likely just kidding themselves, and are focusing too much on the short term gains which will be more than lost when the human intellectual property eventually leaves for greener pastures. In my opinion, little things like appreciation (expressed both verbally and monetarily), employee enrichment programs, and fostering self-empowering environments really do go a long way -- especially in fast-paced, high-energy drain positions with long hours. Am I describing anything familiar to the legal market?
Flat Panels Predicted to Outsell CRTs
CNet News.com reports, "[f]or the first time, global shipments of liquid crystal displays in 2004 will surpass those of cathode ray tube (CRT) units, market research firm IDC said Thursday."
It chalks the result up to the abundance of flat panels, which has in turn driven prices down to be more affordable for mainstream users. "By sometime next year, 17-inch LCDs will dominate the market," according to IDC.
[Link courtesy of Gizmodo].
Top 10 Smartphones
About.com lists its picks for the top 10 smartphone models. Not surprisingly, the Treo 600 once again tops the list. While I don't necessarily agree with the rest of the ranking order, it's still a nice listing of the top smartphones available today, if you're in the market for one.
[Link courtesy of Gizmodo.]
February 18, 2004
Some RSS & Atom Observations
First, I'd really like to thank the many people who took the time to post both the original comments and a lot more over the past few days. My intent in posting was to summarize and help inform fellow blawgers as to the issues relating to RSS and/or Atom news feeds -- and why this is important.
From all this, I have the following impressions, observations, and suggestions on the subject, which of course are purely subjective on my part:
1) RSS has been and is working well for bloggers, especially if your blog only has one author per post (i.e., the "Simple" in Really Simple Syndication). However, from the examples given, it appears to me that RSS may not be as smooth a fit in some collaborative authoring and commercial settings due to the need for more advanced features.
2) RSS is perhaps best described as a "de facto" standard by virtue of its wide adoption and use.
3) Freezing the RSS core has probably helped its adoption as a de facto standard, as it's easier to hit a stationary target. However, the freeze has also contributed to much concern about how the format can move forward.
4) It sounds like RSS features can be added in extensions, at least to some extent, although this is one of the hottest areas in debate.
5) Atom development sounds like it's trying to be more things to more people, compared to my observation #1 above re: RSS. However, its rapid change and perceived increased complexity are also hurting its mind share. In this regard, it has the opposite problems of RSS as mentioned in observation #3.
6) Atom is aggressively attempting to be RSS with the extra bells and whistles, or at least the next evolution. In other words, its developers seem to want it to be the "One Ring", for better or worse, to supplant RSS. Having just one standard is preferable from my perspective, but right now it's very difficult to say which one that should be.
7) Given the rate of Atom support among many popular news aggregators, it's definitely something to keep an eye on.
8) Not having an RSS feed today means one is missing out on some very substantial opportunities to extend a site's traffic and/or reach.
9) At present, I'm still foggy on the tangible benefits gained from including an Atom feed on a typical blog that already has one or more RSS feeds (i.e., "typical blog" defined for this purpose as one maintained by a single author, although I realize this is debatable in and of itself). Other than appearing more technically savvy and "with it", I'm not seeing how adding an Atom feed by itself will translate to more traffic or reach. Eventually, it may translate to providing a better "reader experience" by offering more choices to one's visitors, but that remains to be seen. On a larger or more collaborative blog or web site, it appears that Atom brings some additional features to the table worth exploring.
10) RSS and Atom developers/supporters need to focus on overcoming their current challenges and not the personalities and personal attacks. Neither format is perfect, and neither bloggers nor their readers want to be caught in the middle of another standards jihad, akin to the Betamax/VHS and DVD -/+ format wars. I'll generally agree that competition is a good thing, but splintering of standards is not.
I'm not opposed to the Atom format development. However, I really need to see the significant benefits it would add today in exchange for my investment of time and effort. In other words, what's the ROI for the average blogger or web site operator? This is where Atom developers need to spend considerable time to get the word out in plain English and gain the necessary mind share. Many bloggers, while ahead of the curve, are not going to understand what a namespace is. Put it in terms and context of how it will affect them where they live, and get this out into the mainstream media channels.
Again, I very much appreciate everyone's participation on the subject, and additional comments are welcomed. RSS technology has made it far easier to both obtain and control our daily information overload. In this regard, it's a useful but double-edged sword, and it remains to be seen which path it should cut.
February 16, 2004
RSS vs. Atom Continues
Dave also posted a link to an Atom-to-RSS translator. So, here's my question: if, as its supporters are saying, Atom is the way to go for news feeds, why would one need to translate it back to RSS 2.0? The only answer that immediately came to my mind is that RSS is currently so much more prevalent. Following this line of reasoning, an Atom-only site would be missing out on distribution opportunities if it didn't provide an RSS feed. I noticed that several comments mentioned that Atom is a moving target. To me, this seems to be both its greatest weakness and strength, and as a result I have a better understanding of this hotly contested debate.
Again, I'm not choosing sides, but I understand Dave Winer's reasons for freezing RSS -- it's easier to implement a stationary target, especially if it's already working for the masses. On the other hand, what happens eventually when a new need comes along? As we've all seen, web technology doesn't stand still. Will RSS be sufficiently extensible and open to change as new needs pop up? I certainly don't have the answers, but I'm pretty good at asking some of the tough questions.
I do think that some form of news feed is here to stay for a while yet. A month after I launched this blog, I noticed that over 42% of the hits were by RSS readers. Since RSS feeds don't authenticate (at least not here), the net effect on end readership was something less than that. Nevertheless, it is practically relevant.
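For the curious, a figure like that 42% is the sort of thing you can pull from a raw server access log with a few lines of script. A sketch, assuming a common-log-format file and feed paths of /index.xml or /index.rdf (both assumptions -- actual feed file names vary by blog software):

```python
def feed_hit_ratio(log_lines, feed_paths=("/index.xml", "/index.rdf")):
    """Fraction of logged requests that were for the RSS feed file(s)."""
    total = feed = 0
    for line in log_lines:
        total += 1
        if any(p in line for p in feed_paths):
            feed += 1
    return feed / total if total else 0.0

# Two fabricated log lines: one feed fetch, one regular page view.
sample = [
    '1.2.3.4 - - [16/Feb/2004] "GET /index.xml HTTP/1.1" 200 5120',
    '1.2.3.4 - - [16/Feb/2004] "GET /archives/000123.html HTTP/1.1" 200 9000',
]
print(feed_hit_ratio(sample))  # prints 0.5
```

Note this counts raw hits, not readers -- an aggregator polling hourly inflates the ratio, which is exactly why the net effect on readership is something less.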
Both camps obviously recognize that news feeds add value and readership. In their latest versions, a number of popular news aggregators are now supporting both RSS and Atom. So other than Atom supporters stating that Atom is more "open source" and not frozen, what are the benefits to us bloggers? Why should we choose to implement Atom on our blogs right now, or perhaps later down the road? In other words, if we do it, will they come?
February 13, 2004
The Great RSS vs. Atom News Feed Debate
CNET News.com reports that "Google's Blogger service is bypassing Really Simple Syndication in favor of an alternative technology, a move that has sparked more discord in a bitter dispute over Web log syndication formats." Instead of the RSS feed capability previously offered in Blogger Pro, Blogger is now exclusively supporting Atom for blog content syndication. Goodbye RSS for new Blogger users. While there are similarities between RSS and Atom, the developer community is getting pretty heated up about the debate between these two specifications.
Last year, CNET's special report on "Battle of the Blogs" provided a good explanation of the underlying debate. Basically, Dave Winer, who is credited with much of the development behind RSS 2.0, had frozen its core development "to keep the developers from screwing with it," so that it was kept "simple". This didn't sit well with others, so they decided to come up with their own flavor of blog content syndication, which along the way has been named Pie, Echo, and now Atom.
The problem is that while RSS and Atom are more alike than not, they are competing specs that could splinter the market. A number of bloggers have posted that RSS was really for web site content syndication, while Atom is geared toward blog syndication. There are many news aggregator programs and web site services that work with RSS, but very few will read Atom at the moment. Upon doing a quick Google search, I discovered that BottomFeeder is an open source news aggregator client that runs on many different operating systems (Windows, Mac, Linux, Unix, etc.) and supports news feeds in both RSS and Atom formats.
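A dual-format reader like BottomFeeder has to tell the two formats apart before parsing. The simplest tell is the feed document's root element. A minimal Python sketch of that check -- the tiny feeds below are invented examples, and I'll note the Atom namespace was still evolving in early 2004:

```python
import xml.etree.ElementTree as ET

def feed_format(xml_text):
    """Guess the syndication format from the root element name."""
    root = ET.fromstring(xml_text)
    tag = root.tag.split("}")[-1]  # strip any XML namespace prefix
    if tag == "rss":
        return "RSS 2.0"
    if tag == "feed":
        return "Atom"
    if tag == "RDF":
        return "RSS 1.0"
    return "unknown"

rss = '<rss version="2.0"><channel><title>t</title></channel></rss>'
atom = '<feed xmlns="http://purl.org/atom/ns#"><title>t</title></feed>'
print(feed_format(rss), "/", feed_format(atom))  # RSS 2.0 / Atom
```

After the format check, the two element vocabularies differ enough (item vs. entry, description vs. content, and so on) that the aggregator needs a separate parsing path for each -- which is the extra work the single-format tools have avoided.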
While RSS isn't going away (at least not any time soon), Atom is trying to be more things to more people. RSS proponents are concerned about what a competing standard may do to splinter the marketplace. After all, for quite a few years, if you wanted to burn DVDs, you had to choose between buying a DVD-R/W or DVD+R/W drive and cross your fingers that the DVDs would work on all of your equipment (e.g., DVD player, laptop DVD drive, desktop DVD-ROM drive, etc.). Only fairly recently have dual-format burners become popular, ensuring consumers can use their burned DVDs the way they expect to. Thus I foresee that if Atom picks up more momentum, we may see more dual-format news aggregators like BottomFeeder on the market.
Atom proponents are stymied by the freeze on the RSS core, because they see that there is much more that RSS is capable of doing and becoming. Some say that on one hand, the ability to further develop RSS in the Atom format (rather than stagnation) is a good thing, but it also adds to its complexity. That is precisely why some RSS proponents want to keep RSS frozen -- to keep it simple so that it doesn't take expensive consultants and programmers to deploy it. In other words, it may not be perfect, but right now it's simple enough and works well enough that the masses can use it. It's not hard to see the logic on both sides of the debate, but unfortunately, it's become personal for some of the key players. There's been name calling and other less-than-productive approaches taken, which only serve to cloud the issues.
Even before I created this blog, I saw the unique value that RSS news feeds bring to both content providers and their reader audience. Now I and many other bloggers are faced with the decision whether or not to add and support Atom-based news feeds. If the blogging software vendors start including Atom support out-of-the-box similar to the way that Movable Type included RSS support, this may not be so bad. With any luck, it should just be another button link on my blog pages. However, right now I just don't have the time to go out of my way and manually integrate Atom support -- especially since Atom isn't all that prevalent yet. However, its backers are working very hard on a proposal for the Internet Engineering Task Force (IETF) to assume responsibility for Atom, which would in effect make it a standard. If Movable Type and other mainstream blogger developers add seamless Atom support in an upgrade, that could be doable.
Google's recent decision is interesting in and of itself. For a long time, the standard Blogger software didn't include any RSS support, which is why they lost bloggers to other systems like Radio Userland, Movable Type, and TypePad. Now, after Google's acquisition, they've gone exclusively with Atom support. Is Google crazy, or crazy like a fox? I certainly haven't chosen any side yet, but I have to admit my concern over RSS being frozen. Emerging technologies have a hard time emerging when they're not allowed to evolve. Apple tried to keep a tight rein on their specifications, and it made them the market leader of a 10% market for many years, while the PC platform flourished. Notice that I'm not saying that one was "better" than the other, but rather notice the effect that strict control had on its adoption.
In the interim, these developments bear watching to see which syndication standards are appropriate to support on one's web site or blog. While RSS is the clear leader right now, I still remember the days when most people thought Betamax would be around forever as the clearly superior format to VHS. Such is the nature of emerging technologies. The moral of the story is that it's definitely too soon to tell, and there may be room for both standards as long as the context is appropriately set. Given the intensity of the debate so far, I think it's safe to say we're in for more colorful developments before it's over.
[2.13.04, 11:51am - Correction: A number of newsreaders are now compatible with Atom feeds. The AtomEnabled beta site lists the following: NewsMonster, NewsGator, FeedDemon, NetNewsWire, Macromedia Central, NewzCrawler, BottomFeeder, Shrook, Feeds on Feeds, Bloglines, WinRSS and Pears.]
February 12, 2004
Enterprise Software Buyers Are Fighting Back
While reading CIO Today's "Software Buyers: 'We're Not Taking It Anymore'", and all through writing this, I can't help but hear the refrain from Twisted Sister's "We're Not Going To Take It" rock anthem resounding in my head.
From working in a large firm and dealing with numerous software vendors' tactics, this article likewise struck a chord with me. To sum it up, a recent survey found that enterprise software buyers are unhappy with the way vendors charge for licensing their products. But that's not the real sore spot, per the article: "They are downright furious about the cost of maintaining them -- not to mention the sometimes sneaky policies that vendors have established for compelling enhancements and upgrades."
"AMR Research queried several hundred executives and found that many feel mistreated and betrayed by their vendors. More notable, companies are not accepting the status quo -- that is, whatever the vendor dictates -- anymore. "In fact, customers are beginning to react in ways that have important implications for the entire application market," the report says."
When asked which actions they were going to take in the next 12 months, 38 percent of the executives said they were training internal personnel to provide technical support.
I've personally used many of these approaches with software companies who issued minor changes disguised as full version upgrades, issued no upgrades whatsoever during the annual maintenance period, or just plain didn't fix their chronic problems. Naturally, the larger your organization is, the more clout you generally have with vendors. In the legal market, the top 200 firms have plenty of muscle when dealing with software companies, and smaller firms sometimes find they have more influence than they first thought.
On occasion, I've patiently tried to help the vendors understand the customer's point of view. Generally, we don't want 20 new bells and whistles that "look cool". Business users want good, stable, and reliable software that meets their everyday needs. Most users don't touch more than 10% of a program's features anyway, so why add more things they're just not going to use -- things that bloat the product and cause more DLL conflicts, resource and memory issues, training and support challenges, and integration problems? The problem is that vendors can't charge much, if anything, for these desired stability enhancements, because from the customer's perspective they should have been there in the first place.
On the flip side, what's not being said is that customers have contributed to their own problems as well. Poorly planned implementations, lack of training, and a host of other shortcomings will generate mediocre results from even the best software. So this isn't a rant against the vendors. Rather, it's an indicator that in the current economic climate, organizations are being tasked with getting more out of what they already have, and therefore don't have as much elbow room for less accommodating software or its vendors. Thus those vendors who make the extra effort to truly understand their customers' needs (hint: ask them!), develop creative pricing structures accordingly, and stand robustly behind their 'wares just might succeed where others fail. I find it interesting, no, telling, that I could very well say the same thing about attorneys who want to keep their competitive edge with their own clients.
While there is tremendous pressure to keep IT costs under control, I'll let the software industry in on a little secret: Savvy business users aren't afraid to choose the more expensive option if it will best meet their needs, has a cost-justified ROI, and perhaps most importantly, comes with superior customer service. Cheap isn't always better, even though it may look good on a budget spreadsheet. However, with that said, customers are always looking for bargains, and don't like hidden surprises. I learned long ago that a satisfied customer may only tell a few friends about you, but a dissatisfied one will tell many more, and much more loudly, as we've seen in the CIO Today article. Thus a little sweat equity with your customers goes a long way.
February 10, 2004
More on Microsoft Metadata
Back on January 6th, I reported the release of Microsoft's "Remove Hidden Data add-in for Office 2003 and Office XP".
With Microsoft's track record, I was somewhat skeptical that such a free utility would live up to its hype. With that in mind, I cautioned:
"I mentioned the readme file so that savvy users could compare its functionality to other metadata removers on the market. Although it's free, I strongly suggest that you make sure this tool removes everything you need it to remove. If it doesn't, then I recommend obtaining a program that will do the necessary job rather than rely upon this free utility. Otherwise, it could create a false sense of security, which when relied upon can cause many of the same problems as not using a metadata remover at all. Still, if you do not currently have a metadata remover and use the Office XP or Office 2003 suites, then using this add-in is probably better than the alternative."
Microsoft recently posted "Known issues with the Remove Hidden Data add-in for Office 2003 and Office XP". Also, Microsoft's Knowledge Base Article 834427 provides more information on the types of data this add-in can remove.
Therefore, it's up to each person to decide whether or not this tool properly suits their needs, and how it stacks up against leading programs such as Payne Consulting Group's Metadata Assistant for Word, Excel and PowerPoint. If the Microsoft tool removes what you need it to remove, then it may be worth using. The problem is that many people are just not tech savvy enough to know how to determine this -- thus my caution about false reliance on a metadata remover. My best advice: whenever you can, convert to HTML or PDF before sharing, since as a general rule Word files no longer contain revision and other hidden metadata after those conversions. If you must share or send native MS Office files, then make sure they are properly cleansed before sending. As part of one's due diligence in this regard, I believe a bit of in-house testing is required. If you don't know how to do this, then I heartily recommend engaging someone who does, such as Donna Payne.
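As one deliberately crude illustration of such in-house testing, here's a Python sketch that scans a file's raw bytes for strings the cleaner should have removed. The "document" below is hypothetical, and this is no substitute for a proper metadata tool or expert review:

```python
def leaks_string(data: bytes, secret: str) -> bool:
    """Crude check: do the raw file bytes still contain `secret`?
    Word stores text both as ASCII and as UTF-16LE, so test both
    encodings before declaring the file clean."""
    return (secret.encode("ascii") in data or
            secret.encode("utf-16-le") in data)

# Hypothetical example: pretend these are the bytes of a "cleaned" file
# that still carries the author's name in UTF-16LE form.
cleaned = b"...document body..." + "Jane Author".encode("utf-16-le")

print(leaks_string(cleaned, "Jane Author"))  # True -- the cleaner missed it
print(leaks_string(cleaned, "Acme LLP"))     # False
```

Because Word stores text in more than one encoding, a naive single-encoding search can miss leftovers -- which is exactly the kind of false comfort I cautioned about above.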
A good example of why we need to understand and care about metadata is this intriguing article by Preston Gralla. Mr. Gralla, a noted technology author, outlines how savvy privacy experts were able to expose a supposedly valid high-level U.K. intelligence dossier about Iraq as little more than a "cut-and-paste job" from three publicly available articles, one of which had been written by a postgraduate student in the U.S. I've also read of similar approaches being used on college research papers and even attorneys' briefs to see who really wrote them and how much editing time was involved (cut-and-pastes take much less time than actual drafting) compared against the time billed.
February 08, 2004
14 New PDAs and Smartphones
Despite a maturing handheld market, the manufacturers just keep cranking them out. Brighthand just posted "A Look Ahead at Upcoming Handhelds and Smart Phones". It summarizes and reviews 14 brand new PDAs and Smartphones, which are expected over the next few months. There's quite a variety of devices and operating systems, and many more are offering built-in cameras, higher resolution displays, and yes, even Wi-Fi in more affordable units.
Seeing this many new devices reminds me of that Doritos commercial with Jay Leno, who wisecracks, "Crunch all you want, we'll make more."
February 07, 2004
Now *That's* a Lot of Data...
John Tredennick cites a report from the UC Berkeley School of Information Management and Systems, which draws some conclusions about how much data was created in 2002. Assuming this is accurate, we are collectively cranking out some serious data, most of which is not being stored in paper form.
The accompanying chart helps to illustrate exactly how huge this data pile is. Ever try to figure out how much a Terabyte, Petabyte, or even Exabyte means in a measure that's comprehensible to mere mortals? Here's one good example: 200 Petabytes represents all printed material. (A Petabyte is 1,000,000,000,000,000 bytes, or 10^15 bytes.) While that is certainly mountainous, it doesn't hold a candle to the total volume of information generated in 1999, which was 2 Exabytes (an Exabyte is 1,000,000,000,000,000,000 bytes, or 10^18 bytes).
Considering that was over four years ago, and that an estimated 5 Exabytes represents "All words ever spoken by human beings," one quickly realizes we are at a point where we are collectively cranking out one heck of a lot of data. To put the above into perspective, the entire print collection of the U.S. Library of Congress only amounts to a paltry 10 Terabytes (a Terabyte is 1,000,000,000,000 bytes, or 10^12 bytes).
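For those who like to check the arithmetic, here's a quick back-of-the-envelope sketch in Python using the report's figures:

```python
# Byte-scale units used in the report.
TB = 10**12  # Terabyte
PB = 10**15  # Petabyte
EB = 10**18  # Exabyte

library_of_congress = 10 * TB    # entire LoC print collection
all_printed_material = 200 * PB  # all printed material, per the report
info_1999 = 2 * EB               # total information generated in 1999

# How many Libraries of Congress fit in each pile?
print(all_printed_material // library_of_congress)  # 20000
print(info_1999 // library_of_congress)             # 200000
```

In other words, the 1999 figure alone works out to roughly 200,000 Libraries of Congress.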
Just a few orders of magnitude in difference, wouldn't one say? It certainly explains why electronic data discovery (EDD) has grown in leaps and bounds. In the most simplistic terms, I consider paper and electronic evidence to make up the two basic parts of the iceberg: Paper represents the tiny tip that is visible in comparison to the overwhelming mass that lies hidden under the surface. Let's extend this analogy a bit further: Attorneys still working predominantly with paper-based discovery remind me of the captain of the Titanic, under the belief that his ship was unsinkable because he would be able to see the icebergs in sufficient time to avoid them.
While there are certainly tech-savvy attorneys available who are ready and will rise to the challenge, in my humble opinion there are many more who are not. This may be difficult for some to accept, but there is still hope. In many cases, these latter attorneys are going to need someone to help them muddle through. This is where I see much opportunity for quality EDD consultants to fill this gap.
While many EDD "vendors" will readily collect the data, there is often far too much for "mere mortals" to wade through and still meet litigation and transaction deadlines. It's the worst kept secret among litigation support professionals how many times they've seen electronic evidence get blown back to paper, which diminishes much of its usefulness and portability. A true consultant will find savvy ways to separate the wheat from the chaff and present it in ways that work well with legal professionals and their clients (with the latter, the cost issues alone are huge barriers to overcome). In other words, finding the right huge pile of data is one thing, but digging in the right place with the right shovel is quite another.
February 06, 2004
ABA Task Force Offers E-Discovery Standards Draft
Today's ABA Journal eReport provides an update on the ABA's efforts to assist in developing standards for electronic discovery. The ABA Litigation Section's Task Force on Electronic Discovery has proposed amendments to the ABA Civil Discovery Standards addressing electronic discovery, and is taking comments on these draft amendments.
According to the eReport article, "[t]he draft was designed to address three primary issues: allocating the cost of electronic discovery, altering or destroying evidence, and handling privileged information. Standards exist for such issues in the paper world, but there are new issues associated with electronic evidence."
The task force has proposed five standards, which are summarized in the article. The thrust behind this effort is that technology changes faster than the ability of the system to update the rules accordingly (which is also the subject of one of the proposed standards regarding storage medium). For attorneys looking to get their feet wet in ED, the proposed standards offer a nice checklist of the types of data involved, cost-sharing issues, and more. For more experienced cyber-litigators, it's helpful to see which way the wind is blowing as the profession attempts to address these issues. These proposed standards may indeed assist in forming the basis of Electronic Discovery Best Practices.
February 03, 2004
ZDNet's Editors' Top Notebooks
In the market for a new notebook computer, or just Windows shopping? Here's a good starting point to survey the landscape.
ZDNet just posted its editors' favorite notebook computers in several categories, including each notebook's editors' rating, price, and a feature summary.
Further demonstrating the arrival of Tablet PCs for business use, several are mentioned among the business notebooks. Most still command a noticeable premium over their conventional siblings.
February 02, 2004
E-Gads -- New Pentiums add E-GHz
Today, Intel is bringing out five new Pentium chips with a sixth due out later this quarter, according to ZDNet:
"The new crop of Pentium 4s, which will spawn a number of new desktop PC models, will include three chips based on a fresh processor design, code-named Prescott. Intel will add two new speed versions of its current Pentium 4, dubbed Northwood. A sixth Prescott Pentium 4, running at 3.4GHz will be announced Monday, but it won't be available until later in the quarter.
Now, did everyone catch all the technolingo? Pop-quiz on each new chip in five minutes! ;^)
The Wizard of OS: Should You Upgrade?
Microsoft Windows 9x (95, 98, ME, and their sub-versions) has been around for a long time. Probably the most successful and relatively stable of the bunch was Windows 98 Second Edition. In business and home use, it is arguably the best of the 9x series. However, it's over five years old, and in computer doggie years that makes it fairly ancient. Add the fact that Microsoft has already begun phasing out support for Windows 98, and it raises some tough questions.
So, should you upgrade all of your computers to Windows XP? The answer to this loaded question is the typical lawyer's response: It depends.
An Operating System (OS) upgrade cannot be considered in a vacuum. There are practical hardware requirements, software application considerations, and the fact that OS upgrades often are the tail that wags the IT budget dog. If you upgrade an OS to one that is several generations newer, oftentimes it will cost much more than the OS upgrade to get new hardware capable of running it, plus obtaining new programs, peripherals, and their associated drivers that will work with the new OS.
Here to help answer that question are two good TechRepublic articles that nicely illustrate both sides of the coin.
For business use, I still lean towards Windows XP upgrades because of the enhanced stability, and the fact that you are buying into longer overall support -- Windows XP won't be replaced until Longhorn comes out in 2006, so there's virtually no chance Microsoft is going to phase out support for Windows XP any time soon. This also means that software vendors will need to support it for some time yet. Also, if you are considering doing any type of networking between two or more computers, whether it's wired Ethernet or Wi-Fi, I strongly recommend paying the extra money for Windows XP Professional over the Home edition. The difference is usually less than $100, and the enhanced networking support will more than pay for itself in saved time.
However, if you have critical legacy software or hardware that you must continue to use for whatever reason, then an XP upgrade may generate some serious challenges that require savvy planning and implementation talent. If your application program versions are mostly up-to-date, then an XP upgrade may not be too painful overall. Also, if you're still running Windows NT, run, don't walk, to upgrade to Windows XP if you have sufficient hardware. NT's architecture was unusual and clunky, and it lacks support for any USB device, so many software and hardware developers have already dropped it from their supported OS lists. If you're running Windows 2000, it's still a good operating system that has much in common with Windows XP under the hood, and it will likely be supported for a while yet.
Windows 98 SE is still a good home and SOHO computer OS for people with older hardware. Let's face it, if most of your tasks are related to running Corel WordPerfect or Microsoft Office coupled with Internet access (most notably, web surfing and e-mail), some light printing and graphical tasks, and some miscellaneous fun, then you don't need a lot of CPU power to get by for a little while longer. Windows XP, on the other hand, will choke on slower processors and machines with too little memory.
The main concern I have at this time is that if you buy programs that are updated annually, you may be surprised to find that Windows 98 has quietly been dropped from their supported lists. Windows 95 and even NT have already dropped off many software vendors' lists, so Windows 98 is definitely next, despite its huge popularity. And as good as Windows 98 was, in terms of stability it can't hold a candle to Windows XP: in Win98, when one program crashes, it can still take your entire computing session and unsaved work with it.
Lest one wonder why on earth I would be discussing Windows 98 when most new computers have shipped with either Windows 2000 or XP for several years now: I still know a lot of people who run Windows 98 on their home PCs, as well as solo and small firm practitioners who are just trying to get by with what they have. Hopefully, this post will help them determine where they want to go, both today and tomorrow.